CN109760045B - Offline programming track generation method and double-robot cooperative assembly system based on same - Google Patents

Offline programming track generation method and double-robot cooperative assembly system based on same

Info

Publication number
CN109760045B
CN109760045B (application CN201811610540.6A)
Authority
CN
China
Prior art keywords
entity
robot
assembly
information
occ
Prior art date
Legal status
Active
Application number
CN201811610540.6A
Other languages
Chinese (zh)
Other versions
CN109760045A (en)
Inventor
吕红强
郝乐乐
韩九强
郑辑光
Current Assignee
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University
Priority to CN201811610540.6A
Publication of CN109760045A
Application granted
Publication of CN109760045B
Legal status: Active

Abstract

An off-line programming track generation method and a double-robot cooperative assembly system based on the method are disclosed. The system comprises a double-robot workpiece assembly area equipped with vision units, two computers, and a wireless module for communication. When the double-robot cooperative assembly system is started, the vision units are triggered to acquire images of the workpieces to be assembled, and the two computers run off-line programming software that uses the track generation method to obtain the tracks required for workpiece assembly and the corresponding code. By running this code, the two robots can complete both independent single-robot assembly work and cooperative double-robot assembly work while communicating with each other. Based on the off-line programming software environment, the invention can safely and stably realize both the simulation and the actual execution of double-robot cooperative assembly; the whole assembly process is safe and reliable, with high precision and strong robustness, is suitable for the assembly environments of various workpieces, and can meet the application requirements of double-robot cooperative assembly.

Description

Offline programming track generation method and double-robot cooperative assembly system based on same
Technical Field
The invention belongs to the technical field of intelligent manufacturing and robots, and particularly relates to an off-line programming track generation method and a double-robot cooperative assembly system based on the method.
Background
With the rapid development of China's manufacturing industry, automation technology has been widely applied and industrial robots play an increasingly important role. A single robot often cannot complete complex work tasks because of its own limitations. Dual-robot systems, with their strong adaptability and good flexibility, are an effective way to realize intelligent manufacturing. However, problems such as cooperative path planning, cooperative position calibration and path error compensation of dual-robot systems are key technical problems that urgently need to be solved for the application and popularization of robots in industrial environments. Meanwhile, as production process requirements continue to rise, so do the requirements on the precision of industrial robot machining tracks. The traditional teaching mode is complicated to operate and inefficient, and cannot meet the operating requirements of today's industrial robots. Because existing off-line programming software lacks a vision processing unit, human errors introduced in the workpiece calibration process cannot be avoided. Moreover, for assembly actions that require screwing through an angle, existing off-line programming technology can only complete them by determining the rotation angle through teaching, so off-line programming in the complete sense has not been realized.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide an off-line programming track generation method and a double-robot cooperative assembly system and method based on the method. The double-robot cooperative assembly system combines the off-line programming track generation method with a vision processing unit; it can serve as a model system for in-depth research on intelligent automation systems, fully embodies the concepts of industrial robots cooperating with each other and of industrial robots replacing manual operation, has strong expandability, and provides a foundation for further research on intelligent automation systems of groups of industrial robots.
To achieve this purpose, the invention adopts the following technical scheme:
An off-line programming track generation method comprises a STEP model reconstruction algorithm and a track generation algorithm, and includes the following steps:
Reading the STEP model file: the information of the STEP model file is read line by line as character strings, a character segmentation algorithm suited to the sentence structure of the file's data segment is selected, and the information represented by each line of characters is classified, sorted and stored in the computer memory.
Extracting geometric information and topological data: the entity instance information described by each line of characters is extracted and keywords are compared to obtain the model geometric information and topological data described in the EXPRESS language.
Model reconstruction: the extracted STEP model geometric information and topological data are used to generate the corresponding model in OCC (OpenCASCADE) and display it in the off-line programming software on the computer.
Track generation: the grabbing points and placing points are defined by picking points on the OCC model of the workpiece, a series of interpolation points is generated between the grabbing points and the placing points with an equal-parameter interpolation algorithm, and all of these points together form the assembly working track of the robot.
Reading the STEP model file means extracting the effective information of the STEP model file, which is stored as character strings, by reading it line by line and saving the result in the computer memory; a character segmentation algorithm matched to the STEP file format is selected to segment each line of characters and extract the effective entity information in it. The specific operation is as follows: starting from the data segment of the STEP file, for each line of characters the number between "#" and "=" is extracted as the entity number, the character string between "=" and "(" as the entity type, the character string between the first "(" and the first "," as the entity name, and the remaining character string inside the outermost "(" and ")" is split on "," to obtain the entity parameters. The entity number, entity type, entity name and parameter information of every entity are stored as structures, and these structures form the information of the STEP model in memory. For example, after character segmentation the line #206=DIRECTION('',(0.E+000,0.E+000,-1.)); yields the entity DIRECTION with direction (0, 0, -1).
The extracted geometric information represents the geometric information of the STEP model file through the geometric entities of the STEP AP203 protocol. Specifically, the geometric information in an entity is described uniformly with parametric curves and parametric surfaces; in general, all geometric information can be described accurately with a B-spline entity representation. Extracting topological data means representing the topological data of the STEP model file through the topological entities of the STEP AP203 protocol. Specifically, the topological data of the model is represented under the STEP AP203 protocol using a hierarchical relationship from low to high.
Model reconstruction means creating the OCC objects corresponding to the entity information in the STEP file. For each entity structure, the mapping relationship between the STEP entity and the OCC object is looked up according to the entity type, the corresponding OCC object is created through the relevant OCC functions, and the object is initialized with the entity parameters. If a parameter of one entity is itself another entity, that entity must first be created as an OCC object with parameter attributes and initialized. To avoid repeated initialization of OCC objects, a map data structure is used to store the initialization data. The StepToGeom class in OCC contains methods for converting a STEP geometric entity into an OCC geometric object, and the StepToTopoDS package provides methods for converting a STEP topological entity into an OCC topological shape. Using these methods, the OCC object corresponding to the entity information can be created quickly.
Track generation means defining the grabbing point and the placing point of the workpiece by picking points on the OCC model of the workpiece to be assembled. The intermediate track points between the grabbing point and the placing point are generated with a track generation algorithm. The specific operation is as follows: the track between the grabbing point and the placing point is divided into several straight line segments, the intermediate track points are generated for each straight line segment with an equal-parameter interpolation algorithm, and together with the previously defined grabbing point and placing point they form the assembly track of the robot.
The double-robot cooperative assembly system based on the offline programming track generation method comprises a double-robot cooperative workpiece assembly area 1, single-robot independent workpiece assembly areas 2 and 3, a robot control area 4 and display screens 5 and 6, wherein:
the double-robot cooperative workpiece assembly area 1 comprises two industrial robots 20 and 21 working oppositely, and clamping jaws 13 and 16, vision acquisition units 12 and 17 and assembly body placing trays 14 and 15 are respectively arranged at the tail ends of the two industrial robots 20 and 21;
the single-robot independent workpiece assembly areas 2 and 3 comprise the workpieces to be assembled, which are placed on the workbench, and the assembly body placing bases 18 and 19, which are fixed to the workbench;
the robot control area 4 comprises computers 7 and 10, robot control cabinets 8 and 9 and a wireless module 11, wherein the wireless module 11 undertakes communication between the robot control cabinets 8 and 9 and the computers 7 and 10 and between the two computers 7 and 10;
the vision acquisition units 12 and 17 are used for acquiring pictures of the workpieces to be assembled to obtain the placing angle information of the workpieces to be assembled, and the off-line programming track generation method is operated in the computers 7 and 10 to control the motion tracks of the robots 20 and 21.
The wireless module 11 forms a star-shaped local area network, centered on a wireless router, among the computers 7 and 10 and the robot control cabinets 8 and 9 through Internet-of-Things technology; every node in the local area network forwards its messages through the wireless router to complete the communication of the whole system. A command is a character string that takes "#" as its start symbol and "/" as its end symbol; if several commands are sent at once, they must be separated by "+". When the server or a client receives a message, it first traverses the character string: if the start symbol or the end symbol is found to be missing, the command is incomplete, so it is discarded and the receiver waits to receive it again; if the command is complete, the character string is split on the "+" separator to obtain several sub-strings, and a different processing module is selected according to the command type of each sub-string. An asynchronous mechanism based on message response is used as the message processing mechanism, and a mutual-exclusion (mutex) lock is used to prevent several threads from accessing the message processing modules at the same time.
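For illustration only, the following C++ sketch shows one way to apply the framing rules just described (check the "#" start symbol and the "/" end symbol, then split a complete message on "+"); the function splitCommands and its details are invented for this example and are not taken from the patent.

#include <cstddef>
#include <string>
#include <vector>

// Split a received message into individual command payloads.
// A command starts with '#' and ends with '/'; several commands in one message
// are separated by '+'. An incomplete message yields an empty list, signalling
// the caller to discard it and wait for retransmission.
std::vector<std::string> splitCommands(const std::string& message)
{
    std::vector<std::string> commands;
    if (message.empty() || message.front() != '#' || message.back() != '/')
        return commands;                                // start or end symbol missing

    std::size_t begin = 0;
    while (begin < message.size()) {
        std::size_t end = message.find('+', begin);
        if (end == std::string::npos) end = message.size();
        std::string cmd = message.substr(begin, end - begin);
        if (cmd.size() >= 2 && cmd.front() == '#' && cmd.back() == '/')
            commands.push_back(cmd.substr(1, cmd.size() - 2));   // strip '#' and '/'
        begin = end + 1;
    }
    return commands;
}

In the actual system, each extracted command would then be dispatched, under the mutex, to the processing module selected by its command type.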
In the double-robot cooperative assembly system, when a fault occurs during the cooperative assembly process, the two robots 20 and 21 can be switched to single-machine mode and carry out independent intelligent assembly in the single-robot independent workpiece assembly areas 2 and 3, respectively.
In the double-robot cooperative assembly method based on the off-line programming track generation method, each of the two robots executes the off-line programming track generation steps, so that the workpiece assembly work in the actual environment is completed in the cooperative assembly system. The method specifically comprises the following steps:
Step 1: start the double-robot cooperative assembly system;
Step 2: trigger the vision acquisition units 12 and 17 to acquire pictures of the workpieces to be assembled, and, for workpieces that must be screwed through an angle during assembly, use the image information to calculate the angle through which the robot end has to rotate;
Step 3: measure the coordinate information of the workpieces in the assembly area with a three-point calibration method and store the calibration data of each workpiece in the computer memory;
Step 4: run the off-line programming software on computers 7 and 10, import the models of the robots and of the workpieces to be assembled, calibrate each model, and, combining the angle calculated in Step 2, perform simulation, collision detection and track generation for the independent workpiece assembly and the cooperative workpiece assembly of the two robots; once the required tracks have been generated, go to Step 5;
Step 5: convert the tracks generated in the simulation environment into executable code through the post-processing code conversion function of the off-line programming software, and import the code into the robot control cabinets 8 and 9 to wait for execution;
Step 6: set the two robots 20 and 21 to on-line mode and run the executable code imported in Step 5;
Step 7: after the assembly work is finished, shut down the double-robot cooperative assembly system.
Compared with the prior art, the invention has the following beneficial effects:
the two-robot cooperative assembly system fully embodies the concept of industrial robot cooperative operation. Compared with the existing off-line programming software technology, the invention adds the visual angle recognition unit and the visual error compensation unit, can adapt to the assembly work of various workpieces, and has high efficiency and strong expandability. On the basis of the software technology and the hardware device, the mode of the intelligent operation of the group industrial robots can be further researched, and a powerful basis is provided for relevant research and practice of enterprises and colleges.
The off-line programming track generation method provided by the invention replaces the traditional manual teaching programming mode, and improves the programming efficiency and the working precision of the industrial robot. Compared with the existing offline programming software technology, the visual information processing unit is added, so that the track generated by the method can be suitable for assembly work of various workpieces, and the method is high in adaptability and strong in transportability.
Drawings
FIG. 1 is a flowchart of a STEP model reconstruction algorithm in off-line programming according to the present invention.
FIG. 2 is a flowchart of a trajectory generation algorithm in off-line programming according to the present invention.
Fig. 3 is an architecture diagram of the dual robot cooperative assembly system of the present invention.
Fig. 4 is an operation flow of the dual-robot cooperative assembling system of the present invention.
Detailed Description
The embodiments of the present invention will be described in detail below with reference to the drawings and examples.
As shown in fig. 1, the model reconstruction algorithm of the off-line programming trajectory generation method proposed herein is described in detail below with a workpiece assembly example; the specific steps are as follows:
(a) Reading the STEP model file. The effective information of the STEP model file, stored as character strings, is extracted by reading the file line by line and saved in the computer memory; a character segmentation algorithm matched to the STEP file format is selected to segment each line of characters and extract the effective entity information in it. The specific operation is as follows: starting from the data segment of the STEP file, for each line of characters the number between "#" and "=" is extracted as the entity number, the character string between "=" and "(" as the entity type, the character string between the first "(" and the first "," as the entity name, and the remaining character string inside the outermost "(" and ")" is split on "," to obtain the entity parameters. The entity number, entity type, entity name and parameter information of every entity are stored as structures, and these structures form the information of the STEP model in memory. For example, after character segmentation the line #206=DIRECTION('',(0.E+000,0.E+000,-1.)); yields the entity DIRECTION with direction (0, 0, -1).
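Purely as an illustration of the segmentation just described (the structure and function below are invented for this sketch and are not the patent's own code), a C++ implementation could look like this:

#include <cstddef>
#include <string>
#include <vector>

// In-memory representation of one STEP entity line, e.g.
//   #206=DIRECTION('',(0.E+000,0.E+000,-1.));
struct StepEntity {
    int                      number;      // text between '#' and '='
    std::string              type;        // text between '=' and '('
    std::string              name;        // first parameter (often the empty string '')
    std::vector<std::string> parameters;  // remaining parameters
};

StepEntity parseStepLine(const std::string& line)
{
    StepEntity e{};
    const std::size_t eq     = line.find('=');
    const std::size_t lparen = line.find('(', eq);
    const std::size_t rparen = line.rfind(')');

    e.number = std::stoi(line.substr(1, eq - 1));            // skip the leading '#'
    e.type   = line.substr(eq + 1, lparen - eq - 1);

    // Split the parameter body on commas that are not nested inside parentheses.
    const std::string body = line.substr(lparen + 1, rparen - lparen - 1);
    int depth = 0;
    std::string current;
    for (char c : body) {
        if (c == ',' && depth == 0) { e.parameters.push_back(current); current.clear(); continue; }
        if (c == '(') ++depth;
        if (c == ')') --depth;
        current += c;
    }
    e.parameters.push_back(current);

    e.name = e.parameters.front();                           // first parameter is the name
    e.parameters.erase(e.parameters.begin());
    return e;
}

Running parseStepLine on the example line above would yield the number 206, the type DIRECTION, an empty name and the single parameter (0.E+000,0.E+000,-1.).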
(b) Extracting geometric information and topological information. The extracted geometric information represents the geometric information of the STEP model file through the geometric entities of the STEP AP203 protocol. Specifically, the geometric information in an entity is described uniformly with parametric curves and parametric surfaces. For example:
#191=LINE('',#192,#193);
#192=CARTESIAN_POINT('',(-2.0,1.0,2.0));
#193=VECTOR('',#194,1.);
#194=DIRECTION('',(-1.,0.E+000,0.E+000));
These entities represent a straight line segment with origin (-2, 1, 2), direction along the negative x-axis and magnitude 1.
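For illustration, the bounded line described by these entities could be realized directly as an OCC geometric object. The sketch below is hand-written against documented OpenCASCADE classes (Geom_Line, Geom_TrimmedCurve) and is not code from the patent; trimming the infinite line to the parameter range [0, 1] is this example's reading of the vector magnitude.

#include <Geom_Line.hxx>
#include <Geom_TrimmedCurve.hxx>
#include <gp_Dir.hxx>
#include <gp_Pnt.hxx>

Handle(Geom_TrimmedCurve) makeExampleSegment()
{
    // #192: CARTESIAN_POINT (-2, 1, 2); #194: DIRECTION (-1, 0, 0)
    Handle(Geom_Line) axis = new Geom_Line(gp_Pnt(-2.0, 1.0, 2.0), gp_Dir(-1.0, 0.0, 0.0));
    // #193: VECTOR of magnitude 1 along that direction -> trim to the range [0, 1]
    return new Geom_TrimmedCurve(axis, 0.0, 1.0);
}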
Extracting topological data means representing the topological data of the STEP model file through the topological entities of the STEP AP203 protocol. The specific operation is to represent the topological data of the model under the STEP AP203 protocol with a hierarchical relationship from low to high, for example:
a closed shell entity in the STEP model file is defined as:
#13=CLOSED_SHELL('',(#14,#257,#558,#709,#803,#874,#944,#1052,#1151,#1232));
The topological data here is that the closed shell with entity number 13 is composed of the 10 faces defined by #14, #257, #558, #709, #803, #874, #944, #1052, #1151 and #1232.
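As a minimal sketch (assuming the ten faces have already been converted to TopoDS_Face objects; this is an illustration, not the patent's code), the corresponding OCC shell could be assembled with BRep_Builder:

#include <BRep_Builder.hxx>
#include <TopoDS_Face.hxx>
#include <TopoDS_Shell.hxx>
#include <vector>

TopoDS_Shell buildClosedShell(const std::vector<TopoDS_Face>& faces)
{
    BRep_Builder builder;
    TopoDS_Shell shell;
    builder.MakeShell(shell);            // empty shell, analogous to entity #13
    for (const TopoDS_Face& face : faces)
        builder.Add(shell, face);        // the faces defined by #14 ... #1232
    shell.Closed(Standard_True);         // mark the shell as closed (CLOSED_SHELL)
    return shell;
}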
(c) OCC model conversion: the OCC objects corresponding to the entity information in the STEP file are created. The specific operation is as follows: the assembly in the STEP model file is mapped to a TopoDS_Compound in OCC, the parts of the assembly are mapped to the various shapes contained in the TopoDS_Compound, and the geometric and topological entities that make up each part are mapped to geometric objects and topological shapes in OCC, thereby establishing the mapping relationship between the STEP model and the OCC model. For each entity structure, according to its entity type, the corresponding OCC object is created using the known mapping relationship between STEP entities and OCC objects and initialized with the entity parameters. If one entity is a parameter of another entity, an OCC object for that parameter entity is created and initialized immediately. All initialized objects are stored in a map to avoid repeated initialization. The StepToGeom class in OCC provides methods for converting a STEP geometric entity into an OCC geometric object, and the StepToTopoDS package provides methods for converting a STEP topological entity into an OCC topological shape. With these methods, the corresponding OCC objects can be created quickly.
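The map-based caching can be illustrated with the following hand-written sketch, which reuses the StepEntity structure from the parsing example above; only gp_Pnt, gp_Dir, Geom_CartesianPoint, Geom_Direction and Geom_Geometry are actual OpenCASCADE classes, and the function itself is invented for this illustration.

#include <cstdio>
#include <map>
#include <Geom_CartesianPoint.hxx>
#include <Geom_Direction.hxx>
#include <Geom_Geometry.hxx>
#include <gp_Dir.hxx>
#include <gp_Pnt.hxx>

// Cache keyed by the STEP entity number, so an entity referenced by several
// other entities is converted and initialized only once.
static std::map<int, Handle(Geom_Geometry)> g_converted;

Handle(Geom_Geometry) toOccObject(const StepEntity& e)   // StepEntity from the parsing sketch
{
    auto found = g_converted.find(e.number);
    if (found != g_converted.end())
        return found->second;                            // already created: reuse it

    Handle(Geom_Geometry) object;
    double x = 0.0, y = 0.0, z = 0.0;
    if (e.type == "CARTESIAN_POINT") {
        std::sscanf(e.parameters.front().c_str(), "(%lf,%lf,%lf)", &x, &y, &z);
        object = new Geom_CartesianPoint(gp_Pnt(x, y, z));
    } else if (e.type == "DIRECTION") {
        std::sscanf(e.parameters.front().c_str(), "(%lf,%lf,%lf)", &x, &y, &z);
        object = new Geom_Direction(gp_Dir(x, y, z));
    }
    // Further entity types (LINE, VECTOR, B-spline entities, ...) are handled
    // analogously, e.g. through the StepToGeom / StepToTopoDS helpers.

    g_converted[e.number] = object;
    return object;
}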
As shown in fig. 2, building on the model reconstruction algorithm, the main flow of the trajectory generation algorithm is further described with the workpiece assembly example; the steps are as follows:
(a) Import the STEP models of the required robot and workpiece into the software, generate the OCC models with the model reconstruction algorithm, and calibrate the models with a three-point calibration method;
(b) defining a grabbing point and a placing point of a workpiece on an OCC model of the workpiece;
(c) defining a track auxiliary point required by a robot assembly track;
(d) dividing the assembly track of the robot into a plurality of straight line segments, and selecting a parameter equation corresponding to the straight line segments, wherein the parameter equation of the three-dimensional straight line in the OCC is as follows:
C(u)=P+u·D,u∈(-∞,+∞)
in the formula, u is the curve parameter, P is a point in three-dimensional space, and D is a direction in three-dimensional space.
(e) Equal-parameter curve interpolation. An equal-parameter interpolation algorithm for straight line segments is selected to generate the intermediate track points between the grabbing point and the placing point; the specific steps are as follows:
i. Determine the number N of points to be interpolated;
ii. Set the equal-division step Δu of the parameter, so that the N points from u_min to u_max are evenly spaced, i.e. Δu = (u_max - u_min)/(N - 1);
iii. Divide the parameter space [u_min, u_max] of the straight line segment uniformly with step Δu to form the arithmetic sequence {u_min, u_min + Δu, u_min + 2Δu, …, u_max};
iv. For each parameter in the arithmetic sequence, calculate the corresponding point on the straight line segment according to C(u) = P + u·D.
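A minimal C++ sketch of these four sub-steps, written against OpenCASCADE's Geom_Line; the function name and the choice of spacing the N points over [u_min, u_max] including both endpoints are assumptions of this example.

#include <Geom_Line.hxx>
#include <gp_Dir.hxx>
#include <gp_Pnt.hxx>
#include <vector>

// Interpolate N (N >= 2) points on the segment C(u) = P + u*D for u in [uMin, uMax],
// spaced with the constant parameter step du = (uMax - uMin) / (N - 1).
std::vector<gp_Pnt> equalParameterInterpolation(const gp_Pnt& P, const gp_Dir& D,
                                                double uMin, double uMax, int N)
{
    Handle(Geom_Line) line = new Geom_Line(P, D);
    std::vector<gp_Pnt> points;
    const double du = (uMax - uMin) / (N - 1);
    for (int i = 0; i < N; ++i)
        points.push_back(line->Value(uMin + i * du));   // C(u) evaluated at u_i
    return points;
}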
(f) Attitude interpolation, which is essentially the calculation of the three Euler angles α, β and γ at each interpolation point. A local coordinate system is typically used in trajectory generation to define the pose of the robot end. Specifically, functions of the GeomLProp package are used to compute the tangent vector of the straight line segment at the track point, taken as DirX of the local coordinate system of that track point, and the normal vector, at the track point, of the plane containing the straight line segment, taken as DirZ of the local coordinate system. The Z, X and Y directions of the local coordinate system of the robot end are then, respectively, DirZ, DirX and DirY = (-DirZ) × DirX. From these the corresponding attitude matrix is obtained, so the transformation matrix between the base coordinate system and the local coordinate system can be found:
[formula not reproduced: the attitude matrix of the robot end written out in terms of the three Euler angles, with entries formed from products of cosines (c) and sines (s)]
If the rotation matrix that transforms the base coordinate system C_O into the local coordinate system C_O' is known,
[formula not reproduced: that rotation matrix written out element by element]
then the three Euler angles corresponding to the attitude matrix of the robot end can be obtained from this transformation matrix:
[formulas not reproduced: the three Euler angles α, β and γ expressed through inverse trigonometric functions of the elements of the rotation matrix]
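Because the matrix images are not reproduced above, the following hand-written sketch shows the widely used Z-Y-X (yaw-pitch-roll) form of this computation; the assumption that the patent uses exactly this Euler convention, as well as the struct and function names, belong to this example and not to the patent.

#include <cmath>
#include <gp_Dir.hxx>

struct EulerZYX { double alpha, beta, gamma; };   // rotations about Z, then Y, then X

// The columns DirX, DirY, DirZ of the local frame, expressed in the base frame,
// form the rotation matrix R = [r_ij]. Assuming R = Rz(alpha) * Ry(beta) * Rx(gamma):
//   beta  = atan2(-r31, sqrt(r11^2 + r21^2))
//   alpha = atan2( r21, r11)
//   gamma = atan2( r32, r33)
EulerZYX eulerFromFrame(const gp_Dir& DirX, const gp_Dir& DirY, const gp_Dir& DirZ)
{
    const double r11 = DirX.X(), r21 = DirX.Y(), r31 = DirX.Z();
    const double r32 = DirY.Z(), r33 = DirZ.Z();
    EulerZYX e;
    e.beta  = std::atan2(-r31, std::sqrt(r11 * r11 + r21 * r21));   // pitch
    e.alpha = std::atan2(r21, r11);                                 // yaw
    e.gamma = std::atan2(r32, r33);                                 // roll
    // The degenerate case cos(beta) ~ 0 (gimbal lock) is ignored in this sketch.
    return e;
}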
(g) Generate a series of code containing the positions and attitudes of the track points, and download the corresponding code to the corresponding robot to wait for execution.
As shown in fig. 3, the two-robot cooperative assembly system based on the offline programming trajectory generation method of the present invention includes a two-robot cooperative workpiece assembly area 1, single-robot independent workpiece assembly areas 2 and 3, a robot control area 4, and computer display screens 5 and 6.
As shown in fig. 3, the dual-robot cooperative workpiece assembling area 1 includes two industrial robots 20, 21 working in opposite directions, the ends of the two industrial robots 20, 21 are respectively provided with clamping jaws 13, 16, vision collecting units 12, 17 and assembly body placing trays 14, 15, the clamping jaws are used for clamping workpieces in the single-robot independent workpiece assembling areas 2, 3, the smart camera is used for identifying the types of the workpieces and the placing angles of the workpieces, and the assembly body placing trays are devices for placing the workpieces by the two robots in the cooperative assembling process. The single-robot independent workpiece assembly areas 2 and 3 comprise workpieces to be assembled which are placed on a workbench and assembly body placing bases 18 and 19 which are fixed on the workbench. On the basis of normal communication, the two robots can alternately assemble in a mode that one robot clamps and puts workpieces and the other robot lifts the workpieces, and finally the two robots can complete the assembly work of a complete assembly body in a mutually matched mode. Particularly, when a fault (such as communication disconnection) occurs in the cooperative assembly process of the two robots, the two robots can be switched to a single-machine mode to carry out independent intelligent assembly in the independent workpiece assembly area. The assembly operations to be screwed in the stand-alone mode are performed by the robot by means of the bases 18, 19 fixed to the table. The robot control area 4 comprises computers 7 and 10, robot control cabinets 8 and 9 and a wireless module 11, wherein the wireless module 11 undertakes communication between the robot control cabinets 8 and 9 and the computers 7 and 10 and between the two computers 7 and 10;
as shown in fig. 4, the operation flow of the two-robot cooperative assembling system is as follows:
(a) starting a double-robot cooperative assembly system;
(b) triggering a visual unit to acquire and process the placing angle information of the workpiece;
To acquire the workpiece angle information, pictures of all workpieces are taken by moving the robot end, which carries the camera, over them. The acquired pictures are processed and the effective information is extracted by the off-line programming software on the computer. The picture processing steps are as follows:
i. Target-area preprocessing. The ROI (region of interest) of the original image is set according to the workpiece recognition result, and the ROI is converted from the RGB color space to the CMYK color space (a sketch of this conversion is given after these steps). The RGB-to-CMYK conversion relationship is:
[formula not reproduced: the RGB-to-CMYK color space conversion]
ii. Target segmentation. The image of a specific channel of the ROI is extracted according to the workpiece type, and threshold segmentation and noise reduction are applied to it to obtain a binary image of the target workpiece.
iii. Workpiece center extraction. The circumscribed rectangle of the target workpiece binary image is computed; the center of the circumscribed rectangle is the workpiece center coordinate.
iv. Distance filtering. Filtering is carried out according to the distance between the angle identification mark and the workpiece center, preliminarily determining the position of the identification mark.
v. Morphological processing. Erosion, dilation and area filtering are applied to the preliminarily determined region to obtain the image finally used for angle calculation.
vi. Workpiece angle calculation. The center coordinates of the connected domain of the identification mark are calculated, and the workpiece angle is calculated from this center and the workpiece center coordinates.
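The sketch referred to in sub-step i is given here for illustration only. It uses the textbook RGB-to-CMYK conversion, because the patent's own conversion-formula image is not reproduced above, and adds the atan2-based angle of sub-step vi; the struct and function names are invented for this example.

#include <algorithm>
#include <cmath>

struct CMYK { double c, m, y, k; };

// Textbook conversion from RGB (0..255) to CMYK (0..1); assumed here because
// the patent's own conversion-formula image is not available.
CMYK rgbToCmyk(double r, double g, double b)
{
    r /= 255.0; g /= 255.0; b /= 255.0;
    double k = 1.0 - std::max({r, g, b});
    if (k >= 1.0) return {0.0, 0.0, 0.0, 1.0};                 // pure black pixel
    return {(1.0 - r - k) / (1.0 - k),
            (1.0 - g - k) / (1.0 - k),
            (1.0 - b - k) / (1.0 - k),
            k};
}

// Sub-step vi: workpiece angle (in degrees) from the workpiece centre and the
// centre of the connected domain of the identification mark.
double workpieceAngleDeg(double centerX, double centerY, double markX, double markY)
{
    const double kPi = 3.14159265358979323846;
    return std::atan2(markY - centerY, markX - centerX) * 180.0 / kPi;
}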
(c) The world coordinate information of the workpieces is extracted with a three-point calibration method to obtain the calibration information.
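The following sketch illustrates one possible three-point construction under an explicit assumption, since the patent does not spell out which three points are measured: the example takes them as the workpiece origin, a point on the workpiece x-axis and a point in the workpiece xy-plane, and builds the workpiece frame with OpenCASCADE's gp_Ax3 and gp_Trsf; the function name is invented.

#include <gp_Ax3.hxx>
#include <gp_Dir.hxx>
#include <gp_Pnt.hxx>
#include <gp_Trsf.hxx>
#include <gp_Vec.hxx>

// Assumed meaning of the three measured points: p0 = workpiece origin,
// pX = a point on the workpiece x-axis, pXY = a point in the workpiece xy-plane.
gp_Trsf threePointCalibration(const gp_Pnt& p0, const gp_Pnt& pX, const gp_Pnt& pXY)
{
    gp_Vec vx(p0, pX);                     // local x direction
    gp_Vec vxy(p0, pXY);                   // a second direction lying in the xy-plane
    gp_Vec vz = vx.Crossed(vxy);           // local z = x x (in-plane vector)

    gp_Ax3 workpieceFrame(p0, gp_Dir(vz), gp_Dir(vx));
    gp_Trsf calibration;
    // Relates coordinates in the world frame (default gp_Ax3) to the workpiece frame.
    calibration.SetTransformation(gp_Ax3(), workpieceFrame);
    return calibration;
}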
(d) The models of the required robots and workpieces are imported into the off-line programming software and calibrated with the calibration information;
(e) The off-line programming software is run, the cooperative assembly process of the two robots and the independent assembly process of each single robot are simulated, and the executable code is generated;
(f) The two robots establish TCP/IP communication and carry out double-robot cooperative assembly while communication is normal; if one robot fails or the communication fails, the system automatically switches to the single-robot independent assembly mode;
(g) The system is shut down after assembly.
It should be noted that the above embodiments illustrate rather than limit the invention; they are merely preferred embodiments of the invention, and any modifications, equivalents, improvements and the like made within the spirit of the invention and the scope of the claims are included in the scope of the invention.

Claims (8)

1. An off-line programming track generation method, comprising a STEP model reconstruction method and a track generation method, and specifically comprising the following steps:
reading the STEP model file: reading the information of the STEP model file line by line as character strings, selecting a character segmentation algorithm suited to the sentence structure of the file data segment, classifying and sorting the information represented by each line of characters and storing it in the computer memory;
extracting geometric information and topological data: extracting the entity instance information described by each line of characters and comparing keywords to obtain the model geometric information and topological data described in the EXPRESS language;
model reconstruction: using the extracted STEP model geometric information and topological data to generate the corresponding model in OCC (OpenCASCADE) and display it in the off-line programming software on the computer;
track generation: defining grabbing points and placing points by picking points on the OCC model of the workpiece, generating a series of interpolation points between the grabbing points and the placing points with an equal-parameter interpolation algorithm, all of these points finally forming the assembly working track of the robot;
characterized in that the OCC objects corresponding to the entity information in the STEP file are created as follows: for each entity structure, the mapping relationship between the STEP entity and the OCC object is looked up according to the entity type, the corresponding OCC object is created through the relevant functions in OCC, and the object is initialized with the entity parameters; if a parameter of one entity is itself another entity, that entity is first created as an OCC object with parameter attributes and initialized; to avoid repeated initialization of OCC objects, a map data structure is used to store the initialization data; the StepToGeom class in OCC contains methods for converting a STEP geometric entity into an OCC geometric object, and the StepToTopoDS package provides methods for converting a STEP topological entity into an OCC topological shape; with these methods, the OCC object corresponding to the entity information is created quickly.
2. The off-line programming track generation method of claim 1, wherein the STEP model file stored as character strings is read line by line to extract its effective information and store it in the computer memory, and a character segmentation algorithm matched to the STEP file format is selected to segment each line of characters and extract the effective entity information in it, the specific operation being: starting from the data segment of the STEP file, for each line of characters, the number between "#" and "=" is extracted as the entity number, the character string between "=" and "(" as the entity type, the character string between the first "(" and the first "," as the entity name, and the remaining character string inside the outermost "(" and ")" is split on "," to obtain the entity parameters; the entity number, entity type, entity name and parameter information of every entity are stored as structures, and these structures form the information of the STEP model in memory.
3. The off-line programming track generation method of claim 1, wherein extracting geometric information means that the geometric information of the STEP model file is represented by the geometric entities in the STEP AP203 protocol, the specific operation being that the geometric information in an entity is described uniformly with parametric curves and parametric surfaces; and extracting topological data means that the topological data in the STEP model file is represented by the topological entities in the STEP AP203 protocol, the specific operation being that the topological data of the model is represented under the STEP AP203 protocol with a hierarchical relationship from low to high.
4. The off-line programming track generation method according to claim 1, wherein the grabbing point and the placing point of the workpiece are defined by picking points on the OCC model of the workpiece to be assembled, and the intermediate track points between the grabbing point and the placing point are generated with a track generation algorithm, the specific operation being: the track between the grabbing point and the placing point is divided into several straight line segments, the intermediate track points are generated for each straight line segment with an equal-parameter interpolation algorithm, and the previously defined grabbing point and placing point are combined with them to form the assembly track of the robot.
5. The dual-robot cooperative assembly system based on the offline programming trajectory generation method of claim 1, comprising a dual-robot cooperative workpiece assembly area (1), a single-robot independent workpiece assembly area (2, 3), a robot control area (4), and a display screen (5, 6), wherein:
the double-robot cooperative workpiece assembly area (1) comprises two industrial robots (20 and 21) working in opposite directions, and clamping jaws (13 and 16), vision acquisition units (12 and 17) and assembly body placing trays (14 and 15) are respectively arranged at the tail ends of the two industrial robots (20 and 21);
the single-robot independent workpiece assembly areas (2, 3) comprise the workpieces to be assembled, which are placed on the workbench, and the assembly body placing bases (18, 19), which are fixed to the workbench;
the robot control area (4) comprises computers (7, 10), robot control cabinets (8, 9) and a wireless module (11), wherein the wireless module (11) is used for communication between the robot control cabinets (8, 9) and the computers (7, 10) and between the two computers (7, 10);
and acquiring pictures of the workpieces to be assembled by using the vision acquisition units (12 and 17) to obtain the placing angle information of the workpieces to be assembled, and operating the off-line programming track generation method in the computers (7 and 10) to control the motion tracks of the robots (20 and 21).
6. The double-robot cooperative assembly system according to claim 5, wherein a star-shaped local area network centered on a wireless router is formed between the computers (7, 10) and the robot control cabinets (8, 9) through Internet-of-Things technology, and each node in the local area network forwards its messages through the wireless router to complete the communication of the whole system, wherein a command is a character string with "#" as its start symbol and "/" as its end symbol; if several commands are sent at once, they are separated by "+"; when the server or a client receives a message, it first traverses the character string, and if the start symbol or the end symbol is found to be missing, the command is incomplete and is discarded to wait for re-reception; if the command is complete, the character string is split on the "+" separator to obtain several sub-strings, a different processing module is selected according to the command type of each sub-string, an asynchronous mechanism based on message response is used as the message processing mechanism, and a mutual-exclusion (mutex) lock is used to prevent several threads from accessing the message processing modules at the same time.
7. The double-robot cooperative assembly system according to claim 5, wherein, when a fault occurs during the double-robot cooperative assembly process, both robots (20, 21) can be switched to single-robot mode and carry out independent intelligent assembly in the single-robot independent workpiece assembly areas (2, 3).
8. The operation method of the double-robot cooperative assembly system according to claim 5, wherein the two robots each use the off-line programming track generation method to complete the double-robot cooperative assembly work in the actual environment, specifically comprising the following steps:
Step 1: starting the double-robot cooperative assembly system;
Step 2: triggering the vision acquisition units (12, 17) to acquire pictures of the workpieces to be assembled, and, for workpieces that must be screwed during assembly, calculating from the image information the angle through which the robot end has to rotate;
Step 3: measuring the coordinate information of the workpieces in the assembly area with a three-point calibration method and storing the calibration data of each workpiece in the computer memory;
Step 4: running the off-line programming software on the computers (7, 10), importing the models of the robots and of the workpieces to be assembled, calibrating each model, and, combining the angle calculated in Step 2, performing simulation, collision detection and track generation for the independent workpiece assembly and the cooperative workpiece assembly of the two robots, and entering Step 5 after the required tracks have been generated;
Step 5: converting the tracks generated in the simulation environment into executable code through the post-processing code conversion function of the off-line programming software, and importing the code into the robot control cabinets (8, 9) to wait for execution;
Step 6: setting the two robots (20, 21) to on-line mode and running the executable code imported in Step 5;
Step 7: closing the double-robot cooperative assembly system after the assembly work is finished.
CN201811610540.6A 2018-12-27 2018-12-27 Offline programming track generation method and double-robot cooperative assembly system based on same Active CN109760045B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811610540.6A CN109760045B (en) 2018-12-27 2018-12-27 Offline programming track generation method and double-robot cooperative assembly system based on same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811610540.6A CN109760045B (en) 2018-12-27 2018-12-27 Offline programming track generation method and double-robot cooperative assembly system based on same

Publications (2)

Publication Number Publication Date
CN109760045A CN109760045A (en) 2019-05-17
CN109760045B (en) 2020-11-17

Family

ID=66451059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811610540.6A Active CN109760045B (en) 2018-12-27 2018-12-27 Offline programming track generation method and double-robot cooperative assembly system based on same

Country Status (1)

Country Link
CN (1) CN109760045B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110394554B (en) * 2019-06-14 2021-03-02 广东镭奔激光科技有限公司 Robot motion track offline programming method for impeller disc laser shock peening
CN111381815B (en) * 2020-02-14 2022-01-11 西安交通大学 Offline programming post code conversion method and dual-robot cooperative intelligent manufacturing system and method based on same
CN111230880B (en) * 2020-02-24 2021-06-22 西安交通大学 Complex curved surface processing track generation method in offline programming
CN113276112B (en) * 2021-04-30 2022-12-13 北京卫星制造厂有限公司 Mobile double-robot-based weak rigid member machining process planning method
CN114347038A (en) * 2022-02-17 2022-04-15 西安建筑科技大学 Intersection pipeline double-arm cooperative welding robot and control system
CN114505869A (en) * 2022-02-17 2022-05-17 西安建筑科技大学 Chemical reagent intelligent distribution machine control system
CN115091402B (en) * 2022-07-29 2024-03-26 广域铭岛数字科技有限公司 Product assembling method for assembling island

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011048621A (en) * 2009-08-27 2011-03-10 Honda Motor Co Ltd Robot off-line teaching method
CN103192397B (en) * 2012-01-09 2015-08-12 沈阳新松机器人自动化股份有限公司 Vision robot's off-line programing method and system
CN103085072B (en) * 2013-03-11 2014-10-29 南京埃斯顿机器人工程有限公司 Method for achieving industrial robot off-line programming based on three-dimensional modeling software
CN103406905B (en) * 2013-08-20 2015-07-08 西北工业大学 Robot system with visual servo and detection functions
CN105034008B (en) * 2015-09-15 2017-03-22 南京航空航天大学 Intelligent flexible production line with double robot cooperative automatic assembling and operation method for same
JP6114361B1 (en) * 2015-11-02 2017-04-12 ファナック株式会社 Offline robot programming device
JP6576255B2 (en) * 2016-01-25 2019-09-18 キヤノン株式会社 Robot trajectory generation method, robot trajectory generation apparatus, and manufacturing method
CN106914896A (en) * 2017-03-27 2017-07-04 刘程秀 A kind of construction method of robot off-line programming
CN107291045B (en) * 2017-06-27 2020-08-11 华中科技大学 Workshop programming system
CN107486858A (en) * 2017-08-08 2017-12-19 浙江工业大学 More mechanical arms collaboration off-line programing method based on RoboDK
CN107908152A (en) * 2017-12-26 2018-04-13 苏州瀚华智造智能技术有限公司 A kind of movable robot automatic spray apparatus, control system and method

Also Published As

Publication number Publication date
CN109760045A (en) 2019-05-17

Similar Documents

Publication Publication Date Title
CN109760045B (en) Offline programming track generation method and double-robot cooperative assembly system based on same
CN108656107B (en) Mechanical arm grabbing system and method based on image processing
CN109816725A (en) A kind of monocular camera object pose estimation method and device based on deep learning
Lin et al. Using synthetic data and deep networks to recognize primitive shapes for object grasping
CN109815847B (en) Visual SLAM method based on semantic constraint
CN109079794B (en) Robot control and teaching method based on human body posture following
Bateux et al. Visual servoing from deep neural networks
CN113370217B (en) Object gesture recognition and grabbing intelligent robot method based on deep learning
Schröder et al. Real-time hand tracking with a color glove for the actuation of anthropomorphic robot hands
CN110135277B (en) Human behavior recognition method based on convolutional neural network
Zhang et al. Detect in RGB, optimize in edge: Accurate 6D pose estimation for texture-less industrial parts
CN112801064A (en) Model training method, electronic device and storage medium
Ma et al. Crlf: Automatic calibration and refinement based on line feature for lidar and camera in road scenes
CN111814823A (en) Transfer learning method based on scene template generation
Wu et al. Learning affordance space in physical world for vision-based robotic object manipulation
Chen et al. 3d model-based zero-shot pose estimation pipeline
Gan et al. Towards a robotic Chinese calligraphy writing framework
Fiestas et al. RPA and L-system based synthetic data generator for cost-efficient deep learning model training
Xu et al. OD-SLAM: Real-time localization and mapping in dynamic environment through multi-sensor fusion
Kiyokawa et al. Efficient collection and automatic annotation of real-world object images by taking advantage of post-diminished multiple visual markers
Luo et al. Robot artist performs cartoon style facial portrait painting
Lin et al. Target recognition and optimal grasping based on deep learning
Sun et al. Sparse pointcloud map fusion of multi-robot system
Gulde et al. Ropose-real: Real world dataset acquisition for data-driven industrial robot arm pose estimation
Zhao et al. A multi-robot collaborative monocular SLAM based on semi-direct method

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant