CN112372631B - Rapid collision detection method and device for robot machining of large complex component - Google Patents
- Publication number
- CN112372631B (application CN202011067113.5A)
- Authority
- CN
- China
- Prior art keywords
- robot
- collision detection
- point set
- points
- track
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- B25J9/1676—Avoiding collision or forbidden zones
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Numerical Control (AREA)
Abstract
The invention belongs to the field of robot machining of large complex components, and discloses a rapid collision detection method and device for robot machining of large complex components, wherein the method comprises the following steps: 1) obtaining a point cloud model of the surface to be machined of the large complex component by off-line programming; 2) robot kinematics modeling and CAD model simplification; 3) presetting the robot base pose and designating an operation area; 4) thinning the task target point set; 5) solving the optimal unique inverse kinematics solution; 6) determining the collision detection feature point set most likely to produce a collision; 7) performing collision detection between the robot spatial configurations corresponding to all feature points and the large complex component. The invention can realize efficient and accurate global collision detection between a large complex component and a robot once the robot base pose and working area are determined, and can provide a basis for area division, layout optimization, motion planning and the like in the robot machining process for large complex components.
Description
Technical Field
The invention belongs to the field of large-scale complex component robot machining, and particularly relates to a rapid collision detection method and device for large-scale complex component robot machining.
Background
Large complex components, such as wind power blades, high-speed rail car bodies, aviation components and ship components, are widely applied in the energy and transportation fields, and their manufacturing level represents the core competitiveness of a nation's manufacturing industry. After such components are formed by a specific process, they usually undergo surface drilling, milling and other machining, followed by polishing, painting and other post-treatment. For a long time, surface drilling, polishing and painting of large complex components have mainly been carried out manually; the machining quality depends heavily on worker experience, and problems such as high labour intensity, harsh operating environments, low production efficiency and poor consistency of product quality are common. Surface milling is mainly performed by chemical treatment or special-purpose machining, but chemical treatment pollutes the environment, while special-purpose machining suffers from poor system flexibility and high cost. These factors limit production capacity and slow the development of the related industries, making it difficult to meet ever-growing demands on industrial scale and quality.
With the development of the robot technology, the industrial robot has wide application prospect in surface processing of large-scale complex components. There are several new problems and challenges with large complex component robotic machining compared to traditional small workpiece robotic machining. Due to the large size and the complex shape of the large complex component, the robot generally needs to perform area division and layout optimization before performing motion planning. In order to better accomplish the above work, an efficient and accurate collision detection method is particularly important.
However, existing collision detection methods are generally difficult to apply effectively to robot machining of large complex components. For example, bounding-box methods based on AABBs, OBBs and the like struggle to approximate complex free-form surfaces and suffer from poor tightness, which makes the collision detection results inaccurate; detection methods based on hierarchical bounding boxes, DOPs, convex hulls and the like can achieve high tightness, but generally cannot do so without sacrificing detection efficiency. In short, an efficient and accurate collision detection method for robot machining of large complex components is still lacking.
Disclosure of Invention
The invention provides a method and equipment for rapidly detecting collision in the machining of a large-scale complex component robot, aiming at solving the defects or the improvement requirements of the prior art and the problems mentioned in the background part, and aiming at realizing the efficient and accurate global collision detection between the large-scale complex component and the robot under the condition of determining the pose of a robot base and a working area, thereby providing a basis for area division, layout optimization, motion planning and the like in the machining process of the large-scale complex component robot and promoting the development and the application of the machining of the large-scale complex component robot.
In order to achieve the purpose, the invention provides a rapid collision detection method for large-scale complex component robot machining, which comprises the following steps:
s1: acquiring a CAD model of a surface to be processed of a large-scale complex component, and combining a processing technology to perform off-line programming on the surface to be processed to obtain a point cloud model of a region to be processed;
s2: establishing a robot kinematic model by using a DH method, and replacing a robot core component by using a regular geometric body with the minimum envelope volume to obtain a robot simplified model for collision detection;
s3: presetting the pose of a robot base, and designating a working area, wherein the working area is marked as a task target point set { TP };
s4: according to the characteristic size of the robot simplified model in the step S2, performing sparsification on the task target point set { TP } obtained in the step S3 to obtain a sparse target point set { STP };
s5: determining the optimal inverse kinematics unique solution expression of the robot according to the rigidity performance, and completing inverse kinematics solution of all track points in the sparse target point set { STP };
s6: screening out a characteristic point set which is most likely to collide from a sparse target point set { STP };
s7: and traversing collision detection between the robot space configuration corresponding to each point in the feature point set obtained by the step S6 and the large-scale complex component.
Further, the offline programming method in step S1 includes the following sub-steps:
s101: acquiring a robot processing track line by using an equal section method;
S102: discretizing each machining trajectory line obtained in step S101 by a uniform curve discretization method to obtain the machining trajectory points.
Further, the step of thinning the task target point set { TP } in step S4 is as follows:
S401: determining the characteristic dimension L_min of the robot simplified model, where L_min is defined as the minimum of all the size parameters of the regular geometric bodies that may collide with the large complex curved surface;
S402: numbering all trajectory points in the task target point set { TP } from left to right and from bottom to top, and denoting P_i,j as the j-th trajectory point on the i-th trajectory line in { TP };
S403: letting s = floor(L_min / d), and deleting all trajectory points whose trajectory-line number does not satisfy i = 1 + n·(s + 1), where floor(·) is the floor function, meaning that only the integer part of its argument is retained, and d is the section spacing;
S404: deleting trajectory points that are too close to each other on each trajectory line retained after the deletion in step S403.
Further, step S404 is specifically as follows: for any i-th trajectory line, first denote P_i,1 as the key point P_key; then successively compute the distances d_i,j from P_key to the subsequent points P_i,j until a trajectory point P_i,k satisfying d_i,k ≤ L_min ≤ d_i,k+1 is found, and mark P_i,k as the new key point; all key points on the i-th trajectory line are determined by traversing according to this rule, where the end point P_i,ni is automatically recorded as a key point and ni is the index of the last trajectory point of the i-th trajectory line. This operation is performed in turn on each trajectory line retained after the deletion in step S403, and all non-key points are deleted to obtain the sparse target point set { STP }.
Further, the step S5 includes the following sub-steps:
S502: randomly selecting N different points P_1–P_N in the sparse target point set { STP } obtained in step S4, and then performing inverse kinematics calculation for each of them to obtain all inverse kinematics solutions of each point; the inverse solutions of P_m are denoted θ_m,l, m = 1, 2, …, N, l = 1, 2, …, N_IK, where θ_m,l represents the l-th group inverse kinematics solution corresponding to point P_m and N_IK is the number of groups of robot inverse solutions corresponding to point P_m;
S503: calculating, for each solution group l, the average robot stiffness ks̄_l = (1/N)·Σ_{m=1…N} ks_m,l over the corresponding configurations, where ks_m,l is the robot stiffness in the configuration corresponding to the inverse solution θ_m,l;
S504: recording the solution group number l = No at which the average stiffness ks̄_l is maximal, and determining the inverse solution expression corresponding to this solution group as the optimal unique inverse kinematics solution expression θ_m,No;
S505: using the unique inverse solution expression θ_m,No obtained in step S504 to perform inverse kinematics solution on the sparse target point set { STP }, obtaining the corresponding inverse solution θ_m,No for all trajectory points P_m, m = 1, 2, …, N_P, where N_P is the total number of trajectory points in the sparse target point set { STP }.
Further, the method for determining the collision detection feature point set in step S6 specifically includes:
S601: traversing all trajectory points in the sparse target point set { STP } to compute their distances D_m = dist(P_m, O_2) to the origin O_2 of robot link coordinate frame {2}, and determining the trajectory points P_min and P_max at which D_m takes its minimum and maximum values;
S602: finding, among all trajectory points in the sparse target point set { STP }, the trajectory point P'_min at which the angle α = ∠P_m O_2 O_3 between the vectors O_2→P_m and O_2→O_3 takes its minimum value;
S603: determining the boundary corner points P_c1–P_c4 of the sparse target point set { STP };
S604: merging P'_min, P_max and P_c1–P_c4 into the collision detection feature point set { P }, and removing duplicate points from it.
Further, the collision detection principle in step S7 is specifically as follows:
S701: letting P_k be one point in the collision detection feature point set { P }, and computing, from the inverse kinematics solution result of step S505, the spatial pose of each possibly colliding regular geometric body G_q in the simplified model when the robot tool centre point is located at P_k, where q is the number of the regular geometric body;
S702: sequentially computing the distances D_m,q = dist(P_m, O_q) from all trajectory points P_m in the sparse target point set { STP } to the centroid O_q of each possibly colliding regular geometric body G_q, and finding the trajectory point P_near,q nearest to each centroid;
S703: sequentially judging whether each nearest trajectory point P_near,q lies inside the corresponding regular geometric body G_q; if P_near,q lies inside G_q, a collision point is judged to exist and the collision detection terminates; otherwise, continuing to judge the relationship between the remaining nearest trajectory points and their corresponding regular geometric bodies;
S704: executing steps S701–S703 for the other points in the collision detection feature point set; if all the nearest trajectory points P_near,q lie outside their corresponding regular geometric bodies G_q, no collision is considered to occur.
To achieve the above object, the present invention further provides a computer-readable storage medium having a computer program stored thereon, which when executed by a processor implements the fast collision detection method as described in any one of the preceding claims.
In order to achieve the above object, the present invention also provides a rapid collision detection apparatus for robot machining of large complex components, including the computer-readable storage medium as described above and a processor for calling and processing a computer program stored in the computer-readable storage medium.
In general, the above technical solutions contemplated by the present invention can achieve the following advantageous effects compared to the prior art.
1. The invention provides an efficient and accurate collision detection method for robot machining of large complex components, which can provide a basis for area division, layout optimization, motion planning and the like in the robot machining process of large complex components.
2. The method uses the point cloud information obtained by off-line programming of the large complex curved surface, so no AABB, OBB, DOP, convex hull or similar model needs to be established; there is no tightness problem, and the collision detection accuracy is high. Meanwhile, the point cloud can be sparsified according to the characteristic size of the robot simplified model, and collision detection over the feature point set replaces collision detection over all points, so the collision detection efficiency is greatly improved.
3. The invention only requires the CAD model of the large complex component; it does not restrict the type of complex curved surface or the form of operation, is suitable for workpieces of various specifications and processes as well as for different robots, and therefore has strong universality and practicability.
Drawings
FIG. 1 is a flow chart of a method for detecting a fast collision in a robot process for machining a large complex component according to a preferred embodiment of the present invention;
FIG. 2 is a schematic diagram of a robot processing offline programming for large complex components according to a preferred embodiment of the present invention;
FIG. 3 is a diagram illustrating a robot DH parameterized model according to a preferred embodiment of the present invention;
FIG. 4 is a schematic diagram of a CAD prototype of a robot and a simplified model thereof, in which a is the CAD prototype and b is the simplified model;
FIG. 5 is a schematic diagram of a regular geometry of possible collisions in a simplified model of a robot according to an embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating determination of closest points in a set of collision detection points according to a preferred embodiment of the present invention;
fig. 7 is an analysis diagram of a collision detection principle provided by a preferred embodiment of the present invention, where a is a sphere collision detection principle, b is a cylinder collision detection principle, and c is a rectangular parallelepiped collision detection principle.
The same reference numbers will be used throughout the drawings to refer to the same or like elements or structures, wherein:
21, surface to be machined of the large complex component; 22, equidistant cross sections; 23, trajectory lines and trajectory points; 301-a, 301-b, robot base prototype and simplified model; 302-a, 302-b, link 1 prototype and simplified model; 303-a, 303-b, link 2 prototype and simplified model; 304-a, 304-b, link 3 prototype and simplified model; 305-a, 305-b, link 4 prototype and simplified model; 306-a, 306-b, link 5 prototype and simplified model; 307-a, 307-b, link 6 prototype and simplified model; 501, possibly colliding regular geometric body G_1; 502, possibly colliding regular geometric body G_2; 503, possibly colliding regular geometric body G_3; 504, possibly colliding regular geometric body G_4.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Fig. 1 is a flowchart of the rapid collision detection method for large complex component robot machining according to a preferred embodiment of the present invention. As shown in fig. 1, the rapid collision detection method constructed by the present invention mainly includes the following steps:
s1: and acquiring a CAD model of the surface to be processed of the large-scale complex component, and combining the processing technology to perform off-line programming on the surface to be processed to obtain a point cloud model of the area to be processed. The off-line programming method comprises the following specific steps:
S101: as shown in FIG. 2, the robot machining trajectory lines are obtained by the equal-section method; the section spacing d and the section normal vector are determined by the specific process parameters;
S102: discretizing each trajectory line obtained in step S101 by a uniform curve discretization method to obtain the machining trajectory points; the curve discretization method here may be any conventional method.
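As a non-limiting illustration (not part of the original disclosure), the uniform curve discretization of step S102 may be sketched as follows, assuming each trajectory line is available as an ordered 3-D polyline; the function name and data layout are hypothetical:

```python
import math

def discretize_polyline(points, step):
    """Resample an ordered 3-D polyline (one section trajectory line)
    at approximately uniform arc-length spacing `step`, always keeping
    the end point of the curve."""
    # cumulative arc length at each input vertex
    s = [0.0]
    for a, b in zip(points, points[1:]):
        s.append(s[-1] + math.dist(a, b))
    total = s[-1]
    targets = [k * step for k in range(int(total / step) + 1)]
    if targets[-1] < total - 1e-12:
        targets.append(total)          # keep the curve end point
    out, j = [], 0
    for t in targets:
        while j + 1 < len(s) and s[j + 1] < t:
            j += 1
        if s[j + 1] == s[j]:           # degenerate zero-length segment
            out.append(tuple(points[j]))
        else:
            # linear interpolation on segment j -> j+1
            u = (t - s[j]) / (s[j + 1] - s[j])
            out.append(tuple(points[j][k] + u * (points[j + 1][k] - points[j][k])
                             for k in range(3)))
    return out
```

For example, a 10 mm straight segment resampled at 1 mm spacing yields 11 trajectory points, including both end points.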
S2: robot kinematics modeling is performed by using a DH (Denavit-Hartenberg) method, and a DH parameterized model schematic diagram shown in FIG. 3 is given by taking an ABB IRB4600 robot as an example. Then, a regular geometric body is used for replacing a robot core component (for some more complex components, a plurality of regular geometric bodies can be adopted for combination and simplification), the regular geometric bodies are usually spheres, cylinders, cuboids and the like, the principle of minimum envelope volume is adopted for optimization, the regular geometric bodies are mainly used for replacing robot joints, connecting rods and the like, and assembly is carried out through a kinematic constraint relation, and a CAD prototype and a simplified model of the robot are respectively shown in a and b of FIG. 4.
S3: determining a robot task: presetting a robot base positionR0For the direction rotation matrix, t, of the robot base coordinate system relative to the world coordinate system0Is a position coordinate of the robot base coordinate system under a phase world coordinate system; and (3) specifying an operation area, namely a task target point set { TP }, { TP } is a subset of the point cloud model, and selecting from the point cloud model according to experience.
S4: determining a characteristic dimension L of a simplified model of a robotminBased on this, the task target point set { TP } in S3 is thinned. The characteristic size determination and task target point set { TP } sparsification method specifically comprises the following steps:
S401: the regular geometric bodies in the robot simplified model that may collide with the large complex curved surface, i.e. the regular geometric bodies 501–504 shown in FIG. 5, are determined through geometric analysis and denoted G_q, q = 1–4; the size parameters of these regular geometric bodies, i.e. the sphere diameter, the cylinder end-face diameter, the minimum cuboid side length and the like, are then determined respectively, and the minimum of all these size parameters is selected as the characteristic dimension L_min of the robot simplified model;
S402: numbering all trajectory points in the task target point set { TP } from left to right and from bottom to top, and denoting P_i,j as the j-th trajectory point on the i-th trajectory line in { TP };
S403: letting s = floor(L_min / d), and deleting all trajectory points whose trajectory-line number does not satisfy i = 1 + n·(s + 1), where floor(·) is the floor function, retaining only the integer part of its argument;
S404: thinning the trajectory points on each trajectory line retained in step S403. Taking the i-th trajectory line as an example, first denote P_i,1 as the key point P_key; then successively compute the distances d_i,j from P_key to the subsequent points P_i,j until a trajectory point P_i,k satisfying d_i,k ≤ L_min ≤ d_i,k+1 is found, and mark P_i,k as the new key point; all key points on the i-th trajectory line are determined by traversing according to this rule. In particular, the end point P_i,ni is automatically marked as a key point, where ni is the index of the last trajectory point of the i-th trajectory line. Finally, all non-key points are deleted, completing the thinning of the task target point set { TP } and yielding the sparse target point set { STP }.
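Steps S401–S404 above can be sketched as follows (a non-limiting illustration), assuming each trajectory line is an ordered list of 3-D points and that distances measured from a key point grow monotonically along the line; function names are hypothetical:

```python
import math

def sparsify(trajs, l_min, d):
    """Thin the task target point set {TP} (steps S401-S404).
    `trajs`: list of trajectory lines, each an ordered list of 3-D points;
    `d`: section spacing; `l_min`: characteristic dimension of the
    simplified robot model.  Returns the sparse target point set {STP}."""
    s = math.floor(l_min / d)
    # step S403: keep only trajectory lines numbered i = 1 + n*(s+1)
    kept = [t for i, t in enumerate(trajs, start=1) if (i - 1) % (s + 1) == 0]
    stp = []
    for traj in kept:
        keys = [traj[0]]                      # P_i,1 is the first key point
        for j in range(1, len(traj) - 1):
            # step S404: P_i,k becomes a key point when
            # d_i,k <= L_min <= d_i,k+1, distances measured from the
            # current key point (assumed to grow along the line)
            if math.dist(keys[-1], traj[j]) <= l_min <= math.dist(keys[-1], traj[j + 1]):
                keys.append(traj[j])
        if len(traj) > 1:
            keys.append(traj[-1])             # end point is always a key point
        stp.append(keys)
    return stp
```

With points at 1 mm pitch and L_min = 2.5 mm, only every third trajectory line is kept and key points on each kept line end up roughly 2 mm apart.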
S5: and performing multi-solution screening according to the rigidity performance and the collision probability of the robot, determining an optimal solution group, and completing inverse kinematics solution. The specific steps of inverse kinematics solution are as follows:
S501: according to the robot kinematics model established in step S2, deriving the robot inverse kinematics analytical expression Θ = ikine(T), where T is the 4 × 4 homogeneous pose matrix of the robot tool centre point, ikine(·) is the inverse kinematics solution function, and Θ is the N_IK × N_dof joint angle matrix obtained by the inverse kinematics solution, N_IK being the maximum number of inverse kinematics solutions at the same pose and N_dof the number of degrees of freedom of the robot; taking the ABB IRB4600 robot as an example, N_IK = 8 and N_dof = 6;
S502: randomly selecting N different points P_1–P_N in the sparse target point set { STP } obtained in step S4, and performing inverse kinematics calculation for each of them to obtain all inverse kinematics solutions corresponding to each point; the inverse solutions of P_m are denoted θ_m,l = ikine(T_m)_l, m = 1, 2, …, N, l = 1, 2, …, N_IK, where N_IK is the number of groups of robot inverse solutions and ikine(T_m)_l is the expression corresponding to the l-th group inverse solution of point P_m;
S503: computing, for each point, the robot stiffness ks_m,l in the configuration corresponding to each inverse solution θ_m,l; the stiffness index is computed as ks = 1/‖C_tt‖, with the compliance matrix C = K⁻¹ = J·K_θ⁻¹·Jᵀ, where C is the robot compliance matrix (i.e. the inverse of the Cartesian stiffness matrix K), J is the robot Jacobian matrix, K_θ is the robot joint stiffness matrix, and C_tt is the 3 × 3 upper-left sub-matrix of C; then the average stiffness of each solution group over the sample points is computed as ks̄_l = (1/N)·Σ_{m=1…N} ks_m,l, where ks_m,l is the robot stiffness in the configuration corresponding to θ_m,l;
S504: recording the solution group number l = No at which the average stiffness ks̄_l is maximal, and determining the unique inverse solution expression as θ_m,No = ikine(T_m)_No;
S505: using the unique inverse solution expression of step S504 to perform a traversal inverse kinematics solution for each point in the sparse target point set { STP }, obtaining the optimal inverse solution θ_m,No corresponding to every trajectory point P_m, m = 1, 2, …, N_P, where N_P is the number of trajectory points in { STP }.
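Once the stiffness values ks_m,l have been evaluated, the multi-solution screening of steps S502–S504 reduces to picking the solution group with the largest average stiffness. A minimal sketch (with the stiffness matrix supplied externally, since the Jacobian and joint stiffness model are robot-specific and not reproduced here):

```python
def best_solution_group(ks):
    """Select the optimal inverse-kinematics solution group (steps
    S502-S504).  ks[m][l] is the robot stiffness ks_{m,l} of the
    configuration given by the l-th solution group at sample point P_m.
    Returns the group index No whose average stiffness over the N
    sample points is largest."""
    n = len(ks)
    n_groups = len(ks[0])
    # average stiffness of each solution group over the N sample points
    avg = [sum(ks[m][l] for m in range(n)) / n for l in range(n_groups)]
    return max(range(n_groups), key=lambda l: avg[l])
```

The returned index No then fixes the single inverse-solution expression used for all remaining points of { STP }.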
S6: determining a collision detection characteristic point set, wherein the characteristic point set is defined as a set of most probable collision trajectory points in a sparse target point set { STP }, and the determination method specifically comprises the following steps:
S601: traversing all trajectory points in the sparse target point set { STP } to compute their distances D_m = dist(P_m, O_2) to the origin O_2 of robot link coordinate frame {2}, and determining the trajectory points P_min and P_max at which D_m takes its minimum and maximum values; preferably, when computing the distance D_m, the approximation D_m = dist(P_m, O_1) may be used to find P_min and P_max more quickly, where dist(·,·) is the function computing the distance between two points;
S602: as shown in fig. 5, finding among all trajectory points in the sparse target point set { STP } the trajectory point P'_min at which α = ∠P_m O_2 O_3 takes its minimum value; preferably, ∠P_m O_1 O_3 may be used to approximate ∠P_m O_2 O_3, i.e. the trajectory point at which α' = ∠P_m O_1 O_3 takes its minimum value may approximately be selected as P'_min;
S603: determining the boundary corner points P_c1–P_c4 of the sparse target point set { STP }, where the boundary corner points are defined as the end points of the first and last trajectory lines in { STP };
S604: merging P'_min, P_max and P_c1–P_c4 into the collision detection feature point set { P }, and removing duplicate points from { P }.
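The feature-point-set construction of steps S601–S604 may be sketched as follows (illustrative only), with { STP } flattened into one list of points and the link-frame origins O_2, O_3 and the boundary corner points supplied as inputs; the nearest point P_min is found analogously to P_max but, per step S604, does not enter the merged set:

```python
import math

def feature_points(stp_flat, o2, o3, corners):
    """Assemble the collision-detection feature point set {P}
    (steps S601-S604): the farthest point P_max from O2, the point
    P'_min minimising the angle  P-O2-O3, and the boundary corner
    points, with duplicates removed."""
    p_max = max(stp_flat, key=lambda p: math.dist(p, o2))

    def angle(p):
        # angle at O2 between vectors O2->P and O2->O3
        v1 = [p[k] - o2[k] for k in range(3)]
        v2 = [o3[k] - o2[k] for k in range(3)]
        dot = sum(a * b for a, b in zip(v1, v2))
        n1 = math.sqrt(sum(a * a for a in v1)) or 1e-12
        n2 = math.sqrt(sum(a * a for a in v2)) or 1e-12
        return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

    p_ang = min(stp_flat, key=angle)          # P'_min
    merged = []
    for p in [p_ang, p_max, *corners]:        # step S604, order-preserving
        if p not in merged:
            merged.append(p)
    return merged
```

The resulting short list of points is what the traversal collision check of step S7 iterates over instead of the full point cloud.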
S7: and traversing collision detection between the robot space configuration corresponding to each point in the feature point set obtained by the step S6 and the large-scale complex component, wherein the collision detection process can be performed according to a conventional collision detection method in the field. Preferably, the present embodiment provides a collision detection method, which has the following principle:
S701: letting P_k be one point in the collision detection feature point set { P }, and computing, from the inverse kinematics solution result of step S505, the spatial pose of each possibly colliding regular geometric body G_q in the simplified model when the robot tool centre point is located at P_k;
S702: sequentially computing the distances D_m,q = dist(P_m, O_q) from all trajectory points P_m in the sparse target point set { STP } to the centroid O_q of each possibly colliding regular geometric body G_q, and finding the trajectory point P_near,q nearest to each centroid;
S703: sequentially judging whether each nearest trajectory point P_near,q lies inside the corresponding regular geometric body G_q; if P_near,q lies inside G_q, a collision is judged to occur and the detection terminates; otherwise, continuing to judge the relationship between the remaining nearest trajectory points and their corresponding regular geometric bodies;
S704: executing steps S701–S703 for the other points in the collision detection feature point set { P }; if all the nearest trajectory points P_near,q lie outside their corresponding regular geometric bodies, no collision is considered to occur.
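The traversal collision check of steps S701–S704 can be sketched as below (a non-limiting illustration); `geoms_at` is a hypothetical stand-in for the forward-kinematics step S701 that places the possibly colliding regular geometric bodies for a given feature point:

```python
import math

def detect_collision(feat_pts, stp_flat, geoms_at):
    """Traversal collision check over the feature point set (S701-S704).
    `geoms_at(pk)` is assumed to return, for the robot configuration
    whose tool centre point is at feature point `pk`, a list of
    (centroid, inside_fn) pairs - one per possibly colliding regular
    geometric body G_q - where inside_fn(p) tests containment in G_q."""
    for pk in feat_pts:
        for centroid, inside in geoms_at(pk):
            # S702: nearest {STP} point to this geometry's centroid
            p_near = min(stp_flat, key=lambda p: math.dist(p, centroid))
            if inside(p_near):          # S703: containment => collision
                return True, pk
    return False, None                  # S704: no collision anywhere
```

Only the nearest trajectory point per geometric body is ever tested for containment, which is what keeps the global check cheap.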
Preferably, as shown in fig. 6, the formulas for judging whether a point P lies inside a regular geometric body G (sphere, cylinder, cuboid) are:
Sphere: dmin(P, G) = dist(P, O) − s·r;
Cylinder: dmin(P, G) = max( dist(P_1, O) − s·r, dist(P_2, O) − s·h/2 );
Cuboid: dmin(P, G) = max( dist(P_2, O) − s·l/2, dist(P_3, O) − s·w/2, dist(P_4, O) − s·h/2 );
where dmin(P, G) ≤ 0 indicates that point P is inside the regular geometric body G, and otherwise point P is outside G; O is the centroid of the regular geometric body G; for b of FIG. 7, P_1 is the projection of point P onto the XOY plane and P_2 is the projection of point P onto the Z axis of the coordinate frame; for c of FIG. 7, P_2, P_3, P_4 are respectively the projections of point P onto the X, Y and Z axes of the coordinate frame; s is a safety factor, preferably with a value of 1–1.5; r is the radius of the sphere (or of the cylinder end face); and l, w, h are respectively the length, width and height of the cuboid (h also denoting the height of the cylinder).
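These containment tests may be sketched as signed-distance functions. The sphere formula follows the patent text directly, while the cylinder and cuboid variants are inferred from the projection definitions (local z taken as the cylinder axis, cuboid faces assumed parallel to the local axes) and are therefore assumptions, not the verbatim disclosure:

```python
import math

def dmin_sphere(p, o, r, s=1.2):
    """dmin <= 0  =>  p lies inside the safety-scaled sphere
    (centroid o, radius r, safety factor s in 1..1.5)."""
    return math.dist(p, o) - s * r

def dmin_cylinder(p, o, r, h, s=1.2):
    """Cylinder axis assumed along local z through centroid o."""
    radial = math.hypot(p[0] - o[0], p[1] - o[1]) - s * r
    axial = abs(p[2] - o[2]) - s * h / 2.0
    return max(radial, axial)

def dmin_cuboid(p, o, l, w, h, s=1.2):
    """Cuboid faces assumed parallel to the local coordinate axes."""
    return max(abs(p[0] - o[0]) - s * l / 2.0,
               abs(p[1] - o[1]) - s * w / 2.0,
               abs(p[2] - o[2]) - s * h / 2.0)
```

In each case the geometry is grown by the safety factor s before the test, so a point just outside the nominal body can still be flagged as a collision.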
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (7)
1. A rapid collision detection method for large-scale complex component robot processing is characterized by comprising the following steps:
s1: acquiring a CAD model of a surface to be processed of a large-scale complex component, and combining a processing technology to perform off-line programming on the surface to be processed to obtain a point cloud model of a region to be processed;
s2: establishing a robot kinematic model by using a DH method, and replacing a robot core component by using a regular geometric body with the minimum envelope volume to obtain a robot simplified model for collision detection;
s3: presetting the pose of a robot base, and designating a working area, wherein the working area is marked as a task target point set { TP };
s4: according to the characteristic size of the robot simplified model in the step S2, performing sparsification on the task target point set { TP } obtained in the step S3 to obtain a sparse target point set { STP };
the step of performing the sparsification on the task target point set { TP } in step S4 is as follows:
S401: determining the characteristic dimension L_min of the robot simplified model, where L_min is defined as the minimum of all the size parameters of the regular geometric bodies that may collide with the large complex curved surface;
S402: numbering all trajectory points in the task target point set { TP } from left to right and from bottom to top, and denoting P_i,j as the j-th trajectory point on the i-th trajectory line in { TP };
S403: letting s = floor(L_min / d), and deleting all trajectory points whose trajectory-line number does not satisfy i = 1 + n·(s + 1), where floor(·) is the floor function, meaning that only the integer part of its argument is retained, and d is the section spacing used when obtaining the robot machining trajectory lines by the equal-section method;
s404: deleting track points which are too close to each other on each track line obtained after deletion in the step S403;
s5: determining the optimal inverse kinematics unique solution expression of the robot according to the rigidity performance, and completing inverse kinematics solution of all track points in the sparse target point set { STP }; recording a solution group with the maximum average rigidity and determining an inverse solution expression corresponding to the solution group as an optimal inverse kinematics unique inverse solution expression;
s6: screening out a characteristic point set which is most likely to collide from a sparse target point set { STP };
the method for determining the set of collision detection feature points in step S6 includes the following sub-steps:
S601: traversing all track points in the sparse target point set {STP}, calculating the distance D_m = dist(P_m, O_2) from each point P_m to the origin O_2 of robot link coordinate frame {2}, and taking the track points P_min and P_max corresponding to the minimum and maximum values of D_m;
S602: finding, among all track points P_m in the sparse target point set {STP}, the track point P'_min that minimizes the angle α = ∠P_m O_2 O_3 formed by the vectors O_2P_m and O_2O_3, O_3 being the origin of link coordinate frame {3};
S603: determining the boundary corner points P_c1 to P_c4 of the sparse target point set {STP};
S604: combining P'_min, P_max, and P_c1 to P_c4 into a collision detection feature point set {P} and removing duplicate points;
S7: traversing collision detection between the robot spatial configuration corresponding to each point in the feature point set obtained in step S6 and the large complex component.
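The feature-point screening of steps S601–S604 can be sketched as below (an illustrative sketch only; it assumes the boundary corner points Pc1–Pc4 have already been extracted elsewhere, and the tuple-based layout and helper names are assumptions introduced here):

```python
import math

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def norm(a):   return math.sqrt(a[0]**2 + a[1]**2 + a[2]**2)
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def angle_at_o2(p, o2, o3):
    """Angle alpha = angle(P, O2, O3) between the vectors O2->P and O2->O3."""
    u, v = sub(p, o2), sub(o3, o2)
    c = dot(u, v) / (norm(u) * norm(v))
    return math.acos(max(-1.0, min(1.0, c)))   # clamp for float safety

def feature_point_set(stp, o2, o3, corners):
    """stp: sparse target points; corners: boundary corner points Pc1..Pc4."""
    p_max = max(stp, key=lambda p: norm(sub(p, o2)))              # S601: max D_m
    p_prime_min = min(stp, key=lambda p: angle_at_o2(p, o2, o3))  # S602: min alpha
    uniq = []                                                     # S604: merge, dedupe
    for p in [p_prime_min, p_max, *corners]:
        if p not in uniq:
            uniq.append(p)
    return uniq
```

The result is the small set {P} on which the expensive full collision checks of step S7 are actually run.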
2. The rapid collision detection method according to claim 1, wherein the off-line programming method in step S1 includes the following sub-steps:
S101: acquiring the robot machining track lines by the equal section method;
S102: discretizing each machining track line obtained in step S101 by a uniform curve discretization method to obtain the machining track points.
3. The rapid collision detection method according to claim 1, wherein step S404 is as follows: for any i-th track line, first mark P_{i,1} as the key point P_key; then calculate in sequence the distances d_{i,j} from P_key to the subsequent points P_{i,j} until a track point P_{i,k} satisfying d_{i,k} ≤ L_min ≤ d_{i,k+1} is found, and mark P_{i,k} as the new key point; traverse by this rule to determine all key points on the i-th track line, the end point P_{i,ni} being automatically recorded as a key point, ni being the number of the last track point of the i-th track line; perform this operation in sequence on each track line retained after step S403 and delete all non-key points to obtain the sparse target point set {STP}.
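The key-point walk of step S404 can be sketched as follows (an illustrative sketch, not the claimed implementation; it assumes consecutive track points are spaced more finely than L_min so a pair with d_{i,k} ≤ L_min ≤ d_{i,k+1} is always found):

```python
import math

def thin_along_line(line, l_min):
    """Key-point thinning of one track line per step S404 / claim 3."""
    n = len(line)
    if n < 2:
        return list(line)
    keys = [line[0]]            # P_{i,1} is the first key point
    k = 0                       # index of the current key point
    for j in range(1, n - 1):
        d_j  = math.dist(line[k], line[j])
        d_j1 = math.dist(line[k], line[j + 1])
        if d_j <= l_min <= d_j1:  # found d_{i,k} <= L_min <= d_{i,k+1}
            keys.append(line[j])
            k = j
    keys.append(line[-1])       # P_{i,ni} is always recorded as a key point
    return keys
```

On a straight line sampled every 0.2 with l_min = 0.5, the surviving key points are spaced roughly 0.4 apart, plus the guaranteed end point.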
4. The rapid collision detection method according to any one of claims 1 to 3, wherein the step S5 includes the following sub-steps:
S501: deriving the analytical inverse kinematics expression of the robot from the kinematic model established in step S2;
S502: randomly selecting N different points P_1 to P_N in the sparse target point set {STP} obtained in step S4 and performing inverse kinematics for each of them to obtain all inverse kinematics solutions of every point; denoting the inverse solutions of P_m as θ_{m,l}, m = 1, 2, …, N, l = 1, 2, …, N_IK, where θ_{m,l} is the l-th group inverse kinematics solution of point P_m and N_IK is the number of inverse solution groups of P_m;
S503: calculating the average robot stiffness of each solution group l over the N sample points, K̄s_l = (1/N)·Σ_{m=1}^{N} Ks_{m,l}, where Ks_{m,l} is the robot stiffness in the configuration corresponding to inverse solution θ_{m,l};
S504: recording the group number No for which the average stiffness K̄s_l is maximum, and determining the inverse solution expression corresponding to that group as the optimal unique inverse kinematics expression θ_{m,No};
S505: using the unique inverse solution expression θ_{m,No} obtained in step S504 to perform the inverse kinematics solution of the sparse target point set {STP}, obtaining the inverse solution θ_{m,No} of every track point P_m, m = 1, 2, …, N_P, where N_P is the total number of track points in {STP}.
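The group selection of steps S503–S504 can be sketched as below (an illustrative sketch; the stiffness index Ks is not specified in the claims, so it is passed in as a caller-supplied callable, and all names here are assumptions):

```python
def pick_best_group(solutions, stiffness):
    """Return the group number No with maximum average stiffness (S504).

    solutions[m][l] : inverse solution theta_{m,l} (joint-angle tuple) of
                      sample point P_m, with the same number of groups N_IK
                      for every sampled point.
    stiffness       : callable mapping one joint configuration to a scalar
                      stiffness index Ks (the stiffness model is assumed).
    """
    n = len(solutions)
    n_ik = len(solutions[0])
    # S503: average stiffness of each solution group l over the N points
    avg = [sum(stiffness(solutions[m][l]) for m in range(n)) / n
           for l in range(n_ik)]
    # S504: argmax over the group index
    return max(range(n_ik), key=lambda l: avg[l])
```

The returned index is then used to solve every track point in {STP} with that single inverse-solution branch, as step S505 describes.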
5. The rapid collision detection method according to claim 1, wherein the collision detection principle in step S7 is specifically:
S701: letting P_k be a point in the collision detection feature point set {P}; according to the inverse kinematics result of step S505, calculating the pose of each regular geometric body G_q, q = 1, 2, …, Q, of the simplified model that may collide when the robot tool center point is located at P_k, Q being the number of such regular geometric bodies;
S702: calculating in sequence, for each regular geometric body G_q, the distance from every track point P_m in the sparse target point set {STP} to the centroid C_q of G_q, and finding the track point P*_q nearest to each centroid C_q;
S703: judging in sequence whether each nearest track point P*_q lies inside its corresponding regular geometric body G_q; if some P*_q lies inside G_q, judging it to be a collision point and terminating the detection; otherwise, continuing to judge the relationship between the remaining track points and the corresponding regular geometric bodies;
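The nearest-point test of steps S702–S703 can be sketched as follows (an illustrative sketch; the actual regular geometric bodies come from the simplified model of step S2, while the sphere predicate and all names below are assumptions introduced for illustration):

```python
import math

def sphere(center, radius):
    """Example 'inside' predicate for a spherical envelope (assumed shape)."""
    return lambda p: math.dist(p, center) <= radius

def detect_collision(stp, geometries):
    """Steps S702-S703: for each posed regular geometric body, test only the
    track point nearest to its centroid.

    geometries : list of (centroid, inside) pairs, where inside(p) reports
                 whether point p lies within that body.
    """
    for centroid, inside in geometries:
        nearest = min(stp, key=lambda p: math.dist(p, centroid))  # S702
        if inside(nearest):                                       # S703
            return True   # the configuration collides with the component
    return False
```

Testing only the centroid-nearest point per body is what makes the check fast; a full point-in-body sweep over {STP} is avoided unless a finer verdict is needed.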
6. A computer-readable storage medium having a computer program stored thereon which, when executed by a processor, implements the rapid collision detection method according to any one of claims 1 to 5.
7. A rapid collision detection device for robot machining of large complex components, comprising the computer-readable storage medium according to claim 6 and a processor for invoking and executing the computer program stored in the computer-readable storage medium.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011067113.5A CN112372631B (en) | 2020-10-05 | 2020-10-05 | Rapid collision detection method and device for robot machining of large complex component |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112372631A (en) | 2021-02-19 |
CN112372631B (en) | 2022-03-15 |
Family
ID=74580976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011067113.5A Active CN112372631B (en) | 2020-10-05 | 2020-10-05 | Rapid collision detection method and device for robot machining of large complex component |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112372631B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023274000A1 (en) * | 2021-06-29 | 2023-01-05 | 武汉联影智融医疗科技有限公司 | Robot system, and evaluation method and control method therefor |
CN113987666B (en) * | 2021-12-29 | 2022-08-12 | 深圳市毕美科技有限公司 | BIM (building information modeling) model examination method, device, equipment and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105945942A (en) * | 2016-04-05 | 2016-09-21 | 广东工业大学 | Robot off line programming system and method |
CN108213757A (en) * | 2018-01-16 | 2018-06-29 | 华东理工大学 | A kind of collision checking method for welding robot |
US10477180B1 (en) * | 2018-05-22 | 2019-11-12 | Faro Technologies, Inc. | Photogrammetry system and method of operation |
CN110842918A (en) * | 2019-10-24 | 2020-02-28 | 华中科技大学 | Robot mobile processing autonomous locating method based on point cloud servo |
WO2020078784A1 (en) * | 2018-10-15 | 2020-04-23 | Schuler Pressen Gmbh | Method for checking for freedom of movement |
CN111429514A (en) * | 2020-03-11 | 2020-07-17 | 浙江大学 | Laser radar 3D real-time target detection method fusing multi-frame time sequence point clouds |
CN111496849A (en) * | 2020-07-01 | 2020-08-07 | 佛山隆深机器人有限公司 | Method for detecting rapid collision between material frame and clamp |
CN111538335A (en) * | 2020-05-15 | 2020-08-14 | 深圳国信泰富科技有限公司 | Anti-collision method of driving robot |
WO2020182591A1 (en) * | 2019-03-08 | 2020-09-17 | Osram Gmbh | Component for a lidar sensor system, lidar sensor system, lidar sensor device, method for a lidar sensor system and method for a lidar sensor device |
EP3715782A1 (en) * | 2019-03-28 | 2020-09-30 | Nikon Metrology NV | Method for slot inspection |
Non-Patent Citations (3)
Title |
---|
Zutao Zhang; A Novel Vehicle Reversing Speed Control Based on Obstacle Detection and Sparse Representation; IEEE Transactions on Intelligent Transportation Systems; 2014-10-08; vol. 3, no. 16; pp. 1321-1334 *
Gan Yahui et al.; External-sensor-free robot collision detection based on fast dynamics identification; Control Theory & Applications; 2019-09-30; vol. 36, no. 9; pp. 1509-1519 *
Tao Bo et al.; Research on robotic mobile machining technology for large complex components; Science China Press; 2018-11-05; vol. 48, no. 12 (2018); pp. 1302-1312 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Shen et al. | Research on large-scale additive manufacturing based on multi-robot collaboration technology | |
CN110039538B (en) | Robot track planning method based on large complex component point cloud information | |
Vahedi et al. | Caging polygons with two and three fingers | |
CN112372631B (en) | Rapid collision detection method and device for robot machining of large complex component | |
CN109876968B (en) | Automatic path planning method for steel structure robot spraying | |
CN107972034B (en) | Complex workpiece trajectory planning simulation system based on ROS platform | |
CN112439601B (en) | Spraying robot automatic trajectory planning method for outer vertical surface of large ship | |
CN112508895B (en) | Propeller blade quality assessment method based on curved surface registration | |
CN109597354B (en) | Multi-constraint numerical control machining track generation method of triangular mesh model | |
CN111975767A (en) | Multi-robot visual detection system collaborative motion planning method based on multi-stage task allocation | |
Shen et al. | An image-based algorithm for generating smooth and interference-free five-axis sweep scanning path | |
CN108227620B (en) | Robot spraying track generation method based on three-dimensional model | |
CN110363801B (en) | Method for matching corresponding points of workpiece real object and three-dimensional CAD (computer-aided design) model of workpiece | |
Li et al. | A novel path generation method of onsite 5-axis surface inspection using the dual-cubic NURBS representation | |
CN113536488A (en) | Blank quality containment analysis and allowance optimization method based on registration algorithm | |
CN115937468A (en) | Automatic generation method for machining program of countless-module robot | |
Dai et al. | Multiaxis wire and arc additive manufacturing for overhangs based on conical substrates | |
Sadaoui et al. | Touch probe measurement in dimensional metrology: A review | |
Bhatt et al. | Incorporating tool contact considerations in tool-path planning for robotic operations | |
Wang et al. | A deep learning based automatic surface segmentation algorithm for painting large-size aircraft with 6-DOF robot | |
Du et al. | Wire arc additive manufacturing from the perspective of remanufacturing: A review of data processing | |
CN111898219B (en) | Area division method and equipment for large-scale complex component robotic surface machining | |
CN116680958A (en) | Robot spraying track optimization method for bilateral cooperation of fan blades | |
Liu et al. | Direct 5-axis tool posture local collision-free area generation for point clouds | |
CN110370276A (en) | The industrial robot machining locus automatic planning decomposed based on threedimensional model Morse |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||