CN108519055A - A vision-based online calibration method for the relative pose of two robots - Google Patents
- Publication number: CN108519055A (Application CN201810385326.9A)
- Authority: CN (China)
- Prior art keywords: coordinate system, robot, target, camera, pose
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
Abstract
The invention belongs to the field of multi-robot relative pose calibration, and specifically discloses a vision-based online calibration method for the relative pose of two robots, comprising the following steps. First, determine offline the pose relationship ^A T_C between the base coordinate system {A} of robot A and the camera coordinate system {C}, and the pose relationship ^B T_M between the base coordinate system {B} of robot B and the target coordinate system {M}. Second, obtain online, by vision measurement, the pose relationship ^C T_M between the camera coordinate system and the target coordinate system; from ^A T_C, ^B T_M and ^C T_M, close the loop of coordinate systems to build the calibration equation and calculate the relative pose ^A T_B of the two robots, completing the calibration of the two base coordinate systems. The invention needs only one offline determination to perform multiple high-precision online calibrations on demand, and suits occasions where the relative pose of multiple robots changes often, the precision required for cooperative work is high, and relative pose calibration must be carried out frequently and quickly.
Description
Technical Field
The invention belongs to the field of robot relative pose calibration, and particularly relates to a vision-based double-robot relative pose online calibration method.
Background
With the development of robotics and the increasing complexity of working environments and tasks, multi-robot cooperative operation, typified by two robots, is required on more and more occasions, for example: hole-making, assembly and milling of thin-walled parts such as aircraft skins; dual-robot grinding and polishing of large wind-turbine blades; and traditional dual-robot coordinated welding, spraying, handling, and the like.
Because the manufacturing industry typically imposes exacting precision requirements, each robot in a dual/multi-robot system must accurately know the relative pose relationship between the base coordinate systems of the others when operating cooperatively. In existing dual/multi-robot systems, however, the positions of the robots are usually fixed throughout the operation, and once the position of any robot in the system changes, a rather complex calibration process must be carried out again.
Currently, existing calibration methods fall into two main types: contact methods and non-contact methods. A contact method uses calibration tools at the ends of the two robots to perform several motions satisfying a specific constraint, then constructs a closed kinematic chain and establishes a set of calibration equations to solve the base-frame relationship of the two robots; by principle it includes the three-point method, the four-point method and so on, and by calibration tool it can be divided into the calibration-needle, shaft-hole and calibration-plate methods and the like. A non-contact method records the motion of marker points at the end of each arm with three-dimensional measuring equipment (such as a laser tracker, a vision sensor or an electronic theodolite) and establishes the corresponding set of calibration equations to solve the base-frame relationship of the two robots. The contact method has a simple principle but requires heavy manual participation and is inefficient; the non-contact method requires relatively complicated preparation before calibration, and its calibration speed still remains at the minute level, which is slow.
Disclosure of Invention
Aiming at the defects or improvement needs of the prior art, the invention provides a vision-based online calibration method for the relative pose of two robots, which realizes calibration of the relative pose of cooperating robots by constructing two intermediate coordinate systems, namely a camera coordinate system {C} and a target coordinate system {M}, combining them with the base coordinate systems {A} and {B} of the two robots, and adopting a mode that combines offline measurement with online calibration.

To achieve this purpose, the invention provides a vision-based online calibration method for the relative pose of two robots. The method constructs two intermediate coordinate systems, a camera coordinate system {C} and a target coordinate system {M}, through a camera C and a target M, wherein the camera C is fixedly connected with robot A, whose base coordinate system is {A}, and the target M is fixedly connected with robot B, whose base coordinate system is {B}. The method solves the relative pose relationship ^A T_B of the two robots by combining offline measurement with online measurement, and comprises the following steps:

S1, offline measurement: respectively determine the relative pose relationship ^A T_C between the base coordinate system {A} of robot A and the camera coordinate system {C}, and the relative pose relationship ^B T_M between the base coordinate system {B} of robot B and the target coordinate system {M};

S2, online measurement: obtain the relative pose relationship ^C T_M between the camera coordinate system {C} and the target coordinate system {M} by vision measurement; from the pose matrices ^A T_C, ^B T_M and ^C T_M, construct the calibration equation ^A T_B = ^A T_C · ^C T_M · (^B T_M)⁻¹, from which the relative pose ^A T_B between the base coordinate systems of the two robots is calculated, completing the calibration of the two base coordinate systems.
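The calibration equation above is a pure composition of homogeneous transforms closing the loop {A}→{C}→{M}→{B}. As an illustrative numerical sketch (not part of the claimed method; the transform values are hypothetical), it can be evaluated as:

```python
import numpy as np

def hom(R, p):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

def inv(T):
    """Closed-form inverse of a homogeneous transform: [R^T, -R^T p; 0, 1]."""
    R, p = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ p
    return Ti

# Hypothetical example values for the three measured transforms
A_T_C = hom(np.eye(3), [0.5, 0.0, 1.0])    # base {A} -> camera {C}  (offline)
C_T_M = hom(np.eye(3), [0.0, 2.0, 0.0])    # camera {C} -> target {M} (online, vision)
B_T_M = hom(np.eye(3), [0.0, -0.3, 1.0])   # base {B} -> target {M}  (offline)

# Calibration equation: ^A T_B = ^A T_C · ^C T_M · (^B T_M)^-1
A_T_B = A_T_C @ C_T_M @ inv(B_T_M)
```

With identity rotations, the result reduces to the vector sum of the translations along the loop, which makes the closure easy to check by hand.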
Further preferably, the pose relationship ^A T_C between the base coordinate system {A} of robot A and the camera coordinate system {C} is determined in step S1 through the following substeps:

S11, mount the camera C on a bracket fixedly connected with robot A, and mount a target on the end flange of robot A; this target is denoted M1;

S12, adjust the pose of robot A so that target M1 appears completely within the field of view of camera C, record the joint angles of robot A, and then calculate the pose matrix ^A T_M1 between robot A and target M1;

S13, acquire an image of target M1 with camera C, solve the pose matrix ^C T_M1 between camera C and M1 from the image, and then obtain ^A T_C = ^A T_M1 · (^C T_M1)⁻¹;

S14, repeat steps S12–S13 and solve for the expected value of ^A T_C.
Further preferably, the pose relationship ^B T_M between the base coordinate system {B} of robot B and the target coordinate system {M} is determined in step S1 through the following substeps:

S15, detach the target from the end flange of robot A and mount it on the end flange of robot B; this target is now denoted M2;

S16, adjust robots A and B respectively so that target M2 appears completely within the field of view of camera C, record the joint angles of robot B, and then calculate the pose matrix ^B T_M2 between robot B and target M2;

S17, acquire an image of target M2 with camera C, and solve the pose matrix ^C T_M2 between camera C and M2 from the image;

S18, detach the target from the end flange of robot B and mount it on a bracket fixedly connected with robot B; the target is now denoted M;

S19, acquire an image of target M with camera C, solve the pose matrix ^C T_M between camera C and M from the image, and then obtain ^B T_M = ^B T_M2 · (^C T_M2)⁻¹ · ^C T_M;

S110, repeat steps S16–S19 and solve for the expected value of ^B T_M.
As a further preference, the expected value of ^A T_C or ^B T_M is solved as follows:

1) decompose the pose matrix ^A T_C or ^B T_M to obtain its rotation component ^A R_C and translation component ^A P_C, or its rotation component ^B R_M and translation component ^B P_M;

2) convert the rotation component ^A R_C or ^B R_M into RPY-angle form, sum and average the RPY angles, and convert the result back into rotation-matrix form to obtain the expected rotation component;

3) sum and average the translation component ^A P_C or ^B P_M to obtain the expected translation component;

4) express the expected rotation component and the expected translation component together as a homogeneous transformation matrix to obtain the expected value of ^A T_C or ^B T_M.
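The four averaging steps above can be sketched as follows. This is an illustrative implementation, not the patent's own code: the helper names (`rpy_from_R`, `average_pose`) and the fixed-angle XYZ convention R = R_Z(α)·R_Y(β)·R_X(γ) are assumptions.

```python
import numpy as np

def rpy_from_R(R):
    """Extract fixed-angle XYZ (RPY) angles (gamma, beta, alpha) from
    R = Rz(alpha) @ Ry(beta) @ Rx(gamma), assuming beta away from +-90 deg."""
    beta = np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0]))
    alpha = np.arctan2(R[1, 0], R[0, 0])
    gamma = np.arctan2(R[2, 1], R[2, 2])
    return gamma, beta, alpha

def R_from_rpy(gamma, beta, alpha):
    """Rebuild the rotation matrix from the averaged RPY angles."""
    cg, sg = np.cos(gamma), np.sin(gamma)
    cb, sb = np.cos(beta), np.sin(beta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    return Rz @ Ry @ Rx

def average_pose(Ts):
    """Expected value of n pose samples: mean RPY angles for the rotation
    component, elementwise mean for the translation component."""
    rpys = np.array([rpy_from_R(T[:3, :3]) for T in Ts])
    g, b, a = rpys.mean(axis=0)
    T = np.eye(4)
    T[:3, :3] = R_from_rpy(g, b, a)
    T[:3, 3] = np.mean([Ti[:3, 3] for Ti in Ts], axis=0)
    return T
```

Averaging Euler angles directly is only well behaved when the samples are close together, which holds here since all samples measure the same fixed transform.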
Further preferably, in step S2 the camera C acquires an image of the target M, and the pose matrix ^C T_M between the camera coordinate system {C} and the target coordinate system {M} is solved from the image.

As a further preference, the corresponding pose matrix is solved from the image in the following manner:

1) select any 3 non-collinear marker points on the target image, denoted P1, P2 and P3, and solve the pose matrix ^C T_O between the coordinate system {O} formed by the 3 marker points and the camera coordinate system {C}, wherein the origin of {O} coincides with the centroid of triangle P1P2P3, and:

X_O = (P2 − P1) / |P2 − P1|,
Z_O = (P2 − P1) × (P3 − P1) / |(P2 − P1) × (P3 − P1)|,
Y_O = Z_O × X_O,

where X_O, Y_O, Z_O are the directions of the X, Y and Z axes of {O}; (P2 − P1) is the vector from point P1 to P2, with modulus |P2 − P1|; (P3 − P1) is the vector from point P1 to P3, with modulus |P3 − P1|; and (P2 − P1) × (P3 − P1) is their cross product, with modulus |(P2 − P1) × (P3 − P1)|;

2) calculate the pose matrix ^M T_O between the coordinate system {M} and the coordinate system {O}, and from ^C T_O and ^M T_O compute ^C T_M = ^C T_O · (^M T_O)⁻¹.
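The frame construction from three marker points and the closure ^C T_M = ^C T_O · (^M T_O)⁻¹ can be sketched as follows; `frame_from_points` is an illustrative helper and the marker coordinates are hypothetical (in practice the {C}-frame coordinates would come from a PnP solver).

```python
import numpy as np

def frame_from_points(p1, p2, p3):
    """Pose of the frame {O} defined by three non-collinear points:
    X along P1->P2, Z along (P1->P2) x (P1->P3), Y = Z x X,
    origin at the centroid of triangle P1P2P3."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    x = (p2 - p1) / np.linalg.norm(p2 - p1)
    z = np.cross(p2 - p1, p3 - p1)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = x, y, z
    T[:3, 3] = (p1 + p2 + p3) / 3.0
    return T

def inv(T):
    """Closed-form inverse of a homogeneous transform."""
    R, p = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ p
    return Ti

# Hypothetical marker coordinates: design coordinates in {M}, and the same
# points as seen in the camera frame {C} (here simply shifted by +1 in z)
pts_M = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
pts_C = [p + np.array([0.0, 0.0, 1.0]) for p in pts_M]

C_T_O = frame_from_points(*pts_C)   # frame {O} expressed in {C}
M_T_O = frame_from_points(*pts_M)   # frame {O} expressed in {M}
C_T_M = C_T_O @ inv(M_T_O)          # closes the loop {C} -> {O} -> {M}
```

Because both instances of {O} are built by the same rule from the same physical points, the unknown transform between {C} and {M} drops out of the construction and is recovered by the final product.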
Generally, compared with the prior art, the above technical solution conceived by the present invention mainly has the following technical advantages:
1) One-time preparation, repeated use: in this vision-based online calibration method for the relative pose of cooperating robots, the camera C is fixedly connected with robot A and the target M is fixedly connected with robot B, so the pose calibrations of camera C with robot A and of target M with robot B are completed once in the offline measurement stage and need not be repeated in subsequent use;

2) Simple operation and very high efficiency: for a multi-robot mobile operation/machining system, the relative pose of any two robots may change frequently, so calibration is frequent; the online calibration process of this invention requires no manual intervention and no self-motion of the robots, is easy to program, is highly automated, and is fast;

3) High flexibility and strong applicability: the scheme can be applied in many industries, for example hole-making, assembly and milling of thin-walled parts such as aircraft skins, dual-robot grinding and polishing of large wind-turbine blades, and traditional dual-robot coordinated welding, spraying and handling; there are no strict installation requirements among the calibration equipment, and the setup can be arranged flexibly for the specific application.
Drawings
FIG. 1 is a flow chart of a vision-based on-line calibration method for relative poses of two robots according to the present invention;
FIG. 2 is a schematic diagram of the coordinate system distribution of the two-robot calibration system of the present invention;
FIG. 3 is a schematic diagram of three points defining a coordinate system;
FIG. 4 is a flowchart of an algorithm for solving the pose relationship between the camera coordinate system and the target coordinate system.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, the vision-based method for calibrating the relative pose of two robots provided in this embodiment constructs two intermediate coordinate systems, a camera coordinate system {C} and a target coordinate system {M}, through a camera C (whose intrinsic parameters are known) and a target M (whose design parameters are known, carrying at least 3 non-collinear marker points whose relative positions are known). The camera C is fixedly connected with robot A, whose base coordinate system is {A}; the target M is fixedly connected with robot B, whose base coordinate system is {B}. The base coordinate system {A}, camera coordinate system {C}, target coordinate system {M} and base coordinate system {B} form a closed loop of coordinate systems, and the method solves the relative pose relationship ^A T_B of the two robots by combining offline with online measurement. The base coordinate systems {A} and {B}, the camera coordinate system {C} and the target coordinate system {M} may be established by any existing coordinate-system construction method, which is prior art and is not described again here.
The invention relates to a vision-based online calibration method for the relative pose of two robots, which specifically comprises the following steps:
S1, offline measurement: respectively determine the relative pose relationship ^A T_C between the base coordinate system {A} of robot A and the camera coordinate system {C}, and the relative pose relationship ^B T_M between the base coordinate system {B} of robot B and the target coordinate system {M};

S2, online measurement: obtain the relative pose relationship ^C T_M between the camera coordinate system {C} and the target coordinate system {M} by vision measurement; from the pose matrices ^A T_C, ^B T_M and ^C T_M, construct the calibration equation ^A T_B = ^A T_C · ^C T_M · (^B T_M)⁻¹, from which the relative pose ^A T_B between the base coordinate systems of the two robots is calculated, completing the calibration of the two base coordinate systems.
For step S1, it includes the following sub-steps:
S11, mount the camera C on the bracket fixedly connected to robot A, and mount the target on the end flange of robot A; this target is denoted M1, with coordinate system {M1}. From the target's design parameters, the relative pose ^6 T_M1 between the coordinate system {M1} and the end-flange coordinate system {6} of robot A can be calculated;

S12, adjust the pose of robot A so that target M1 appears completely within the field of view of camera C, and record the joint angles of robot A, namely q_A = [θ1, θ2, θ3, θ4, θ5, θ6]; then calculate the pose matrix ^A T_M1 between robot A and target M1.

Specifically, the forward kinematics of the robot gives:

^0 T_6(q_A) = ^0 T_1(θ1) · ^1 T_2(θ2) · ^2 T_3(θ3) · ^3 T_4(θ4) · ^4 T_5(θ5) · ^5 T_6(θ6),

where ^(i−1) T_i is the coordinate transformation matrix of robot link coordinate system {i} relative to coordinate system {i−1}, i = 1, 2, 3, 4, 5, 6;

the pose relationship between robot A and target M1 is then calculated as:

^A T_M1 = ^0 T_6(q_A) · ^6 T_M1;
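The forward-kinematics product ^0 T_6(q) can be sketched with standard Denavit-Hartenberg link transforms. The DH parameterization and the test values below are assumptions for illustration, since the patent does not fix a specific robot model.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform (i-1)_T_i:
    rotate theta about z, translate d along z, translate a along x, rotate alpha about x."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def fk_0T6(q, dh_table):
    """Compose ^0 T_6(q) = ^0 T_1(th1) ... ^5 T_6(th6) for a serial arm
    (works for any joint count; the 6R case is the one used in the patent).
    dh_table rows are (d, a, alpha); joint angle theta_i comes from q."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(q, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T
```

For a quick sanity check, a two-link planar chain with unit link lengths places the end effector at (cos θ1 + cos(θ1+θ2), sin θ1 + sin(θ1+θ2)).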
S13, acquire an image with camera C, solve the pose matrix ^C T_M1 between camera C and M1 from the image, and then obtain ^A T_C = ^A T_M1 · (^C T_M1)⁻¹.

Specifically, an image of target M1 is acquired by camera C. Let P1, P2 and P3 be three non-collinear marker points on target M1; their coordinates in the camera coordinate system {C} are calculated by a solution of the Perspective-n-Point (PnP) problem (other existing solution methods may of course also be used, and are not described here). The pose ^C T_O between the coordinate system {O} determined by P1, P2 and P3 and the coordinate system {C} is then the homogeneous matrix whose rotation columns are X_O, Y_O, Z_O and whose translation is the origin of {O}, where the origin of {O} is the centroid of triangle P1P2P3 and:

X_O = (P2 − P1) / |P2 − P1|,
Z_O = (P2 − P1) × (P3 − P1) / |(P2 − P1) × (P3 − P1)|,
Y_O = Z_O × X_O,

with X_O, Y_O, Z_O the directions of the X, Y and Z axes of {O}, (P2 − P1) the vector from point P1 to P2, (P3 − P1) the vector from point P1 to P3, and (P2 − P1) × (P3 − P1) their cross product, each normalized by its modulus.

Meanwhile, the coordinates of P1, P2 and P3 in the target coordinate system {M1} are known from the design parameters of the target, so the pose ^M1 T_O between the coordinate system {O} determined by P1, P2 and P3 and the coordinate system {M1} is obtained in the same way. The axis directions and origin of {O} are computed here by the same construction as above, but from coordinates expressed in a different frame: once in the camera coordinate system {C} and once in the target coordinate system {M1}.

The pose relationship between the coordinate systems {C} and {M1} is then solved as:

^C T_M1 = ^C T_O · (^M1 T_O)⁻¹,

and the relative pose relationship between the coordinate systems {A} and {C} as:

^A T_C = ^A T_M1 · (^C T_M1)⁻¹;

S14, repeat steps S12–S13 n times and solve the expected value E(^A T_C), where n preferably satisfies 1 ≤ n ≤ 10. Each execution of steps S12–S13 yields one relative pose between the {A} and {C} coordinate systems; the relative pose obtained the i-th time is denoted (^A T_C)_i, 1 ≤ i ≤ n.
Further, solving the expected value E(^A T_C) specifically comprises:

1) decompose the pose matrix (^A T_C)_i into its rotation component (^A R_C)_i and translation component (^A P_C)_i, i.e. write

(^A T_C)_i = [ (^A R_C)_i, (^A P_C)_i ; 0, 1 ];

2) represent the rotation component (^A R_C)_i by RPY angles (α, β, γ)_i, i.e. (^A R_C)_i = R_Z(α) · R_Y(β) · R_X(γ), meaning that after the coordinate system {C} is made coincident with {A}, {C} is rotated about the X axis of {A} by γ, then about the Y axis of {A} by β, and finally about the Z axis of {A} by α. Writing r_jk for the entries of (^A R_C)_i, the angles are extracted as:

β = Atan2(−r31, √(r11² + r21²)),
α = Atan2(r21 / cβ, r11 / cβ),
γ = Atan2(r32 / cβ, r33 / cβ),

where c denotes cos and Atan2 denotes the two-argument arctangent function;

then sum and average (α, β, γ)_i over i to obtain the expected value E(α, β, γ), where E(·) is the expectation operator;

convert the expected value E(α, β, γ) back into rotation-matrix form to obtain the expected rotation component:

E(^A R_C) = R_Z(E(α)) · R_Y(E(β)) · R_X(E(γ)),

where c denotes cos and s denotes sin in the expanded matrix;

3) sum and average the translation components (^A P_C)_i to obtain the expected value E(^A P_C);

4) express the expected rotation component and the expected translation component together as a homogeneous pose matrix to obtain the expected value E(^A T_C):

E(^A T_C) = [ E(^A R_C), E(^A P_C) ; 0, 1 ].
S15, detach the target from the end flange of robot A and mount it on the end flange of robot B; this target is denoted M2, with associated coordinate system {M2}. From the target's design parameters, the relative pose ^6 T_M2 between the coordinate system {M2} and the end-flange coordinate system {6} of robot B can be calculated;

S16, adjust robots A and B respectively so that target M2 appears completely within the field of view of camera C, and record the joint angles of robot B, q_B = [θ1, θ2, θ3, θ4, θ5, θ6]; then calculate the pose relationship ^B T_M2 between robot B and target M2.

Specifically, the forward kinematics of the robot gives:

^0 T_6(q_B) = ^0 T_1(θ1) · ^1 T_2(θ2) · ^2 T_3(θ3) · ^3 T_4(θ4) · ^4 T_5(θ5) · ^5 T_6(θ6),

where ^(i−1) T_i is the coordinate transformation matrix of robot B link coordinate system {i} relative to coordinate system {i−1}, i = 1, 2, 3, 4, 5, 6;

the pose relationship between robot B and target M2 is then:

^B T_M2 = ^0 T_6(q_B) · ^6 T_M2;
S17, acquire an image with camera C, and solve the pose matrix ^C T_M2 between camera C and M2 from the image.

Specifically, an image of target M2 is acquired by camera C. Let P1, P2 and P3 be three non-collinear marker points on target M2; their coordinates in the camera coordinate system {C} are calculated by a PnP solution, and the pose ^C T_O between the coordinate system {O} determined by P1, P2 and P3 and {C} is constructed as before: the axes of {O} are X_O = (P2 − P1)/|P2 − P1|, Z_O = (P2 − P1) × (P3 − P1)/|(P2 − P1) × (P3 − P1)| and Y_O = Z_O × X_O, and the origin of {O} is the centroid of triangle P1P2P3.

Meanwhile, the coordinates of P1, P2 and P3 in the target coordinate system {M2} are known from the target's design parameters, so the pose ^M2 T_O between {O} and {M2} is obtained by the same construction. The two instances of {O} use identical construction rules but coordinates expressed in different frames, one in the camera coordinate system {C} and one in the target coordinate system {M2}.

The pose relationship between the coordinate systems {C} and {M2} is then:

^C T_M2 = ^C T_O · (^M2 T_O)⁻¹;
S18, without changing the relative position of robots A and B, detach the target from the end flange of robot B and mount it on a bracket fixedly connected with robot B; the target is now denoted M, with associated coordinate system {M};

S19, acquire an image with camera C, solve the pose matrix ^C T_M between camera C and M from the image, and then obtain ^B T_M = ^B T_M2 · (^C T_M2)⁻¹ · ^C T_M.

Specifically, an image of target M is acquired by camera C. Let P1, P2 and P3 be three non-collinear marker points on target M; their coordinates in the camera coordinate system {C} are calculated by a PnP solution, and the pose ^C T_O between the coordinate system {O} determined by P1, P2 and P3 and {C} is constructed as before, with the origin of {O} at the centroid of triangle P1P2P3.

Meanwhile, the coordinates of P1, P2 and P3 in the target coordinate system {M} are known from the target's design parameters, giving the pose ^M T_O between {O} and {M} by the same construction, again with the point coordinates expressed in a different frame, here the target coordinate system {M}.

The pose relationship between the coordinate systems {C} and {M} is then solved as:

^C T_M = ^C T_O · (^M T_O)⁻¹,

and the pose relationship between the coordinate systems {B} and {M} as:

^B T_M = ^B T_M2 · (^C T_M2)⁻¹ · ^C T_M;

S110, repeat steps S16–S19 n times and solve the expected value E(^B T_M) in the same way as in step S14, where n preferably satisfies 1 ≤ n ≤ 10. Each execution of steps S16–S19 yields one relative pose between the coordinate systems {B} and {M}; the relative pose obtained the i-th time is denoted (^B T_M)_i, 1 ≤ i ≤ n.
Compared with other calibration methods, this calibration method is simple, convenient, efficient and time-saving: only conventional tools are needed, and once the preparation work is completed, the subsequent calibration process requires neither manual intervention nor self-motion of the robots. It can be applied in many industrial fields, and is particularly suitable for scenarios in which the robots' base coordinate systems change frequently and strict demands are placed on calibration speed. Only one offline measurement is needed to perform multiple high-precision online calibrations on demand, which suits occasions, such as a multi-robot mobile operation/machining system, where the relative pose of the cooperating robots' bases changes frequently on the industrial floor, cooperative operation requires high precision, and the base coordinate systems must be calibrated frequently and quickly.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (6)
1. A vision-based online calibration method for the relative pose of two robots, characterized in that the method constructs two intermediate coordinate systems, a camera coordinate system {C} and a target coordinate system {M}, through a camera C and a target M, wherein the camera C is fixedly connected with robot A, whose base coordinate system is {A}, and the target M is fixedly connected with robot B, whose base coordinate system is {B}; the method solves the relative pose relationship ^A T_B of the two robots by combining offline with online measurement, and comprises the following steps:

S1, offline measurement: respectively determine the relative pose relationship ^A T_C between the base coordinate system {A} of robot A and the camera coordinate system {C}, and the relative pose relationship ^B T_M between the base coordinate system {B} of robot B and the target coordinate system {M};

S2, online measurement: obtain the relative pose relationship ^C T_M between the camera coordinate system {C} and the target coordinate system {M} by vision measurement; from the pose matrices ^A T_C, ^B T_M and ^C T_M, construct the calibration equation ^A T_B = ^A T_C · ^C T_M · (^B T_M)⁻¹, from which the relative pose ^A T_B between the base coordinate systems of the two robots is calculated, completing the calibration of the two base coordinate systems.
2. The vision-based online calibration method for the relative pose of two robots according to claim 1, characterized in that the pose relation ^A T_C between the base coordinate system { A } of robot A and the camera coordinate system { C } is determined in step S1 through the following substeps:
S11, mount the camera C on a bracket fixedly connected with robot A, and mount a target, denoted M1, on the end flange of robot A;
S12, adjust the pose of robot A so that the target M1 appears completely within the field of view of the camera C, record the joint angles of robot A, and compute from them the pose matrix ^A T_M1 of the target M1 relative to robot A;
S13, acquire an image of the target M1 with the camera C, solve from the image the pose matrix ^C T_M1 of the target M1 relative to the camera C, and then solve ^A T_C = ^A T_M1 · (^C T_M1)^(-1);
S14, repeat steps S12-S13 several times and compute the expected value of ^A T_C.
3. The vision-based online calibration method for the relative pose of two robots according to claim 1, characterized in that the pose relation ^B T_M between the base coordinate system { B } of robot B and the target coordinate system { M } is determined in step S1 through the following substeps:
S15, detach the target from the end flange of robot A and mount it on the end flange of robot B; the target is now denoted M2;
S16, adjust robots A and B respectively so that the target M2 appears completely within the field of view of the camera C, record the joint angles of robot B, and compute from them the pose matrix ^B T_M2 of the target M2 relative to robot B;
S17, acquire an image of the target M2 with the camera C, and solve from the image the pose matrix ^C T_M2 of the target M2 relative to the camera C;
S18, detach the target from the end flange of robot B and mount it on a bracket fixedly connected with robot B; the target is now denoted M (the poses of robots A and B must not change during this process);
S19, acquire an image of the target M with the camera C, solve from the image the pose matrix ^C T_M of the target M relative to the camera C, and then solve ^B T_M = ^B T_M2 · (^C T_M2)^(-1) · ^C T_M;
S110, repeat steps S16-S19 several times and compute the expected value of ^B T_M.
4. The vision-based online calibration method for the relative pose of two robots according to claim 2 or 3, characterized in that the expected value of ^A T_C or ^B T_M is solved as follows:
1) decompose the pose matrix ^A T_C into its rotation component ^A R_C and translation component ^A P_C, or ^B T_M into its rotation component ^B R_M and translation component ^B P_M;
2) convert each rotation component ^A R_C or ^B R_M into RPY angles, average the angles over all samples, and convert the mean back into a rotation matrix to obtain the expected rotation component;
3) average the translation components ^A P_C or ^B P_M to obtain the expected translation component;
4) assemble the expected rotation component and the expected translation component into a homogeneous transformation matrix to obtain the expected value of ^A T_C or ^B T_M.
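The averaging procedure of claim 4 can be sketched as follows, assuming a ZYX (roll-pitch-yaw) Euler convention, which the patent does not specify. Naive angle averaging is adequate here because repeated measurements of the same fixed pose differ only slightly, so no wrap-around occurs; all numeric samples below are illustrative:

```python
import math

def rot_to_rpy(r):
    # ZYX extraction from (the rotation block of) a transform matrix;
    # assumes pitch is away from +/-90 degrees.
    yaw = math.atan2(r[1][0], r[0][0])
    pitch = math.atan2(-r[2][0], math.hypot(r[0][0], r[1][0]))
    roll = math.atan2(r[2][1], r[2][2])
    return roll, pitch, yaw

def rpy_to_rot(roll, pitch, yaw):
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [[cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
            [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
            [-sp, cp * sr, cp * cr]]

def average_poses(samples):
    # samples: list of 4x4 homogeneous transforms (lists of lists).
    # Rotation is averaged in RPY space, translation component-wise.
    n = len(samples)
    rpys = [rot_to_rpy(t) for t in samples]
    mean_rpy = [sum(a[i] for a in rpys) / n for i in range(3)]
    mean_p = [sum(t[i][3] for t in samples) / n for i in range(3)]
    r = rpy_to_rot(*mean_rpy)
    return [r[0] + [mean_p[0]], r[1] + [mean_p[1]], r[2] + [mean_p[2]],
            [0, 0, 0, 1]]

def rot_z(a, p):
    # Helper: rotation about Z by angle a with translation p.
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0, p[0]], [s, c, 0, p[1]], [0, 0, 1, p[2]], [0, 0, 0, 1]]

# Two noisy measurements of the same pose (hypothetical):
mean_T = average_poses([rot_z(0.10, (1.0, 0.0, 0.0)),
                        rot_z(0.20, (1.2, 0.0, 0.0))])
```

For widely spread or near-singular orientations, a quaternion or rotation-group mean would be more robust than RPY averaging, but for tightly clustered repeat measurements the two agree closely.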
5. The vision-based online calibration method for the relative pose of two robots according to claim 1, characterized in that in step S2 the camera C acquires an image of the target M, and the pose matrix ^C T_M between the camera coordinate system { C } and the target coordinate system { M } is solved from the image.
6. The vision-based online calibration method for the relative pose of two robots according to any one of claims 1 to 5, characterized in that the corresponding pose matrix is solved from an image as follows:
1) select any 3 non-collinear marker points on the target image, denoted P_1, P_2 and P_3, solve their coordinates in the camera coordinate system by the vision measurement technique, and then solve the pose matrix ^C T_O between the coordinate system { O } formed by the 3 marker points and the camera coordinate system { C }, wherein the origin of { O } coincides with the centroid of triangle P_1 P_2 P_3 and:

$$\hat{x}_O=\frac{\overrightarrow{P_1P_2}}{\|\overrightarrow{P_1P_2}\|},\qquad \hat{z}_O=\frac{\overrightarrow{P_1P_2}\times\overrightarrow{P_1P_3}}{\|\overrightarrow{P_1P_2}\times\overrightarrow{P_1P_3}\|},\qquad \hat{y}_O=\hat{z}_O\times\hat{x}_O,$$

wherein $\hat{x}_O$, $\hat{y}_O$, $\hat{z}_O$ are the directions of the X, Y and Z axes of the coordinate system { O }, $\overrightarrow{P_1P_2}$ is the vector from point P_1 to point P_2, $\overrightarrow{P_1P_3}$ is the vector from point P_1 to point P_3, and $\|\cdot\|$ denotes the modulus (length) of a vector;
2) compute the pose matrix ^M T_O between the coordinate system { M } and the coordinate system { O }, and from ^C T_O and ^M T_O compute ^C T_M = ^C T_O · (^M T_O)^(-1).
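A sketch of the three-point frame construction of claim 6, assuming the convention that X lies along P_1→P_2, Z along (P_1→P_2)×(P_1→P_3), Y = Z×X, with the origin at the centroid of the three (non-collinear) points; the marker coordinates below are hypothetical:

```python
import math

def sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def frame_from_points(p1, p2, p3):
    # Build {O} from three non-collinear points given in camera frame {C}.
    x = unit(sub(p2, p1))                       # X along P1->P2
    z = unit(cross(sub(p2, p1), sub(p3, p1)))   # Z normal to the triangle
    y = cross(z, x)                             # Y completes the right-handed frame
    origin = [(p1[i] + p2[i] + p3[i]) / 3.0 for i in range(3)]
    # Columns of the rotation block are the axes of {O} expressed in {C}.
    return [[x[0], y[0], z[0], origin[0]],
            [x[1], y[1], z[1], origin[1]],
            [x[2], y[2], z[2], origin[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical marker coordinates measured in the camera frame:
C_T_O = frame_from_points([0, 0, 1], [1, 0, 1], [0, 1, 1])
```

With ^C T_O from the image and ^M T_O known from the target's geometry, the chain ^C T_M = ^C T_O · (^M T_O)^(-1) of step 2) follows by ordinary matrix multiplication and rigid-transform inversion.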
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810385326.9A CN108519055A (en) | 2018-04-26 | 2018-04-26 | A vision-based online calibration method for the relative pose of two robots
Publications (1)
Publication Number | Publication Date |
---|---|
CN108519055A true CN108519055A (en) | 2018-09-11 |
Family
ID=63429081
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810385326.9A Pending CN108519055A (en) | 2018-04-26 | 2018-04-26 | A vision-based online calibration method for the relative pose of two robots
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108519055A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08272414A (en) * | 1995-03-29 | 1996-10-18 | Fanuc Ltd | Calibrating method for robot and visual sensor using hand camera |
CN105451461A (en) * | 2015-11-25 | 2016-03-30 | 四川长虹电器股份有限公司 | PCB board positioning method based on SCARA robot |
CN205630685U (en) * | 2016-03-30 | 2016-10-12 | 广东工业大学 | A equipment for demarcating many robot system base coordinate system |
CN106272444A (en) * | 2016-08-31 | 2017-01-04 | 山东中清智能科技有限公司 | A kind of realize trick relation and method that dual robot relation is demarcated simultaneously |
Non-Patent Citations (7)
Title |
---|
JIAOLE WANG et al.: "Towards Simultaneous Coordinate Calibrations for Cooperative Multiple Robots", 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems *
Sun Dayong et al.: "Advanced Manufacturing Technology", 29 February 2000, China Machine Press *
Xu De et al.: "Robot Vision Measurement and Control (3rd ed.)", 31 January 2016, National Defense Industry Press *
Yang Xiaojun et al.: "Industrial Robot Technology", 31 August 2015, Harbin Institute of Technology Press *
Liu Hongyi et al.: "Fundamentals of Robotics", 30 November 2002, Metallurgical Industry Press *
Qin Lijuan et al.: "Computer Monocular Vision Localization", 30 April 2016, National Defense Industry Press *
Su Jianbo: "Base coordinate frame calibration of dual-robot systems", Control Theory & Applications *
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109579805A (en) * | 2018-11-26 | 2019-04-05 | 成都经纬时空科技有限公司 | A kind of baseline self calibration measurement method |
CN109822577A (en) * | 2019-03-29 | 2019-05-31 | 北京卫星制造厂有限公司 | A kind of mobile robot's high-precision processing method of view-based access control model servo |
CN109822577B (en) * | 2019-03-29 | 2021-02-05 | 北京卫星制造厂有限公司 | Mobile robot high-precision machining method based on visual servo |
CN109877840A (en) * | 2019-04-02 | 2019-06-14 | 哈尔滨工程大学 | A kind of double mechanical arms scaling method based on camera optical axis constraint |
CN109877840B (en) * | 2019-04-02 | 2021-09-28 | 哈尔滨工程大学 | Double-mechanical-arm calibration method based on camera optical axis constraint |
CN110017852A (en) * | 2019-04-25 | 2019-07-16 | 广东省智能机器人研究院 | A kind of navigation positioning error measurement method |
WO2021073841A1 (en) * | 2019-10-16 | 2021-04-22 | Siemens Gamesa Renewable Energy A/S | Assembly of a multi-segment stator |
EP3809567A1 (en) * | 2019-10-16 | 2021-04-21 | Siemens Gamesa Renewable Energy A/S | Assembly of a multi-segment stator |
CN111768383A (en) * | 2020-06-29 | 2020-10-13 | 易思维(杭州)科技有限公司 | Three-dimensional target and method for recovering working function of visual sensor by using same |
CN112489132A (en) * | 2020-11-13 | 2021-03-12 | 复旦大学 | Large-size object measuring robot calibration system and method |
CN112489132B (en) * | 2020-11-13 | 2023-05-05 | 复旦大学 | Calibration system and method for large-size object measurement robot |
CN113787541A (en) * | 2021-11-17 | 2021-12-14 | 杭州灵西机器人智能科技有限公司 | Robot position correction method and robot positioning system |
CN114319142A (en) * | 2022-03-08 | 2022-04-12 | 中国铁路设计集团有限公司 | High-speed magnetic levitation track beam precision positioning method based on free target |
CN114750160A (en) * | 2022-05-16 | 2022-07-15 | 深圳市大族机器人有限公司 | Robot control method, robot control device, computer equipment and storage medium |
CN115139283A (en) * | 2022-07-18 | 2022-10-04 | 中船重工鹏力(南京)智能装备系统有限公司 | Robot hand-eye calibration method based on random mark dot matrix |
CN115139283B (en) * | 2022-07-18 | 2023-10-24 | 中船重工鹏力(南京)智能装备系统有限公司 | Robot hand-eye calibration method based on random mark dot matrix |
CN116294987A (en) * | 2022-11-25 | 2023-06-23 | 无锡中车时代智能装备研究院有限公司 | Coordinate conversion method and system in automatic measurement polishing system with double robots |
CN116294987B (en) * | 2022-11-25 | 2023-12-08 | 无锡中车时代智能装备研究院有限公司 | Coordinate conversion method and system in automatic measurement polishing system with double robots |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108519055A (en) | A vision-based online calibration method for the relative pose of two robots | |
CN109822577B (en) | Mobile robot high-precision machining method based on visual servo | |
CN107972071B (en) | An industrial robot link parameter calibration method based on distal point plane constraint | |
CN108748159B (en) | Self-calibration method for tool coordinate system of mechanical arm | |
CN107717993B (en) | Efficient and convenient simple robot calibration method | |
CN108692688B (en) | Automatic calibration method for coordinate system of scanner of robot measuring-processing system | |
Jiang et al. | A measurement method for robot peg-in-hole prealignment based on combined two-level visual sensors | |
CN109454281B (en) | Method for calibrating propeller workpiece coordinate system in robot milling | |
CN110757504B (en) | Positioning error compensation method of high-precision movable robot | |
CN109822574A (en) | A kind of method of industrial robot end six-dimension force sensor calibration | |
CN109163675B (en) | Method for detecting angular pendulum shaft position accuracy based on laser tracker | |
CN111702762A (en) | Industrial robot operation attitude optimization method | |
CN109623822B (en) | Robot hand-eye calibration method | |
CN112907682B (en) | Hand-eye calibration method and device for five-axis motion platform and related equipment | |
CN114102256B (en) | Machine tool rotating shaft geometric error identification method and device and storage medium | |
Geng et al. | A novel welding path planning method based on point cloud for robotic welding of impeller blades | |
CN114161425B (en) | Error compensation method for industrial robot | |
CN111426270A (en) | Industrial robot pose measurement target device and joint position sensitive error calibration method | |
Chen et al. | A vision-based calibration method for aero-engine blade-robotic grinding system | |
CN115179323A (en) | Machine end pose measuring device based on telecentric vision constraint and precision improving method | |
Zhang et al. | A novel accurate positioning method of reference hole for complex surface in aircraft assembly | |
Fan et al. | Binocular vision and priori data based intelligent pose measurement method of large aerospace cylindrical components | |
CN109773589A (en) | Method and device, the equipment of on-line measurement and processing guiding are carried out to workpiece surface | |
Kong et al. | Online measurement method for assembly pose of gear structure based on monocular vision | |
CN115781716A (en) | Industrial robot visual servo track compensation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180911 |