CN114833822A - Rapid hand-eye calibration method for robot - Google Patents
- Publication number
- CN114833822A (application CN202210346172.9A)
- Authority
- CN
- China
- Prior art keywords
- robot
- coordinate system
- terminal
- moves
- tail end
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 81
- 238000013519 translation Methods 0.000 claims abstract description 72
- 239000011159 matrix material Substances 0.000 claims description 88
- 230000009466 transformation Effects 0.000 claims description 32
- 239000013598 vector Substances 0.000 claims description 26
- 238000004422 calculation algorithm Methods 0.000 claims description 12
- 238000012545 processing Methods 0.000 claims description 12
- 238000004364 calculation method Methods 0.000 claims description 6
- 238000003384 imaging method Methods 0.000 claims description 6
- 230000014616 translation Effects 0.000 description 44
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000012067 mathematical method Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1653—Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The invention provides a rapid hand-eye calibration method for a robot, aimed at the technical problem that traditional hand-eye calibration requires both the calibration-plate images acquired by the vision system and the corresponding robot pose information as input, making the procedure complex. The calibration method comprises the following steps: step 1, from the three-dimensional coordinates of a space point in the robot end coordinate system before the end moves, calculate its three-dimensional coordinates in the end coordinate system after the move, and likewise calculate the point's three-dimensional coordinates in the camera coordinate system and in the calibration-plate coordinate system before and after the end moves; step 2, calibrate the hand-eye rotation relationship; step 3, calibrate the hand-eye translation relationship; step 4, the hand-eye calibration of the robot is complete.
Description
Technical Field
The invention relates to the technical field of robot calibration, in particular to a rapid hand-eye calibration method for a robot.
Background
At present, most robot hand-eye calibration procedures compute the transformation between the robot end and the camera from the transformation between the robot base and the robot end together with the transformation between the camera and the calibration plate at different photographing positions, and thereby obtain the hand-eye calibration result.
Conventional calibration methods fall into two main categories: linear and nonlinear. The linear methods comprise the two-step method, the common calibration method, the mathematical method and the motion-limiting method; the nonlinear category is the nonlinear motion method.
All of these methods require the robot end poses as input, matched one-to-one with the images acquired by the camera, so the operating procedure is complex.
Chinese patent CN113814987A, entitled "Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium", discloses computing the eye-on-hand calibration between an end camera and the robot end, and the calibration between an environment camera and the robot base, from the initial angle data of the environment camera, the end camera and a calibration standard in the robot base coordinate system. That method, however, still uses the pose relationship between the robot end and the robot base when computing the eye-on-hand relationship, and this relationship must also be applied in the subsequent calibration.
Disclosure of Invention
The invention aims to solve the technical problem that the traditional hand-eye calibration method needs the calibration-plate images acquired by a vision system together with the corresponding robot pose information as input, making the process complex, and provides a rapid hand-eye calibration method for a robot that is suitable for calibrating both the eye-on-hand relationship and the eye-outside-hand relationship.
The design idea of the invention is as follows: a given space point has the same three-dimensional coordinates in the world coordinate system before and after the robot moves, and the pose transformation relationships between the coordinate systems link these coordinates. Consequently the transformation between the robot base and the robot end is not required; only the transformation between the camera and the calibration plate is needed, and the pose between the robot and the vision system is computed from images alone, so the hand-eye calibration result is obtained quickly.
In order to achieve the purpose, the invention adopts the technical scheme that:
A rapid hand-eye calibration method for a robot is characterized in that an eye-on-hand calibration scheme is adopted, and the method comprises the following steps:
step 1), from the three-dimensional coordinates of a space point in the robot end coordinate system before the end moves, calculate its three-dimensional coordinates in the end coordinate system after the end moves; likewise calculate the point's three-dimensional coordinates in the camera coordinate system and in the calibration-plate coordinate system before and after the end moves;
the calibration plate is fixed within the robot's range of motion and the camera's field of view, ensuring that it remains in the field of view and images clearly;
step 2), calibrate the hand-eye rotation relationship:
2.1) keep the robot stationary and acquire an image of the first pose with the camera;
2.2) operate the robot to translate several times along the three coordinate axes of the end coordinate system, acquiring images at the different translation poses;
2.3) from the images at the translation poses, compute the hand-eye rotation relationship by least squares;
step 3), calibrate the hand-eye translation relationship:
3.1) operate the robot back to the initial position;
3.2) operate the robot to rotate several times about the origin of the end coordinate system, acquiring images at the different rotation poses;
3.3) from the images at the rotation poses, compute the hand-eye translation relationship by least squares;
step 4), the hand-eye calibration of the robot is complete.
Further, step 1 specifically comprises:
1.1) calculate the three-dimensional coordinates P_end2 of the space point in the end coordinate system after the robot end moves:

P_end2 = R_e · P_end1 + T_e    (1)

wherein P_end1 is the three-dimensional coordinates of the space point in the end coordinate system before the robot end moves; R_e is the pose rotation matrix from the end coordinate system before the robot end moves to the end coordinate system after the move; T_e is the corresponding pose translation matrix;
1.2) calculate the three-dimensional coordinates P_camera1 of the space point in the camera coordinate system before the robot end moves:

P_camera1 = R_x · P_end1 + T_x    (2)

wherein R_x and T_x are the pose rotation and translation matrices from the end coordinate system to the camera coordinate system, i.e. the hand-eye calibration matrices;
1.3) calculate the three-dimensional coordinates P_camera2 of the space point in the camera coordinate system after the robot end moves:

P_camera2 = R_x · P_end2 + T_x    (3)

1.4) calculate the three-dimensional coordinates P_target1 of the space point in the calibration-plate coordinate system before the robot end moves:

P_target1 = R_c1 · P_camera1 + T_c1    (4)

wherein R_c1 is the pose rotation matrix from the camera coordinate system to the calibration-plate coordinate system before the robot end moves, and T_c1 is the corresponding pose translation matrix;
1.5) calculate the three-dimensional coordinates P_target2 of the space point in the calibration-plate coordinate system after the robot end moves:

P_target2 = R_c2 · P_camera2 + T_c2    (5)

wherein R_c2 and T_c2 are the pose rotation and translation matrices from the camera coordinate system to the calibration-plate coordinate system after the robot end moves.
Further, the coordinates of the space point in the calibration-plate coordinate system are unchanged before and after the robot end moves, P_target1 = P_target2; from formula 4 and formula 5:

R_c1 · P_camera1 + T_c1 = R_c2 · P_camera2 + T_c2    (6)

Taking the space point at the origin of the end coordinate system before the robot end moves, P_end1 = (0, 0, 0)^T, so that P_end2 = T_e, the coordinate relationship is obtained from equation 6:

R_c1 · T_x + T_c1 = R_c2 · (R_x · T_e + T_x) + T_c2    (7)

The hand-eye calibration is carried out with this coordinate relationship.
Further, step 2.3) is specifically: from the images at the translation poses, obtain the rotation and translation transformation matrices between the camera coordinate system and the calibration-plate coordinate system by image processing and the PNP algorithm;
since the robot end undergoes no relative rotation before and after a pure translation, R_c1 = R_c2, and formula 7 reduces to R_c2 · R_x · T_e = T_c1 − T_c2, from which the hand-eye rotation relationship R_x is obtained;
the robot end performs several translations along the x, y and z axes of the end coordinate system; T_e and R_c2^T · (T_c1 − T_c2) are regarded as two vectors and both are normalized to unit vectors for the calculation, so the translation distance need not be known.
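The camera-to-plate transforms required in steps 2.3) and 3.3) come from image processing and the PNP algorithm. As an illustration only (the patent names no library; the chessboard pattern, square size and helper names below are assumptions), a typical implementation uses OpenCV's corner detector and solvePnP; note that solvePnP returns the plate-to-camera transform, whose inverse (R^T, −R^T·t) is the camera-to-plate transform used in the derivation above:

```python
import numpy as np

def board_object_points(pattern=(9, 6), square=0.02):
    """3-D coordinates of the inner chessboard corners in the
    calibration-plate frame (corners lie in the z = 0 plane)."""
    cols, rows = pattern
    objp = np.zeros((cols * rows, 3), np.float64)
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square
    return objp

def plate_to_camera_pose(gray, K, dist, pattern=(9, 6), square=0.02):
    """Estimate the plate->camera pose (R, t) for one image, so that
    P_camera = R @ P_plate + t. OpenCV is imported here so the pure
    geometry helper above works without it installed."""
    import cv2
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        raise RuntimeError("calibration plate not detected")
    ok, rvec, tvec = cv2.solvePnP(board_object_points(pattern, square),
                                  corners, K, dist)
    if not ok:
        raise RuntimeError("PnP solution failed")
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec.reshape(3)
```

The camera intrinsics K and distortion coefficients dist would come from a prior camera calibration; one (R_ci, T_ci) pair is extracted per acquired image.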
Further, step 3.3) is specifically: from the images at the rotation poses, obtain the rotation and translation transformation matrices between the camera coordinate system and the calibration-plate coordinate system by image processing and the PNP algorithm;
since the robot end origin does not translate before and after a pure rotation about it, T_e = (0, 0, 0)^T, and formula 7 reduces to (R_c2 − R_c1) · T_x = T_c1 − T_c2, from which the hand-eye translation relationship T_x is obtained.
A rapid hand-eye calibration method for a robot is characterized in that an eye-outside-hand calibration scheme is adopted, and the method comprises the following steps:
step 1), from the three-dimensional coordinates of a space point in the robot end coordinate system before the end moves, calculate its three-dimensional coordinates in the end coordinate system after the end moves; likewise calculate the point's three-dimensional coordinates in the camera coordinate system and in the calibration-plate coordinate system before and after the end moves;
the calibration plate is mounted so as to move with the robot end and remain within the field of view of the stationary camera, ensuring clear imaging;
step 2), calibrate the hand-eye rotation relationship:
2.1) keep the robot stationary and acquire an image of the first pose with the camera;
2.2) operate the robot to translate several times along the three coordinate axes of the end coordinate system, acquiring images at the different translation poses;
2.3) from the images at the translation poses, compute the hand-eye rotation relationship by least squares;
step 3), calibrate the hand-eye translation relationship:
3.1) operate the robot back to the initial position;
3.2) operate the robot to rotate several times about the origin of the end coordinate system, acquiring images at the different rotation poses;
3.3) from the images at the rotation poses, compute the hand-eye translation relationship by least squares;
step 4), the hand-eye calibration of the robot is complete.
Further, step 1 specifically comprises:
1.1) calculate the three-dimensional coordinates P_end2 of the space point in the end coordinate system after the robot end moves:

P_end2 = R_e · P_end1 + T_e    (8)

wherein P_end1 is the three-dimensional coordinates of the space point in the end coordinate system before the robot end moves; R_e is the pose rotation matrix from the end coordinate system before the robot end moves to the end coordinate system after the move; T_e is the corresponding pose translation matrix;
1.2) calculate the three-dimensional coordinates P_target1 of the space point in the calibration-plate coordinate system before the robot end moves:

P_target1 = R_y · P_end1 + T_y    (9)

wherein R_y and T_y are the pose rotation and translation matrices between the end coordinate system and the calibration-plate coordinate system, i.e. the hand-eye calibration matrices;
1.3) calculate the three-dimensional coordinates P_target2 of the space point in the calibration-plate coordinate system after the robot end moves:

P_target2 = R_y · P_end2 + T_y    (10)

1.4) calculate the three-dimensional coordinates P_camera1 of the space point in the camera coordinate system before the robot end moves:

P_camera1 = R_t1 · P_target1 + T_t1    (11)

wherein R_t1 is the pose rotation matrix from the calibration-plate coordinate system to the camera coordinate system before the robot end moves, and T_t1 is the corresponding pose translation matrix;
1.5) calculate the three-dimensional coordinates P_camera2 of the space point in the camera coordinate system after the robot end moves:

P_camera2 = R_t2 · P_target2 + T_t2    (12)

wherein R_t2 and T_t2 are the pose rotation and translation matrices from the calibration-plate coordinate system to the camera coordinate system after the robot end moves.
Further, when calibrating the eye-outside-hand relationship, the coordinates of the space point in the camera coordinate system are unchanged before and after the robot end moves, i.e. P_camera1 = P_camera2; from formula 11 and formula 12:

R_t1 · P_target1 + T_t1 = R_t2 · P_target2 + T_t2    (13)

and substituting formulas 9 and 10:

R_t1 · (R_y · P_end1 + T_y) + T_t1 = R_t2 · (R_y · P_end2 + T_y) + T_t2    (14)

Taking the space point at the origin of the end coordinate system before the robot end moves, P_end1 = (0, 0, 0)^T, so that P_end2 = T_e, the coordinate relationship is obtained from equation 15:

R_t1 · T_y + T_t1 = R_t2 · (R_y · T_e + T_y) + T_t2    (15)

namely:

R_t2 · R_y · T_e = (R_t1 − R_t2) · T_y + (T_t1 − T_t2)    (16)

The hand-eye calibration is carried out with this coordinate relationship.
Further, step 2.3) is specifically: from the images at the translation poses, obtain the rotation and translation transformation matrices between the calibration-plate coordinate system and the camera coordinate system by image processing and the PNP algorithm;
since the robot end undergoes no relative rotation before and after a pure translation, R_t1 = R_t2, and formula 16 reduces to R_t2 · R_y · T_e = T_t1 − T_t2, from which the hand-eye rotation relationship R_y is obtained;
the robot end performs several translations along the x, y and z axes of the end coordinate system; T_e and R_t2^T · (T_t1 − T_t2) are regarded as two vectors and both are normalized to unit vectors for the calculation, so the translation distance need not be known.
Further, step 3.3) is specifically: from the images at the rotation poses, obtain the rotation and translation transformation matrices between the calibration-plate coordinate system and the camera coordinate system by image processing and the PNP algorithm;
since the robot end origin does not translate before and after a pure rotation about it, T_e = (0, 0, 0)^T, and formula 16 reduces to (R_t2 − R_t1) · T_y = T_t1 − T_t2, from which the hand-eye translation relationship T_y is obtained.
Compared with the prior art, the invention has the following beneficial technical effects:
the rapid hand-eye calibration method provided by the invention exploits the fact that a space point has the same three-dimensional coordinates in the world coordinate system before and after the robot moves, together with the pose transformation relationships between the coordinate systems; the transformation between the robot base and the robot end is not required, and calibration of both the eye-on-hand relationship and the eye-outside-hand relationship can be completed rapidly simply by controlling the robot to move and acquiring calibration-plate images with the camera.
Drawings
FIG. 1 is a flow chart of the steps of the rapid hand-eye calibration method of the robot of the present invention;
fig. 2 is a schematic diagram of a calibration board used for calibration in the embodiment of the present invention.
Detailed Description
In order to make the objects, advantages and features of the present invention more clear, a rapid hand-eye calibration method for a robot according to the present invention is further described in detail with reference to the accompanying drawings and specific embodiments. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention and are not intended to limit the scope of the present invention.
As shown in fig. 1, the rapid hand-eye calibration method for a robot provided by the invention uses only the calibration-plate images acquired by the camera (the calibration plate is shown in fig. 2) to solve the pose between the robot and the vision system and complete the hand-eye calibration. The specific steps are as follows:
Example 1
During the calibration of the eye-on-hand relationship, the calibration-plate coordinate system is fixed relative to the world coordinate system, so the three-dimensional coordinates P_target of any space point P in the calibration-plate coordinate system remain unchanged.
Suppose the three-dimensional coordinates of point P in the end coordinate system before and after the robot end moves are P_end1 and P_end2, and its three-dimensional coordinates in the camera coordinate system are P_camera1 and P_camera2. Then:

P_end2 = R_e · P_end1 + T_e    (1)

wherein R_e is the pose rotation matrix from the end coordinate system before the robot end moves to the end coordinate system after the move, and T_e is the corresponding pose translation matrix.

P_camera1 = R_x · P_end1 + T_x    (2)

wherein R_x is the pose rotation matrix from the end coordinate system to the camera coordinate system, and T_x is the corresponding pose translation matrix, i.e. the hand-eye calibration matrix. In the same way:

P_camera2 = R_x · P_end2 + T_x    (3)

P_target1 = R_c1 · P_camera1 + T_c1    (4)

wherein R_c1 and T_c1 are the pose rotation and translation matrices from the camera coordinate system to the calibration-plate coordinate system before the robot end moves. In the same way:

P_target2 = R_c2 · P_camera2 + T_c2    (5)

wherein R_c2 and T_c2 are the pose rotation and translation matrices from the camera coordinate system to the calibration-plate coordinate system after the robot end moves. Since P_target1 = P_target2, from (4) and (5):

R_c1 · P_camera1 + T_c1 = R_c2 · P_camera2 + T_c2    (6)

For convenience, take P_end1 = (0, 0, 0)^T, so that P_end2 = T_e; substituting (2) and (3) into (6):

R_c1 · T_x + T_c1 = R_c2 · (R_x · T_e + T_x) + T_c2    (7)

namely:

R_c2 · R_x · T_e = (R_c1 − R_c2) · T_x + (T_c1 − T_c2)    (8)

The invention moves the robot in a specific sequence and uses formula (8) to solve the hand-eye calibration matrix; the specific steps comprise calibration of the hand-eye rotation relationship and calibration of the hand-eye translation relationship.
Step 1: calibration of the hand-eye rotation relationship
1.1) place the calibration plate within the robot's later working range;
1.2) ensure that the calibration plate is within the camera's field of view and images clearly, and acquire a first image;
1.3) operate the robot to translate several times along the x axis of the end coordinate system, acquiring an image after each translation (at least one translation, for example two);
1.4) likewise translate along the y axis (at least one translation, for example two);
1.5) likewise translate along the z axis (at least one translation, for example two);
1.6) through image processing and the PNP algorithm, obtain the rotation transformation matrices R_ci and the translation transformation matrices T_ci between the camera coordinate system and the calibration-plate coordinate system for the 7 pictures, i = 0, 1, 2, …, 6;
1.7) the 7 pictures yield 6 corresponding equations. Since the end does not rotate during a pure translation, R_ci = R_c0, and formula (8) reduces, for movement i, to formula (9):

R_x · T_e,i = R_ci^T · (T_c0 − T_ci)    (9)

and the six instances of (9) combine into formula (10):

R_x · [T_e,1 … T_e,6] = [R_c1^T·(T_c0 − T_c1) … R_c6^T·(T_c0 − T_c6)]    (10)

In the case where the movement distance along each axis of the end coordinate system is unknown, the calculation can be performed with unit vectors: each column on both sides is normalized before solving.
1.8) equation (10) has the form A·x = b, where A = R_x is a 3 × 3 matrix, x is a 3 × 6 matrix, and b is a 3 × 6 matrix. Then:

A = (b · x^T) · (x · x^T)^(−1)    (11)
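As a sanity check of this least-squares rotation step, the following NumPy sketch simulates the eye-on-hand geometry with an invented ground-truth hand-eye transform (all numeric values are assumptions for the simulation, not taken from the patent) and recovers R_x from six pure translations via A = (b·x^T)·(x·x^T)^(−1):

```python
import numpy as np

def rot(axis, deg):
    """Rotation matrix about a principal axis, angle in degrees."""
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    if axis == "x":
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == "y":
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# invented ground truth: end->camera hand-eye transform and initial end pose
R_x = rot("z", 30) @ rot("y", -20)
T_x = np.array([0.05, -0.03, 0.10])
A0, a0 = rot("x", 15) @ rot("z", 40), np.array([0.4, 0.2, 0.5])

def cam_to_plate(A, a):
    """Camera->plate (R_c, T_c) implied by the end pose (A, a);
    the plate frame is taken as the world frame."""
    R_c = A @ R_x.T
    return R_c, a - R_c @ T_x

R_c0, T_c0 = cam_to_plate(A0, a0)   # first image, robot stationary

# six pure translations along +/- x, y, z of the end frame
dirs = [np.array(v, float) for v in
        [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]]
X, B = [], []
for k, dhat in enumerate(dirs):
    d = (0.03 + 0.01 * k) * dhat                 # magnitude unknown to solver
    R_ck, T_ck = cam_to_plate(A0, a0 + A0 @ d)   # simulated PnP measurement
    X.append(-dhat)                              # T_e = -d, as a unit vector
    rhs = R_ck.T @ (T_c0 - T_ck)
    B.append(rhs / np.linalg.norm(rhs))          # unit vector on the right
X, B = np.array(X).T, np.array(B).T              # both 3 x 6

R_est = (B @ X.T) @ np.linalg.inv(X @ X.T)       # equation (11)
```

Here T_e = −d because the tracked space point sits at the old end-frame origin, so its coordinates in the moved end frame are the negative of the commanded translation; normalizing both sides removes the unknown magnitude.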
Step 2: calibration of the hand-eye translation relationship
2.1) operate the robot back to the initial position and acquire a first image;
2.2) operate the robot to rotate about the robot end origin around the x axis of the end coordinate system, acquiring an image after each rotation (at least one rotation, for example two);
2.3) likewise rotate about the end origin around the y axis (at least one rotation, for example two);
2.4) likewise rotate about the end origin around the z axis (at least one rotation, for example two);
2.5) through image processing and the PNP algorithm, obtain the rotation transformation matrices R_ci and the translation transformation matrices T_ci between the camera coordinate system and the calibration-plate coordinate system for the 7 pictures, i = 0, 1, 2, …, 6;
2.6) the 7 pictures yield 6 corresponding equations. Since the end origin does not translate during a pure rotation about it, T_e = (0, 0, 0)^T, and formula (8) reduces, for movement i, to formula (12):

(R_ci − R_c0) · T_x = T_c0 − T_ci    (12)

and stacking the six instances of (12) gives formula (13):

[R_c1 − R_c0; R_c2 − R_c0; …; R_c6 − R_c0] · T_x = [T_c0 − T_c1; T_c0 − T_c2; …; T_c0 − T_c6]    (13)

2.7) equation (13) has the form A·x = b, where A is an 18 × 3 matrix, x = T_x is a 3 × 1 matrix, and b is an 18 × 1 matrix. Then:

x = (A^T · A)^(−1) · (A^T · b)    (14)
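The translation step can be checked the same way: with the simulated transforms below (ground-truth values invented for the simulation, not from the patent), six pure rotations about the end origin are stacked into the 18 × 3 system of equation (13) and T_x is recovered by the normal equations of (14):

```python
import numpy as np

def rot(axis, deg):
    """Rotation matrix about a principal axis, angle in degrees."""
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    if axis == "x":
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == "y":
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# invented ground truth, as in the rotation check
R_x = rot("z", 30) @ rot("y", -20)
T_x = np.array([0.05, -0.03, 0.10])
A0, a0 = rot("x", 15) @ rot("z", 40), np.array([0.4, 0.2, 0.5])

def cam_to_plate(A, a):
    """Camera->plate transform implied by the end pose (A, a)."""
    R_c = A @ R_x.T
    return R_c, a - R_c @ T_x

R_c0, T_c0 = cam_to_plate(A0, a0)

# six pure rotations about the end origin (end position a0 is unchanged)
moves = [("x", 10), ("x", -15), ("y", 12), ("y", -8), ("z", 20), ("z", -12)]
A_rows, b_rows = [], []
for axis, ang in moves:
    R_ck, T_ck = cam_to_plate(A0 @ rot(axis, ang), a0)
    A_rows.append(R_ck - R_c0)      # one 3x3 block of equation (13)
    b_rows.append(T_c0 - T_ck)
A_mat = np.vstack(A_rows)           # 18 x 3
b_vec = np.concatenate(b_rows)      # 18

T_est = np.linalg.solve(A_mat.T @ A_mat, A_mat.T @ b_vec)   # equation (14)
```

Rotations about all three axes are needed so that the stacked matrix A has full column rank; rotations about a single axis leave one direction of T_x unobservable.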
Example 2
During the calibration of the eye-outside-hand relationship, the camera coordinate system is fixed relative to the world coordinate system, so the three-dimensional coordinates P_camera of any space point P in the camera coordinate system remain unchanged.
Suppose the three-dimensional coordinates of point P in the end coordinate system before and after the robot end moves are P_end1 and P_end2, and its three-dimensional coordinates in the calibration-plate coordinate system are P_target1 and P_target2. Then:

P_end2 = R_e · P_end1 + T_e    (15)

wherein R_e is the pose rotation matrix from the end coordinate system before the robot end moves to the end coordinate system after the move, and T_e is the corresponding pose translation matrix.

P_target1 = R_y · P_end1 + T_y    (16)

wherein R_y is the pose rotation matrix between the end coordinate system and the calibration-plate coordinate system, and T_y is the corresponding pose translation matrix, i.e. the hand-eye calibration matrix. In the same way:

P_target2 = R_y · P_end2 + T_y    (17)

P_camera1 = R_t1 · P_target1 + T_t1    (18)

wherein R_t1 and T_t1 are the pose rotation and translation matrices from the calibration-plate coordinate system to the camera coordinate system before the robot end moves. In the same way:

P_camera2 = R_t2 · P_target2 + T_t2    (19)

wherein R_t2 and T_t2 are the pose rotation and translation matrices from the calibration-plate coordinate system to the camera coordinate system after the robot end moves. Since P_camera1 = P_camera2, from (18) and (19):

R_t1 · P_target1 + T_t1 = R_t2 · P_target2 + T_t2    (20)

For convenience, take P_end1 = (0, 0, 0)^T, so that P_end2 = T_e; substituting (16) and (17) into (20):

R_t1 · T_y + T_t1 = R_t2 · (R_y · T_e + T_y) + T_t2    (21)

namely:

R_t2 · R_y · T_e = (R_t1 − R_t2) · T_y + (T_t1 − T_t2)    (22)

The invention moves the robot in a specific sequence and uses formula (22) to solve the hand-eye calibration matrix; the specific steps comprise calibration of the hand-eye rotation relationship and calibration of the hand-eye translation relationship.
Step 1: calibration of the hand-eye rotation relationship
1.1) fix the calibration plate to the robot end (in the eye-outside-hand case the camera is stationary and the plate moves with the hand);
1.2) ensure that the calibration plate is within the camera's field of view and images clearly, and acquire a first image;
1.3) operate the robot to translate several times along the x axis of the end coordinate system, acquiring an image after each translation (at least one translation, for example two);
1.4) likewise translate along the y axis (at least one translation, for example two);
1.5) likewise translate along the z axis (at least one translation, for example two);
1.6) through image processing and the PNP algorithm, obtain the rotation transformation matrices R_ti and the translation transformation matrices T_ti between the calibration-plate coordinate system and the camera coordinate system for the 7 pictures, i = 0, 1, 2, …, 6;
1.7) the 7 pictures yield 6 corresponding equations. Since the plate does not rotate relative to the camera during a pure translation of the end, R_ti = R_t0, and formula (22) reduces, for movement i, to formula (23):

R_y · T_e,i = R_ti^T · (T_t0 − T_ti)    (23)

and the six instances of (23) combine into formula (24):

R_y · [T_e,1 … T_e,6] = [R_t1^T·(T_t0 − T_t1) … R_t6^T·(T_t0 − T_t6)]    (24)

In the case where the movement distance along each axis of the end coordinate system is unknown, the calculation can be performed with unit vectors: each column on both sides is normalized before solving.
1.8) equation (24) has the form A·x = b, where A = R_y is a 3 × 3 matrix, x is a 3 × 6 matrix, and b is a 3 × 6 matrix. Then:

A = (b · x^T) · (x · x^T)^(−1)    (25)
Step 2: calibration of the hand-eye translation relationship
2.1) operate the robot back to the initial position and acquire a first image;
2.2) operate the robot to rotate about the robot end origin around the x axis of the end coordinate system, acquiring an image after each rotation (at least one rotation, for example two);
2.3) likewise rotate about the end origin around the y axis (at least one rotation, for example two);
2.4) likewise rotate about the end origin around the z axis (at least one rotation, for example two);
2.5) through image processing and the PNP algorithm, obtain the rotation transformation matrices R_ti and the translation transformation matrices T_ti between the calibration-plate coordinate system and the camera coordinate system for the 7 pictures, i = 0, 1, 2, …, 6;
2.6) the 7 pictures yield 6 corresponding equations. Since the end origin does not translate during a pure rotation about it, T_e = (0, 0, 0)^T, and formula (22) reduces, for movement i, to formula (26):

(R_ti − R_t0) · T_y = T_t0 − T_ti    (26)

and stacking the six instances of (26) gives formula (27):

[R_t1 − R_t0; R_t2 − R_t0; …; R_t6 − R_t0] · T_y = [T_t0 − T_t1; T_t0 − T_t2; …; T_t0 − T_t6]    (27)

2.7) equation (27) has the form A·x = b, where A is an 18 × 3 matrix, x = T_y is a 3 × 1 matrix, and b is an 18 × 1 matrix. Then:

x = (A^T · A)^(−1) · (A^T · b)    (28)
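The eye-outside-hand equations (25) and (28) have exactly the same structure as the eye-on-hand case, with the plate now rigidly attached to the end and the camera fixed. The following self-contained NumPy sketch (ground-truth values invented for the simulation, not from the patent) recovers both the rotation R_y and the translation T_y of the end-to-plate hand-eye transform:

```python
import numpy as np

def rot(axis, deg):
    """Rotation matrix about a principal axis, angle in degrees."""
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    if axis == "x":
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == "y":
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# invented ground truth: end->plate hand-eye transform, initial end pose
R_y = rot("x", 25) @ rot("z", -35)
T_y = np.array([-0.02, 0.06, 0.04])
A0, a0 = rot("y", 30), np.array([0.3, -0.1, 0.6])

def plate_to_cam(A, a):
    """Plate->camera (R_t, T_t): the fixed camera frame is the world frame."""
    R_t = A @ R_y.T
    return R_t, a - R_t @ T_y

R_t0, T_t0 = plate_to_cam(A0, a0)

# rotation relationship from six pure end-frame translations, eq. (25)
dirs = [np.array(v, float) for v in
        [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]]
X, B = [], []
for k, dhat in enumerate(dirs):
    R_tk, T_tk = plate_to_cam(A0, a0 + A0 @ ((0.02 + 0.01 * k) * dhat))
    X.append(-dhat)                       # T_e = -d, as a unit vector
    r = R_tk.T @ (T_t0 - T_tk)
    B.append(r / np.linalg.norm(r))
X, B = np.array(X).T, np.array(B).T
R_est = (B @ X.T) @ np.linalg.inv(X @ X.T)

# translation relationship from six pure rotations about the end origin, eq. (28)
moves = [("x", 14), ("x", -9), ("y", 11), ("y", -16), ("z", 13), ("z", -7)]
A_rows, b_rows = [], []
for axis, ang in moves:
    R_tk, T_tk = plate_to_cam(A0 @ rot(axis, ang), a0)
    A_rows.append(R_tk - R_t0)
    b_rows.append(T_t0 - T_tk)
A_mat, b_vec = np.vstack(A_rows), np.concatenate(b_rows)
T_est = np.linalg.solve(A_mat.T @ A_mat, A_mat.T @ b_vec)
```

In a real run the (R_ti, T_ti) pairs would come from the PNP algorithm instead of the plate_to_cam helper; everything downstream of the measurements is identical.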
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the present invention.
Claims (10)
1. A rapid hand-eye calibration method for a robot, characterized in that an eye-on-hand calibration scheme is adopted, the method comprising the following steps:
step 1), from the three-dimensional coordinates of a space point in the robot end coordinate system before the end moves, calculate its three-dimensional coordinates in the end coordinate system after the end moves; likewise calculate the point's three-dimensional coordinates in the camera coordinate system and in the calibration-plate coordinate system before and after the end moves;
the calibration plate is fixed within the robot's range of motion and the camera's field of view, ensuring that it remains in the field of view and images clearly;
step 2), calibrate the hand-eye rotation relationship:
2.1) keep the robot stationary and acquire an image of the first pose with the camera;
2.2) operate the robot to translate several times along the three coordinate axes of the end coordinate system, acquiring images at the different translation poses;
2.3) from the images at the translation poses, compute the hand-eye rotation relationship by least squares;
step 3), calibrate the hand-eye translation relationship:
3.1) operate the robot back to the initial position;
3.2) operate the robot to rotate several times about the origin of the end coordinate system, acquiring images at the different rotation poses;
3.3) from the images at the rotation poses, compute the hand-eye translation relationship by least squares;
step 4), the hand-eye calibration of the robot is complete.
2. The rapid hand-eye calibration method for the robot according to claim 1, wherein the step 1 specifically comprises:
1.1) calculating the three-dimensional coordinate P_end2 of the space point in the terminal coordinate system after the tail end of the robot moves;
wherein:
P_end1 is the three-dimensional coordinate of the space point in the terminal coordinate system before the tail end of the robot moves;
the pose rotation matrix is from the terminal coordinate system before the tail end of the robot moves to the terminal coordinate system after the move;
the pose translation matrix is from the terminal coordinate system before the tail end of the robot moves to the terminal coordinate system after the move;
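A minimal numeric sketch of step 1.1, assuming, as the claim states, that the rotation matrix R and translation matrix T map coordinates from the pre-move terminal frame to the post-move terminal frame; the particular values are hypothetical:

```python
import numpy as np

# Hypothetical frame-to-frame transform: 90 degrees about z plus a 0.1 m offset
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])
T = np.array([0.0, 0.0, 0.1])

P_end1 = np.array([0.2, 0.0, 0.0])   # space point in the terminal frame before the move
P_end2 = R @ P_end1 + T              # step 1.1: its coordinate after the move
```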
1.2) calculating the three-dimensional coordinate P_camera1 of the space point in the camera coordinate system before the tail end of the robot moves;
wherein,
1.3) calculating the three-dimensional coordinate P_camera2 of the space point in the camera coordinate system after the tail end of the robot moves;
1.4) calculating the three-dimensional coordinate P_target1 of the space point in the calibration plate coordinate system before the tail end of the robot moves;
wherein,
the pose rotation matrix is from the camera coordinate system to the calibration plate coordinate system before the tail end of the robot moves;
the pose translation matrix is from the camera coordinate system to the calibration plate coordinate system before the tail end of the robot moves;
1.5) calculating the three-dimensional coordinate P_target2 of the space point in the calibration plate coordinate system after the tail end of the robot moves;
wherein,
the pose rotation matrix is from the camera coordinate system to the calibration plate coordinate system after the tail end of the robot moves;
3. The rapid hand-eye calibration method for a robot according to claim 2, characterized in that:
the coordinates of the space point in the calibration plate coordinate system are unchanged before and after the tail end of the robot moves, i.e. P_target1 = P_target2; according to formula 4 and formula 5:
taking the space point to be the origin of the terminal coordinate system before the tail end of the robot moves, i.e. P_end1 = (0, 0, 0)ᵀ, the coordinate relationship is obtained according to formula 6:
and calibrating the hand and the eye by utilizing the coordinate relation.
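The invariance P_target1 = P_target2 that claim 3 relies on can be checked numerically. The sketch below uses hypothetical poses: a calibration plate fixed in the base frame, a space point fixed in the base frame, and two arbitrary camera poses standing in for the camera before and after the tail end moves:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from R (3x3) and t (3,)."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def inv_T(T):
    """Invert a rigid transform using R^T and -R^T t."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

# Hypothetical fixed board pose and fixed space point, both in the base frame
T_base_target = make_T(np.eye(3), np.array([1.0, 0.0, 0.0]))
p_base = np.array([1.5, 0.2, 0.3, 1.0])            # homogeneous coordinates

# Two hypothetical camera poses (the camera rides on the moving tail end)
c, s = np.cos(0.3), np.sin(0.3)
R2 = np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])
camera_poses = [make_T(np.eye(3), np.array([0., 0., 1.])),
                make_T(R2, np.array([0.1, 0., 1.]))]

# For each pose: the camera->board transform (what PnP would deliver),
# the point in camera coordinates, then the point in board coordinates.
p_targets = []
for T_base_cam in camera_poses:
    T_cam_target = inv_T(T_base_cam) @ T_base_target
    p_cam = inv_T(T_base_cam) @ p_base
    p_targets.append((inv_T(T_cam_target) @ p_cam)[:3])
```

Because both the plate and the point are fixed in the base frame, the board-frame coordinate comes out identical for every camera pose.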
4. The rapid hand-eye calibration method for the robot according to claim 3, wherein the step 2.3) is specifically as follows: according to the images of the plurality of translation poses, a rotation transformation matrix and a translation transformation matrix between a corresponding camera coordinate system and a calibration plate coordinate system are obtained through image processing and a PNP algorithm;
when the tail end of the robot undergoes no relative rotation before and after the move, the hand-eye rotation relationship is obtained according to formula 7;
the tail end of the robot is translated a plurality of times along the x, y and z axes of the terminal coordinate system; the resulting displacement vectors on the robot side and on the camera side are regarded as two vectors, and both are taken as unit vectors for the calculation;
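The patent's formula 7 and its symbols are not reproduced in this text, so the sketch below shows one common reading of claim 4 for the eye-on-hand case: when the tail end translates by a vector d (expressed in the terminal frame), the calibration plate origin appears to shift by -R·d in the camera frame, where R is the hand-eye rotation (camera from terminal); three axis translations then determine R by a linear solve. All poses and values are hypothetical:

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1., 0., 0.], [0., c, -s], [0., s, c]])

R_true = rot_z(0.4) @ rot_x(0.2)   # hypothetical hand-eye rotation (camera <- terminal)

# Unit translations of the tail end along its own x, y, z axes (columns of D),
# and the simulated board-origin displacements seen by the camera (columns of M),
# generated under the assumed model M = -R D.
D = np.eye(3)
M = -R_true @ D

# Recover the rotation from the three vector pairs: R = -M D^-1
R_est = -M @ np.linalg.inv(D)
```

With noisy measurements the same data would instead feed the least-squares solve of formula (28), followed by re-orthonormalization of the estimate.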
5. The rapid hand-eye calibration method for the robot according to claim 3, wherein the step 3.3) is specifically as follows: according to the images of the plurality of rotation poses, a rotation transformation matrix and a translation transformation matrix between the corresponding camera coordinate system and calibration plate coordinate system are obtained through image processing and a PnP algorithm;
when the tail end of the robot undergoes no translation before and after the motion, the hand-eye translation relationship is obtained according to formula 7.
6. A rapid hand-eye calibration method for a robot is characterized in that an eye-outside-hand calibration method is adopted, and the method comprises the following steps:
step 1), calculating the three-dimensional coordinate of a space point in the terminal coordinate system after the tail end of the robot moves, according to the three-dimensional coordinate of the space point in the terminal coordinate system before the move; and respectively calculating the three-dimensional coordinates of the space point in the camera coordinate system and in the calibration plate coordinate system, both before and after the tail end of the robot moves;
the calibration plate is fixedly arranged within the moving range of the robot and the field of view of the camera, ensuring that the calibration plate stays within the field of view and is imaged clearly;
Step 2), calibrating the rotation relationship of the hands and the eyes;
2.1), keeping the robot in a fixed state, and acquiring an image of a first pose by a camera;
2.2) operating the robot to translate for multiple times along three coordinate axis directions of the terminal coordinate system, and acquiring multiple images with different translation poses;
2.3) calculating the rotation relation of the hands and the eyes by utilizing a least square method according to the images of the plurality of translation poses;
step 3), calibrating the translation relation of the hands and the eyes;
3.1), operating the robot to return to the initial position;
3.2) operating the robot to rotate for multiple times around the origin of the terminal coordinate system, and acquiring multiple images with different rotation poses;
3.3) calculating the translation relation of the hands and the eyes by utilizing a least square method according to the images of the plurality of rotation poses;
and 4) completing the calibration of the hands and eyes of the robot.
7. The rapid hand-eye calibration method for a robot according to claim 6, wherein:
when the eye-outside-hand calibration is adopted, the step 1 specifically comprises the following steps:
1.1) calculating the three-dimensional coordinate P_end2 of the space point in the terminal coordinate system after the tail end of the robot moves;
wherein:
P_end1 is the three-dimensional coordinate of the space point in the terminal coordinate system before the tail end of the robot moves;
the pose rotation matrix is from the terminal coordinate system before the tail end of the robot moves to the terminal coordinate system after the move;
the pose translation matrix is from the terminal coordinate system before the tail end of the robot moves to the terminal coordinate system after the move;
1.2) calculating the three-dimensional coordinate P_camera1 of the space point in the camera coordinate system before the tail end of the robot moves;
wherein,
the pose rotation matrix is from the calibration plate coordinate system to the camera coordinate system before the tail end of the robot moves;
the pose translation matrix is from the calibration plate coordinate system to the camera coordinate system before the tail end of the robot moves;
1.3) calculating the three-dimensional coordinate P_camera2 of the space point in the camera coordinate system after the tail end of the robot moves;
wherein,
the pose rotation matrix is from the calibration plate coordinate system to the camera coordinate system after the tail end of the robot moves;
the pose translation matrix is from the calibration plate coordinate system to the camera coordinate system after the tail end of the robot moves;
1.4) calculating the three-dimensional coordinate P_target1 of the space point in the calibration plate coordinate system before the tail end of the robot moves;
wherein,
the pose translation matrix is from the terminal coordinate system to the calibration plate coordinate system;
1.5) calculating the three-dimensional coordinate P_target2 of the space point in the calibration plate coordinate system after the tail end of the robot moves;
8. The rapid hand-eye calibration method for a robot according to claim 7, characterized in that:
when the eye is calibrated outside the hand, the coordinates of the space point in the camera coordinate system are unchanged before and after the tail end of the robot moves, i.e. P_camera1 = P_camera2; according to formula 11 and formula 12:
taking the space point to be the origin of the terminal coordinate system before the tail end of the robot moves, i.e. P_end1 = (0, 0, 0)ᵀ, the coordinate relationship is obtained according to formula 15:
and calibrating the hand and the eye by utilizing the coordinate relation.
9. The rapid hand-eye calibration method for the robot according to claim 8, wherein the step 2.3) is specifically as follows: according to the images of the plurality of translation poses, a rotation transformation matrix and a translation transformation matrix between a corresponding camera coordinate system and a calibration plate coordinate system are obtained through image processing and a PNP algorithm;
when the tail end of the robot undergoes no relative rotation before and after the move, the hand-eye rotation relationship is obtained according to formula 16;
the tail end of the robot is translated a plurality of times along the x, y and z axes of the terminal coordinate system; the resulting displacement vectors on the robot side and on the camera side are regarded as two vectors, and both are taken as unit vectors for the calculation;
10. The rapid hand-eye calibration method for the robot according to claim 8, wherein the step 3.3) is specifically as follows: according to the images of the plurality of rotation poses, a rotation transformation matrix and a translation transformation matrix between the corresponding camera coordinate system and calibration plate coordinate system are obtained through image processing and a PnP algorithm;
when the tail end of the robot undergoes no translation before and after the motion, the hand-eye translation relationship is obtained according to formula 16.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210346172.9A CN114833822B (en) | 2022-03-31 | 2022-03-31 | Rapid hand-eye calibration method for robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114833822A true CN114833822A (en) | 2022-08-02 |
CN114833822B CN114833822B (en) | 2023-09-19 |
Family
ID=82563810
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116160454A (en) * | 2023-03-28 | 2023-05-26 | 重庆智能机器人研究院 | Robot tail end plane vision hand-eye calibration algorithm model |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110103217A (en) * | 2019-05-09 | 2019-08-09 | 电子科技大学 | Industrial robot hand and eye calibrating method |
CN111791227A (en) * | 2019-12-31 | 2020-10-20 | 深圳市豪恩声学股份有限公司 | Robot hand-eye calibration method and device and robot |
CN114147728A (en) * | 2022-02-07 | 2022-03-08 | 杭州灵西机器人智能科技有限公司 | Universal robot eye on-hand calibration method and system |
CN114227700A (en) * | 2022-02-23 | 2022-03-25 | 杭州灵西机器人智能科技有限公司 | Hand-eye calibration method and system for robot |
WO2022061673A1 (en) * | 2020-09-24 | 2022-03-31 | 西门子(中国)有限公司 | Calibration method and device for robot |
Non-Patent Citations (1)
Title |
---|
LAN Hao; ZHANG Xi; SHANG Jihui: "A hand-eye calibration technique based on a line-scan camera", Metrology & Measurement Technology, no. 05 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 20220812
Address after: No. 108, Aerospace West Road, National Civil Aerospace Industry Base, Xi'an City, Shaanxi Province 710100
Applicant after: Xi'an Aerospace Times precision electromechanical Co.,Ltd.
Address before: 710100 No. 106, Hangtian West Road, aerospace base, Xi'an, Shaanxi Province
Applicant before: XI'AN AEROSPACE PRECISION ELECTROMECHANICAL INSTITUTE
GR01 | Patent grant | ||