CN114833822A - Rapid hand-eye calibration method for robot - Google Patents

Rapid hand-eye calibration method for robot

Info

Publication number
CN114833822A
CN114833822A (application CN202210346172.9A; granted publication CN114833822B)
Authority
CN
China
Prior art keywords
robot
coordinate system
terminal
moves
tail end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210346172.9A
Other languages
Chinese (zh)
Other versions
CN114833822B (en)
Inventor
杨娜 (Yang Na)
张力力 (Zhang Lili)
罗华 (Luo Hua)
李瑞峰 (Li Ruifeng)
郭静 (Guo Jing)
郭超 (Guo Chao)
Current Assignee
Xi'an Aerospace Times Precision Electromechanical Co., Ltd.
Original Assignee
Xian Aerospace Precision Electromechanical Institute
Priority date
Filing date
Publication date
Application filed by Xian Aerospace Precision Electromechanical Institute
Priority to CN202210346172.9A
Publication of CN114833822A
Application granted
Publication of CN114833822B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1653Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a rapid hand-eye calibration method for a robot, aiming at solving the technical problem that traditional hand-eye calibration must take calibration-plate images acquired by the vision system, together with the corresponding robot pose information, as input, which makes the process cumbersome. The calibration method comprises the following steps: step 1, from the three-dimensional coordinates of a space point in the end coordinate system before the robot end moves, calculating its three-dimensional coordinates in the end coordinate system after the move, and calculating the three-dimensional coordinates of the space point in the camera coordinate system and in the calibration-plate coordinate system before and after the robot end moves; step 2, calibrating the hand-eye rotation relationship; step 3, calibrating the hand-eye translation relationship; and step 4, completing the robot hand-eye calibration.

Description

Rapid hand-eye calibration method for robot
Technical Field
The invention relates to the technical field of robot calibration, in particular to a rapid hand-eye calibration method for a robot.
Background
At present, most robot hand-eye calibration procedures compute the transformation between the robot end and the camera from the base-to-end transformations and the camera-to-calibration-plate transformations at several photographing positions, thereby obtaining the hand-eye calibration result.
Conventional calibration methods fall into two main categories: linear and nonlinear. Linear methods include the two-step method, the common calibration method, the mathematical method and the motion-limiting method; the nonlinear method is the nonlinear motion method.
All of these methods require the robot end poses as input, matched one-to-one with the images acquired by the camera, so the operating procedure is complex.
Chinese patent CN113814987A, "Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium", discloses computing the eye-in-hand calibration between an end camera and the robot end, and the eye-to-hand calibration between an environment camera and the robot base, from initial angle data of the environment camera, the end camera and a calibration standard in the robot base coordinate system. However, that method still uses the pose relationship between the robot end and the robot base when computing the eye-in-hand relationship, and this relationship must then be applied in the subsequent eye-to-hand calibration.
Disclosure of Invention
The invention aims to solve the technical problem that traditional hand-eye calibration requires, as input, calibration-plate images acquired by the vision system together with the corresponding robot pose information, which makes the process cumbersome. It provides a rapid hand-eye calibration method for a robot that is applicable to calibrating both the eye-in-hand and the eye-to-hand relationship.
The design idea of the invention is as follows: because the three-dimensional coordinates of a fixed space point in the world coordinate system do not change, the pose transformation relations between the coordinate systems can be exploited so that the transformation between the robot base and the robot end is not needed; only the transformation between the camera and the calibration plate is required. The pose between the robot and the vision system is computed from images alone, so the hand-eye calibration result is obtained quickly.
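The frame-chaining that this design idea relies on can be sketched in a few lines of numpy (assumed available; all frame values below are invented for illustration and are not taken from the invention):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def inv_T(T):
    """Invert a rigid transform: R' = R^T, t' = -R^T t."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical end->camera and camera->plate transforms (numbers invented).
T_end_cam = make_T(rot_z(0.3), np.array([0.01, 0.02, 0.10]))
T_cam_tar = make_T(rot_z(-0.5), np.array([0.20, -0.10, 0.80]))

# A point given in the end frame, expressed in the plate frame by chaining.
p_end = np.array([0.05, 0.0, 0.30, 1.0])   # homogeneous coordinates
p_tar = T_cam_tar @ T_end_cam @ p_end

# The inverse chain recovers the original point exactly.
p_back = inv_T(T_end_cam) @ inv_T(T_cam_tar) @ p_tar
assert np.allclose(p_back, p_end)
```

Only camera-to-plate transforms of this kind, as delivered by PnP on the calibration-plate images, enter the method; no base-to-end pose appears anywhere in the chain.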
In order to achieve the purpose, the invention adopts the technical scheme that:
A rapid hand-eye calibration method for a robot, characterized in that an eye-in-hand (eye on hand) configuration is adopted, the method comprising the following steps:
step 1), from the three-dimensional coordinates of the space point in the end coordinate system before the robot end moves, calculating its three-dimensional coordinates in the end coordinate system after the move; and calculating the three-dimensional coordinates of the space point in the camera coordinate system and in the calibration-plate coordinate system before and after the robot end moves;
the calibration plate being fixedly placed within the robot's range of motion and the camera's field of view, so that the plate remains within the field of view and is clearly imaged;
step 2), calibrating the hand-eye rotation relationship:
2.1), keeping the robot stationary, the camera acquiring an image at the first pose;
2.2), operating the robot to translate several times along the three coordinate-axis directions of the end coordinate system, and acquiring images at the different translation poses;
2.3), calculating the hand-eye rotation relationship by the least-squares method from the images at the several translation poses;
step 3), calibrating the hand-eye translation relationship:
3.1), operating the robot to return to the initial position;
3.2), operating the robot to rotate several times about the origin of the end coordinate system, and acquiring images at the different rotation poses;
3.3), calculating the hand-eye translation relationship by the least-squares method from the images at the several rotation poses;
and step 4), completing the hand-eye calibration of the robot.
Further, step 1 specifically comprises:

1.1) Calculate the three-dimensional coordinate $P_{end2}$ of the space point in the end coordinate system after the robot end moves:

$$P_{end2} = R_{12}^{end} P_{end1} + T_{12}^{end} \qquad (1)$$

where $P_{end1}$ is the three-dimensional coordinate of the space point in the end coordinate system before the robot end moves, $R_{12}^{end}$ is the pose rotation matrix from the end coordinate system before the move to the end coordinate system after the move, and $T_{12}^{end}$ is the corresponding pose translation vector.

1.2) Calculate the three-dimensional coordinate $P_{camera1}$ of the space point in the camera coordinate system before the robot end moves:

$$P_{camera1} = R_{end}^{cam} P_{end1} + T_{end}^{cam} \qquad (2)$$

where $R_{end}^{cam}$ is the pose rotation matrix from the end coordinate system to the camera coordinate system, and $T_{end}^{cam}$ is the corresponding pose translation vector.

1.3) Calculate the three-dimensional coordinate $P_{camera2}$ of the space point in the camera coordinate system after the robot end moves:

$$P_{camera2} = R_{end}^{cam} P_{end2} + T_{end}^{cam} \qquad (3)$$

1.4) Calculate the three-dimensional coordinate $P_{target1}$ of the space point in the calibration-plate coordinate system before the robot end moves:

$$P_{target1} = R_{cam1}^{tar} P_{camera1} + T_{cam1}^{tar} \qquad (4)$$

where $R_{cam1}^{tar}$ is the pose rotation matrix from the camera coordinate system to the calibration-plate coordinate system before the robot end moves, and $T_{cam1}^{tar}$ is the corresponding pose translation vector.

1.5) Calculate the three-dimensional coordinate $P_{target2}$ of the space point in the calibration-plate coordinate system after the robot end moves:

$$P_{target2} = R_{cam2}^{tar} P_{camera2} + T_{cam2}^{tar} \qquad (5)$$

where $R_{cam2}^{tar}$ is the pose rotation matrix from the camera coordinate system to the calibration-plate coordinate system after the robot end moves, and $T_{cam2}^{tar}$ is the corresponding pose translation vector.
Further, since the coordinates of the space point in the calibration-plate coordinate system are unchanged before and after the robot end moves, $P_{target1} = P_{target2}$, and from formulas (4) and (5):

$$R_{cam1}^{tar}(R_{end}^{cam} P_{end1} + T_{end}^{cam}) + T_{cam1}^{tar} = R_{cam2}^{tar}(R_{end}^{cam} P_{end2} + T_{end}^{cam}) + T_{cam2}^{tar} \qquad (6)$$

Taking the space point to be the origin of the end coordinate system before the robot end moves, $P_{end1} = (0,0,0)^T$ and therefore $P_{end2} = T_{12}^{end}$; formula (6) then yields the coordinate relationship

$$R_{cam2}^{tar} R_{end}^{cam} T_{12}^{end} = (R_{cam1}^{tar} - R_{cam2}^{tar}) T_{end}^{cam} + T_{cam1}^{tar} - T_{cam2}^{tar} \qquad (7)$$

and the hand-eye calibration is carried out using this coordinate relationship.
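This coordinate relationship can be checked numerically. The numpy sketch below (randomly generated poses stand in for the PnP measurements; every variable name is an illustrative assumption) simulates an eye-in-hand setup in the plate (world) frame and verifies that both sides agree:

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_rot():
    """Random rotation matrix via QR of a Gaussian matrix (det forced to +1)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q * np.sign(np.linalg.det(Q))

# Simulated ground-truth hand-eye transform (end -> camera); values invented.
R_ec, T_ec = rand_rot(), rng.normal(size=3)

# Two arbitrary end poses expressed in the world (= calibration-plate) frame.
Re1, te1 = rand_rot(), rng.normal(size=3)
Re2, te2 = rand_rot(), rng.normal(size=3)

# Camera->plate transforms that PnP would measure at the two poses.
R_ct1, T_ct1 = Re1 @ R_ec.T, te1 - Re1 @ R_ec.T @ T_ec
R_ct2, T_ct2 = Re2 @ R_ec.T, te2 - Re2 @ R_ec.T @ T_ec

# End motion in the convention P_end2 = R12 @ P_end1 + T12, written for a
# point that is fixed in the world.
R12 = Re2.T @ Re1
T12 = Re2.T @ (te1 - te2)

# With P_end1 = 0 (point = old end origin), the two routes to the plate
# frame must agree: R_ct1 T_ec + T_ct1 = R_ct2 (R_ec T12 + T_ec) + T_ct2.
lhs = R_ct1 @ T_ec + T_ct1
rhs = R_ct2 @ (R_ec @ T12 + T_ec) + T_ct2
assert np.allclose(lhs, rhs)
```

Rearranged, the same identity is exactly the form used for calibration: $R_{cam2}^{tar} R_{end}^{cam} T_{12}^{end} = (R_{cam1}^{tar} - R_{cam2}^{tar}) T_{end}^{cam} + T_{cam1}^{tar} - T_{cam2}^{tar}$.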
Further, step 2.3) specifically comprises: from the images at the several translation poses, the rotation matrices and translation vectors between the corresponding camera coordinate system and calibration-plate coordinate system are obtained by image processing and the PnP algorithm.

Since the robot end undergoes no relative rotation before and after each move, $R_{cam1}^{tar} = R_{cam2}^{tar}$, and formula (7) reduces to

$$R_{end}^{cam} T_{12}^{end} = (R_{cam2}^{tar})^{-1}(T_{cam1}^{tar} - T_{cam2}^{tar}) \qquad (8)$$

The robot end performs several translations along the x, y and z axes of the end coordinate system; $T_{12}^{end}$ and $T_{cam1}^{tar} - T_{cam2}^{tar}$ are regarded as two vectors and are both normalized to unit vectors for the calculation. The hand-eye rotation matrix $R_{end}^{cam}$ is then obtained from formula (8) by the least-squares method.
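Step 2.3) can be sketched as follows (numpy; simulated PnP outputs replace real image processing, every name and number is an illustrative assumption, and the sign convention $T_{12}^{end} = -t$ for a world-fixed point is made explicit in the comments):

```python
import numpy as np

rng = np.random.default_rng(1)

def rand_rot():
    """Random rotation matrix via QR of a Gaussian matrix (det forced to +1)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q * np.sign(np.linalg.det(Q))

unit = lambda v: v / np.linalg.norm(v)

# Simulated ground truth (invented): hand-eye transform and fixed end pose.
R_ec, T_ec = rand_rot(), rng.normal(size=3)
Re, te = rand_rot(), rng.normal(size=3)
R_ct = Re @ R_ec.T          # camera->plate rotation, constant (no end rotation)

def T_ct(te_i):
    """Camera->plate translation PnP would report for end position te_i."""
    return te_i - R_ct @ T_ec

# Two translations along each end-frame axis (6 moves); magnitudes arbitrary
# because everything is normalized to unit vectors below.
moves = [a * s for a in np.eye(3) for s in (0.05, 0.12)]

x_cols, b_cols = [], []
for t in moves:
    te2 = te + Re @ t                  # end translates by t along its own axes
    dT = T_ct(te) - T_ct(te2)          # measured camera->plate translation delta
    x_cols.append(unit(-t))            # T12 = -t for a world-fixed point
    b_cols.append(unit(R_ct.T @ dT))

x = np.column_stack(x_cols)            # 3x6 known unit translation directions
b = np.column_stack(b_cols)            # 3x6 measured unit vectors
A = (b @ x.T) @ np.linalg.inv(x @ x.T) # least-squares solve A x = b
assert np.allclose(A, R_ec)            # A recovers the hand-eye rotation
```

With six exact unit-direction equations the solve is exact; with real, noisy measurements the same formula gives the least-squares fit.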
Further, step 3.3) specifically comprises: from the images at the several rotation poses, the rotation matrices and translation vectors between the corresponding camera coordinate system and calibration-plate coordinate system are obtained by image processing and the PnP algorithm.

Since the robot end does not translate before and after each move, $T_{12}^{end} = (0,0,0)^T$, and formula (7) reduces to

$$(R_{cam1}^{tar} - R_{cam2}^{tar}) T_{end}^{cam} = T_{cam2}^{tar} - T_{cam1}^{tar} \qquad (9)$$

The hand-eye translation vector $T_{end}^{cam}$ is then obtained from formula (9) by the least-squares method.
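Step 3.3) can be sketched the same way: pure rotations about the end origin leave that origin fixed in the world, and stacking the resulting linear equations yields the hand-eye translation by the normal equations (numpy; all values simulated and illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def rand_rot():
    """Random rotation matrix via QR of a Gaussian matrix (det forced to +1)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q * np.sign(np.linalg.det(Q))

# Simulated ground truth (invented): hand-eye transform; end origin is fixed
# in the world while the end rotates about it.
R_ec, T_ec = rand_rot(), rng.normal(size=3)
te = rng.normal(size=3)

def pnp(Re):
    """Camera->plate (R, T) that PnP would report for end orientation Re."""
    R_ct = Re @ R_ec.T
    return R_ct, te - R_ct @ T_ec

R0, T0 = pnp(rand_rot())               # first pose

# Six rotations about the end origin; pair each new pose with the first.
A_rows, b_rows = [], []
for _ in range(6):
    Ri, Ti = pnp(rand_rot())
    A_rows.append(R0 - Ri)             # stacked (R_cam1 - R_cam2) blocks
    b_rows.append(Ti - T0)             # stacked (T_cam2 - T_cam1) blocks

A = np.vstack(A_rows)                  # 18x3
b = np.concatenate(b_rows)             # 18
x = np.linalg.inv(A.T @ A) @ (A.T @ b) # normal equations
assert np.allclose(x, T_ec)            # x recovers the hand-eye translation
```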
A rapid hand-eye calibration method for a robot, characterized in that an eye-to-hand (eye outside hand) configuration is adopted, the method comprising the following steps:
step 1), from the three-dimensional coordinates of the space point in the end coordinate system before the robot end moves, calculating its three-dimensional coordinates in the end coordinate system after the move; and calculating the three-dimensional coordinates of the space point in the camera coordinate system and in the calibration-plate coordinate system before and after the robot end moves;
the calibration plate being placed within the robot's range of motion and the camera's field of view, so that the plate remains within the field of view and is clearly imaged;
step 2), calibrating the hand-eye rotation relationship:
2.1), keeping the robot stationary, the camera acquiring an image at the first pose;
2.2), operating the robot to translate several times along the three coordinate-axis directions of the end coordinate system, and acquiring images at the different translation poses;
2.3), calculating the hand-eye rotation relationship by the least-squares method from the images at the several translation poses;
step 3), calibrating the hand-eye translation relationship:
3.1), operating the robot to return to the initial position;
3.2), operating the robot to rotate several times about the origin of the end coordinate system, and acquiring images at the different rotation poses;
3.3), calculating the hand-eye translation relationship by the least-squares method from the images at the several rotation poses;
and step 4), completing the hand-eye calibration of the robot.
Further, step 1 specifically comprises:

1.1) Calculate the three-dimensional coordinate $P_{end2}$ of the space point in the end coordinate system after the robot end moves:

$$P_{end2} = R_{12}^{end} P_{end1} + T_{12}^{end} \qquad (10)$$

where $P_{end1}$ is the three-dimensional coordinate of the space point in the end coordinate system before the robot end moves, $R_{12}^{end}$ is the pose rotation matrix from the end coordinate system before the move to the end coordinate system after the move, and $T_{12}^{end}$ is the corresponding pose translation vector.

1.2) Calculate the three-dimensional coordinate $P_{camera1}$ of the space point in the camera coordinate system before the robot end moves:

$$P_{camera1} = R_{tar1}^{cam} P_{target1} + T_{tar1}^{cam} \qquad (11)$$

where $R_{tar1}^{cam}$ is the pose rotation matrix from the calibration-plate coordinate system to the camera coordinate system before the robot end moves, and $T_{tar1}^{cam}$ is the corresponding pose translation vector.

1.3) Calculate the three-dimensional coordinate $P_{camera2}$ of the space point in the camera coordinate system after the robot end moves:

$$P_{camera2} = R_{tar2}^{cam} P_{target2} + T_{tar2}^{cam} \qquad (12)$$

where $R_{tar2}^{cam}$ is the pose rotation matrix from the calibration-plate coordinate system to the camera coordinate system after the robot end moves, and $T_{tar2}^{cam}$ is the corresponding pose translation vector.

1.4) Calculate the three-dimensional coordinate $P_{target1}$ of the space point in the calibration-plate coordinate system before the robot end moves:

$$P_{target1} = R_{end}^{tar} P_{end1} + T_{end}^{tar} \qquad (13)$$

where $R_{end}^{tar}$ is the pose rotation matrix from the end coordinate system to the calibration-plate coordinate system, and $T_{end}^{tar}$ is the corresponding pose translation vector.

1.5) Calculate the three-dimensional coordinate $P_{target2}$ of the space point in the calibration-plate coordinate system after the robot end moves:

$$P_{target2} = R_{end}^{tar} P_{end2} + T_{end}^{tar} \qquad (14)$$
Further, in eye-to-hand calibration the coordinates of the space point in the camera coordinate system are unchanged before and after the robot end moves, i.e. $P_{camera1} = P_{camera2}$, and from formulas (11) and (12):

$$R_{tar1}^{cam}(R_{end}^{tar} P_{end1} + T_{end}^{tar}) + T_{tar1}^{cam} = R_{tar2}^{cam}(R_{end}^{tar} P_{end2} + T_{end}^{tar}) + T_{tar2}^{cam} \qquad (15)$$

Taking the space point to be the origin of the end coordinate system before the robot end moves, $P_{end1} = (0,0,0)^T$ and therefore $P_{end2} = T_{12}^{end}$; formula (15) then yields the coordinate relationship

$$R_{tar2}^{cam} R_{end}^{tar} T_{12}^{end} = (R_{tar1}^{cam} - R_{tar2}^{cam}) T_{end}^{tar} + T_{tar1}^{cam} - T_{tar2}^{cam} \qquad (16)$$

and the hand-eye calibration is carried out using this coordinate relationship.
Further, step 2.3) specifically comprises: from the images at the several translation poses, the rotation matrices and translation vectors between the corresponding camera coordinate system and calibration-plate coordinate system are obtained by image processing and the PnP algorithm.

Since the robot end undergoes no relative rotation before and after each move, $R_{tar1}^{cam} = R_{tar2}^{cam}$, and formula (16) reduces to

$$R_{end}^{tar} T_{12}^{end} = (R_{tar2}^{cam})^{-1}(T_{tar1}^{cam} - T_{tar2}^{cam}) \qquad (17)$$

The robot end performs several translations along the x, y and z axes of the end coordinate system; $T_{12}^{end}$ and $T_{tar1}^{cam} - T_{tar2}^{cam}$ are regarded as two vectors and are both normalized to unit vectors for the calculation. The hand-eye rotation matrix $R_{end}^{tar}$ is then obtained from formula (17) by the least-squares method.
Further, step 3.3) specifically comprises: from the images at the several rotation poses, the rotation matrices and translation vectors between the corresponding camera coordinate system and calibration-plate coordinate system are obtained by image processing and the PnP algorithm.

Since the robot end does not translate before and after each move, $T_{12}^{end} = (0,0,0)^T$, and formula (16) reduces to

$$(R_{tar1}^{cam} - R_{tar2}^{cam}) T_{end}^{tar} = T_{tar2}^{cam} - T_{tar1}^{cam} \qquad (18)$$

The hand-eye translation vector $T_{end}^{tar}$ is then obtained from formula (18) by the least-squares method.
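The eye-to-hand equations (17) and (18) have exactly the same algebraic shape as the eye-in-hand case, with the plate mounted on the hand and the camera fixed. A simulated numpy sketch (all names, poses and the pairing of shots are invented for illustration) that recovers both parts of the end-to-plate hand-eye transform:

```python
import numpy as np

rng = np.random.default_rng(3)

def rand_rot():
    """Random rotation matrix via QR of a Gaussian matrix (det forced to +1)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q * np.sign(np.linalg.det(Q))

unit = lambda v: v / np.linalg.norm(v)

# Ground truth (invented): end->plate hand-eye transform; the fixed camera
# frame doubles as the world frame.
R_et, T_et = rand_rot(), rng.normal(size=3)

def pnp(Re, te):
    """Plate->camera (R, T) that PnP would report for end pose (Re, te)."""
    R_tc = Re @ R_et.T
    return R_tc, te - R_tc @ T_et

Re, te = rand_rot(), rng.normal(size=3)

# Rotation part: pure end translations; R_tc stays constant, only T changes.
R_tc, T0 = pnp(Re, te)
x_cols, b_cols = [], []
for t in [a * s for a in np.eye(3) for s in (0.07, 0.11)]:
    _, Ti = pnp(Re, te + Re @ t)
    x_cols.append(unit(-t))                 # T12 = -t for a world-fixed point
    b_cols.append(unit(R_tc.T @ (T0 - Ti)))
x, b = np.column_stack(x_cols), np.column_stack(b_cols)
R_est = (b @ x.T) @ np.linalg.inv(x @ x.T)  # least-squares rotation solve
assert np.allclose(R_est, R_et)

# Translation part: pure rotations about the end origin; te stays constant.
R0, T0 = pnp(Re, te)
A_rows, b_rows = [], []
for _ in range(6):
    Ri, Ti = pnp(rand_rot(), te)
    A_rows.append(R0 - Ri)
    b_rows.append(Ti - T0)
A, bb = np.vstack(A_rows), np.concatenate(b_rows)
T_est = np.linalg.inv(A.T @ A) @ (A.T @ bb) # normal-equations translation solve
assert np.allclose(T_est, T_et)
```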
Compared with the prior art, the invention has the following beneficial technical effects:
Because the three-dimensional coordinates of a fixed space point in the world coordinate system do not change, the rapid hand-eye calibration method provided by the invention exploits the pose transformation relations between the coordinate systems so that the transformation between the robot base and the robot end is not needed; simply by controlling the robot's motion and acquiring calibration-plate images with the camera, calibration of both the eye-in-hand and the eye-to-hand relationship can be completed quickly.
Drawings
FIG. 1 is a flow chart of the steps of the rapid hand-eye calibration method of the robot of the present invention;
fig. 2 is a schematic diagram of a calibration board used for calibration in the embodiment of the present invention.
Detailed Description
In order to make the objects, advantages and features of the present invention more clear, a rapid hand-eye calibration method for a robot according to the present invention is further described in detail with reference to the accompanying drawings and specific embodiments. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention and are not intended to limit the scope of the present invention.
As shown in fig. 1, the rapid hand-eye calibration method provided by the invention uses only the calibration-plate images (fig. 2) acquired by the camera to resolve the pose between the robot and the vision system for hand-eye calibration. The specific steps are as follows:
Example 1
During calibration of the eye-in-hand relationship, the calibration-plate coordinate system is fixed relative to the world coordinate system, so the three-dimensional coordinate $P_{target}$ of any space point P in the calibration-plate coordinate system remains unchanged.

Suppose the three-dimensional coordinates of point P in the end coordinate system before and after the robot end moves are $P_{end1}$ and $P_{end2}$, and its three-dimensional coordinates in the camera coordinate system are $P_{camera1}$ and $P_{camera2}$. Then

$$P_{end2} = R_{12}^{end} P_{end1} + T_{12}^{end} \qquad (1)$$

where $R_{12}^{end}$ is the pose rotation matrix from the end coordinate system before the robot end moves to the end coordinate system after the move, and $T_{12}^{end}$ is the corresponding pose translation vector;

$$P_{camera1} = R_{end}^{cam} P_{end1} + T_{end}^{cam} \qquad (2)$$

where $R_{end}^{cam}$ is the pose rotation matrix between the end coordinate system and the camera coordinate system, and $T_{end}^{cam}$ is the pose translation vector from the end coordinate system to the camera coordinate system; together they form the hand-eye calibration matrix.

Similarly:

$$P_{camera2} = R_{end}^{cam} P_{end2} + T_{end}^{cam} \qquad (3)$$

$$P_{target1} = R_{cam1}^{tar} P_{camera1} + T_{cam1}^{tar} \qquad (4)$$

where $R_{cam1}^{tar}$ and $T_{cam1}^{tar}$ are the pose rotation matrix and pose translation vector from the camera coordinate system to the calibration-plate coordinate system before the robot end moves.

Similarly:

$$P_{target2} = R_{cam2}^{tar} P_{camera2} + T_{cam2}^{tar} \qquad (5)$$

where $R_{cam2}^{tar}$ and $T_{cam2}^{tar}$ are the pose rotation matrix and pose translation vector from the camera coordinate system to the calibration-plate coordinate system after the robot end moves.

Since $P_{target1} = P_{target2}$, from (4) and (5):

$$R_{cam1}^{tar}(R_{end}^{cam} P_{end1} + T_{end}^{cam}) + T_{cam1}^{tar} = R_{cam2}^{tar}(R_{end}^{cam} P_{end2} + T_{end}^{cam}) + T_{cam2}^{tar} \qquad (6)$$

For convenience, take $P_{end1} = (0,0,0)^T$, so that $P_{end2} = T_{12}^{end}$; then:

$$R_{cam1}^{tar} T_{end}^{cam} + T_{cam1}^{tar} = R_{cam2}^{tar}(R_{end}^{cam} T_{12}^{end} + T_{end}^{cam}) + T_{cam2}^{tar} \qquad (7)$$

namely:

$$R_{cam2}^{tar} R_{end}^{cam} T_{12}^{end} = (R_{cam1}^{tar} - R_{cam2}^{tar}) T_{end}^{cam} + T_{cam1}^{tar} - T_{cam2}^{tar} \qquad (8)$$
The invention computes the hand-eye calibration matrix by moving the robot in a specific sequence and applying formula (8). The procedure comprises two parts: calibration of the hand-eye rotation relationship and calibration of the hand-eye translation relationship.

Step 1: hand-eye rotation relationship calibration

1.1) Place the calibration plate within the robot's later working range;
1.2) ensure the calibration plate lies within the camera's field of view and is clearly imaged, and acquire a first image;
1.3) translate the robot several times along the x axis of the end coordinate system, acquiring an image after each move (at least one move, for example two);
1.4) translate the robot several times along the y axis of the end coordinate system, acquiring an image after each move (at least one move, for example two);
1.5) translate the robot several times along the z axis of the end coordinate system, acquiring an image after each move (at least one move, for example two);
1.6) By image processing and the PnP algorithm, obtain the rotation matrix $R_{cam,i}^{tar}$ and translation vector $T_{cam,i}^{tar}$ between the camera coordinate system and the calibration-plate coordinate system for each of the 7 pictures, $i = 0, 1, 2, \ldots, 6$.

1.7) The 7 pictures yield 6 corresponding equations. Between successive pictures the end undergoes a pure translation, so $R_{cam,i-1}^{tar} = R_{cam,i}^{tar}$, and formula (8) gives, for each pair $(i-1, i)$:

$$R_{end}^{cam} \hat{T}_{12,i}^{end} = (R_{cam,i}^{tar})^{-1}\, \widehat{\Delta T_i}, \quad i = 1, \ldots, 6 \qquad (9)$$

which can be collected into the matrix equation

$$R_{end}^{cam} \left[\, \hat{T}_{12,1}^{end} \;\cdots\; \hat{T}_{12,6}^{end} \,\right] = \left[\, (R_{cam,1}^{tar})^{-1} \widehat{\Delta T_1} \;\cdots\; (R_{cam,6}^{tar})^{-1} \widehat{\Delta T_6} \,\right] \qquad (10)$$

Since the distances moved along the end coordinate axes need not be known, the calculation uses unit vectors throughout: $\hat{v}$ denotes the unit vector of $v$, and $\widehat{\Delta T_i}$ the unit vector of $T_{cam,i-1}^{tar} - T_{cam,i}^{tar}$. The unit directions $\hat{T}_{12,i}^{end}$ are the known axes (x, y or z of the end coordinate system) along which each translation was commanded.
1.8) Formula (10) has the form $A x = b$, where $A$ is the unknown 3×3 matrix $R_{end}^{cam}$, $x$ is the 3×6 matrix of unit translation directions, and $b$ is the 3×6 matrix of measured unit vectors. Then:

$$A = (b \cdot x^T) \cdot (x \cdot x^T)^{-1} \qquad (11)$$

which gives the hand-eye rotation matrix $R_{end}^{cam}$.
Step 2: hand-eye translation relationship calibration

2.1) Return the robot to the initial position and acquire a first image;
2.2) rotate the robot about the x axis of the end coordinate system, around the end origin, acquiring an image after each rotation (at least one rotation, for example two);
2.3) rotate the robot about the y axis of the end coordinate system, around the end origin, acquiring an image after each rotation (at least one rotation, for example two);
2.4) rotate the robot about the z axis of the end coordinate system, around the end origin, acquiring an image after each rotation (at least one rotation, for example two);
2.5) by image processing and the PnP algorithm, obtain the rotation matrix $R_{cam,i}^{tar}$ and translation vector $T_{cam,i}^{tar}$ between the camera coordinate system and the calibration-plate coordinate system for each of the 7 pictures, $i = 0, 1, 2, \ldots, 6$.
2.6) The 7 pictures yield 6 corresponding equations. Within each pair of poses the end undergoes a pure rotation about its origin, so $T_{12}^{end} = (0,0,0)^T$, and formula (8) gives, for each pair $(i-1, i)$:

$$(R_{cam,i-1}^{tar} - R_{cam,i}^{tar})\, T_{end}^{cam} = T_{cam,i}^{tar} - T_{cam,i-1}^{tar}, \quad i = 1, \ldots, 6 \qquad (12)$$

Stacking the 6 equations yields:

$$\begin{bmatrix} R_{cam,0}^{tar} - R_{cam,1}^{tar} \\ \vdots \\ R_{cam,5}^{tar} - R_{cam,6}^{tar} \end{bmatrix} T_{end}^{cam} = \begin{bmatrix} T_{cam,1}^{tar} - T_{cam,0}^{tar} \\ \vdots \\ T_{cam,6}^{tar} - T_{cam,5}^{tar} \end{bmatrix} \qquad (13)$$
2.7) Formula (13) has the form $A x = b$, where $A$ is the stacked 18×3 matrix, $x$ is the unknown 3×1 vector $T_{end}^{cam}$, and $b$ is the stacked 18×1 vector. Then:

$$x = (A^T \cdot A)^{-1} \cdot (A^T \cdot b) \qquad (14)$$

which gives the hand-eye translation vector $T_{end}^{cam}$.
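The whole Example 1 procedure (7 pictures under pure translation, then 7 pictures under rotation about the end origin) can be put together in one simulated numpy run. This sketch assumes successive pictures are paired in the translation phase and each later picture is paired with the first in the rotation phase, which the text does not spell out, and it replaces PnP with its ideal output:

```python
import numpy as np

rng = np.random.default_rng(4)

def rand_rot():
    """Random rotation matrix via QR of a Gaussian matrix (det forced to +1)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q * np.sign(np.linalg.det(Q))

unit = lambda v: v / np.linalg.norm(v)

# Invented ground truth: end->camera hand-eye transform, initial end pose.
R_ec, T_ec = rand_rot(), rng.normal(size=3)
Re, te0 = rand_rot(), rng.normal(size=3)

def pnp(Re_i, te_i):
    """Camera->plate (R, T) that PnP would report for end pose (Re_i, te_i)."""
    R_ct = Re_i @ R_ec.T
    return R_ct, te_i - R_ct @ T_ec

# Step 1: 7 pictures under pure translation, two moves per end axis.
moves = [a * s for a in np.eye(3) for s in (0.06, 0.09)]
shots, te = [pnp(Re, te0)], te0
for t in moves:
    te = te + Re @ t                      # cumulative translation in end axes
    shots.append(pnp(Re, te))

x_cols, b_cols = [], []
for (_, T1), (R2, T2), t in zip(shots, shots[1:], moves):
    x_cols.append(unit(-t))               # T12 = -t (world-fixed point)
    b_cols.append(unit(R2.T @ (T1 - T2)))
x, b = np.column_stack(x_cols), np.column_stack(b_cols)
R_est = (b @ x.T) @ np.linalg.inv(x @ x.T)     # formula (11)
assert np.allclose(R_est, R_ec)

# Step 2: back to the start, 7 pictures under rotation about the end origin.
shots = [pnp(Re, te0)]
for _ in range(6):
    shots.append(pnp(rand_rot(), te0))
R0, T0 = shots[0]
A = np.vstack([R0 - Ri for Ri, _ in shots[1:]])        # 18x3
bb = np.concatenate([Ti - T0 for _, Ti in shots[1:]])  # 18
T_est = np.linalg.inv(A.T @ A) @ (A.T @ bb)            # formula (14)
assert np.allclose(T_est, T_ec)
```

Both solves are exact here because the simulated measurements are noiseless; with real images the same two formulas give least-squares estimates.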
Example 2
During calibration of the eye-to-hand relationship, the camera coordinate system is fixed relative to the world coordinate system, so the three-dimensional coordinate $P_{camera}$ of any space point P in the camera coordinate system remains unchanged.

Suppose the three-dimensional coordinates of point P in the end coordinate system before and after the robot end moves are $P_{end1}$ and $P_{end2}$, and its three-dimensional coordinates in the calibration-plate coordinate system are $P_{target1}$ and $P_{target2}$. Then

$$P_{end2} = R_{12}^{end} P_{end1} + T_{12}^{end} \qquad (15)$$

where $R_{12}^{end}$ is the pose rotation matrix from the end coordinate system before the robot end moves to the end coordinate system after the move, and $T_{12}^{end}$ is the corresponding pose translation vector;

$$P_{target1} = R_{end}^{tar} P_{end1} + T_{end}^{tar} \qquad (16)$$

where $R_{end}^{tar}$ is the pose rotation matrix between the end coordinate system and the calibration-plate coordinate system, and $T_{end}^{tar}$ is the pose translation vector from the end coordinate system to the calibration-plate coordinate system; together they form the hand-eye calibration matrix.

Similarly:

$$P_{target2} = R_{end}^{tar} P_{end2} + T_{end}^{tar} \qquad (17)$$

$$P_{camera1} = R_{tar1}^{cam} P_{target1} + T_{tar1}^{cam} \qquad (18)$$

where $R_{tar1}^{cam}$ and $T_{tar1}^{cam}$ are the pose rotation matrix and pose translation vector from the calibration-plate coordinate system to the camera coordinate system before the robot end moves.

Similarly:

$$P_{camera2} = R_{tar2}^{cam} P_{target2} + T_{tar2}^{cam} \qquad (19)$$

where $R_{tar2}^{cam}$ and $T_{tar2}^{cam}$ are the pose rotation matrix and pose translation vector from the calibration-plate coordinate system to the camera coordinate system after the robot end moves.

Since $P_{camera1} = P_{camera2}$, from (18) and (19):

$$R_{tar1}^{cam}(R_{end}^{tar} P_{end1} + T_{end}^{tar}) + T_{tar1}^{cam} = R_{tar2}^{cam}(R_{end}^{tar} P_{end2} + T_{end}^{tar}) + T_{tar2}^{cam} \qquad (20)$$

For convenience, take $P_{end1} = (0,0,0)^T$, so that $P_{end2} = T_{12}^{end}$; then:

$$R_{tar1}^{cam} T_{end}^{tar} + T_{tar1}^{cam} = R_{tar2}^{cam}(R_{end}^{tar} T_{12}^{end} + T_{end}^{tar}) + T_{tar2}^{cam} \qquad (21)$$

namely:

$$R_{tar2}^{cam} R_{end}^{tar} T_{12}^{end} = (R_{tar1}^{cam} - R_{tar2}^{cam}) T_{end}^{tar} + T_{tar1}^{cam} - T_{tar2}^{cam} \qquad (22)$$
the invention utilizes a robot moving mode in a specific sequence and combines a formula (22) to calculate a hand-eye calibration matrix, and the specific steps comprise: calibrating the rotation relation of the hands and the eyes and calibrating the translation relation of the hands and the eyes.
Step 1: hand-eye rotation relationship calibration

1.1) Place the calibration plate within the robot's subsequent working range;

1.2) Ensure that the calibration plate is within the camera's field of view and imaged clearly, and acquire a first image;

1.3) Operate the robot to translate several times along the x axis of the terminal coordinate system, acquiring an image after each translation (at least one translation, for example two);

1.4) Operate the robot to translate several times along the y axis of the terminal coordinate system, acquiring an image after each translation (at least one translation, for example two);

1.5) Operate the robot to translate several times along the z axis of the terminal coordinate system, acquiring an image after each translation (at least one translation, for example two);
1.6) Through image processing and the PnP algorithm, obtain the rotation transformation matrices $R_{target_i}^{camera}$ and translation transformation matrices $T_{target_i}^{camera}$ between the calibration plate coordinate system and the camera coordinate system corresponding to the 7 pictures, $i = 0, 1, 2, \ldots, 6$;
1.7) The 7 pictures yield the corresponding 6 equations of the form (23), which can be combined into formula (24):

$$R_{end}^{target} \cdot T_{end0}^{end_i} = \left( R_{target_i}^{camera} \right)^{-1} \cdot \left( T_{target0}^{camera} - T_{target_i}^{camera} \right), \quad i = 1, 2, \ldots, 6 \quad (23)$$

$$R_{end}^{target} \cdot \left[ x_1 \; x_2 \; \cdots \; x_6 \right] = \left[ b_1 \; b_2 \; \cdots \; b_6 \right] \quad (24)$$

wherein, when the movement distance in the robot terminal coordinate system is unknown, the calculation can be performed with unit vectors:

$x_i$ is the unit vector of the terminal translation $T_{end0}^{end_i}$ of the $i$-th move;

$b_i$ is the unit vector of $\left( R_{target_i}^{camera} \right)^{-1} \cdot \left( T_{target0}^{camera} - T_{target_i}^{camera} \right)$ for the $i$-th move.

Since a rotation preserves vector length, equation (23) still holds after both sides are normalized to unit vectors.
1.8) Equation (24) has the form $A x = b$, wherein $A$ is a 3×3 matrix, $x$ is a 3×6 matrix, and $b$ is a 3×6 matrix. Then:

$$A = (b \cdot x^T) \cdot (x \cdot x^T)^{-1} \quad (25)$$

from which the hand-eye rotation $R_{end}^{target}$ is obtained.
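Steps 1.1)–1.8) can be sketched with synthetic, noise-free data (all values below are hypothetical; in practice the columns of $b$ would come from the PnP results). With exact measurements, equation (25) recovers the rotation exactly:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_rotation(rng):
    # QR of a random matrix gives an orthonormal basis; fix the sign so det = +1.
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1
    return Q

# Hypothetical ground-truth hand-eye rotation (end -> target).
R_true = random_rotation(rng)

# Six unit translation directions: two moves along each axis of the end frame,
# as in steps 1.3)-1.5). These are the columns of x in equation (24).
X = np.column_stack([e for e in np.eye(3) for _ in range(2)])  # 3 x 6

# Camera-side unit vectors, the columns of b: noise-free data gives b_i = R_true x_i.
B = R_true @ X

# Equation (25): A = (b x^T)(x x^T)^{-1}  -- least-squares estimate of the rotation.
A_est = (B @ X.T) @ np.linalg.inv(X @ X.T)
```

With noisy measurements the same formula gives the least-squares fit over all six moves rather than an exact recovery.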
Step 2: hand-eye translation relationship calibration

2.1) Operate the robot to return to the initial position and acquire a first image;

2.2) Operate the robot to rotate about the x axis of the terminal coordinate system, about the robot terminal origin, acquiring an image after each rotation (at least one rotation, for example two);

2.3) Operate the robot to rotate about the y axis of the terminal coordinate system, about the robot terminal origin, acquiring an image after each rotation (at least one rotation, for example two);

2.4) Operate the robot to rotate about the z axis of the terminal coordinate system, about the robot terminal origin, acquiring an image after each rotation (at least one rotation, for example two);
2.5) Through image processing and the PnP algorithm, obtain the rotation transformation matrices $R_{target_i}^{camera}$ and translation transformation matrices $T_{target_i}^{camera}$ between the calibration plate coordinate system and the camera coordinate system corresponding to the 7 pictures, wherein $i = 0, 1, 2, \ldots, 6$;
2.6) The 7 pictures yield the corresponding 6 equations of the form (26), which can be combined into formula (27):

$$\left( R_{target0}^{camera} - R_{target_i}^{camera} \right) \cdot T_{end}^{target} = T_{target_i}^{camera} - T_{target0}^{camera}, \quad i = 1, 2, \ldots, 6 \quad (26)$$

$$A \cdot T_{end}^{target} = b \quad (27)$$

wherein, defining for $i = 1, 2, \ldots, 6$:

$A_i = R_{target0}^{camera} - R_{target_i}^{camera}$ and $b_i = T_{target_i}^{camera} - T_{target0}^{camera}$,

the blocks are stacked as

$$A = \begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_6 \end{bmatrix}, \qquad b = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_6 \end{bmatrix}$$
2.7) Equation (27) has the form $A x = b$, wherein $A$ is an 18×3 matrix, $x$ is a 3×1 vector, and $b$ is an 18×1 vector. Then:

$$x = (A^T \cdot A)^{-1} \cdot (A^T \cdot b) \quad (28)$$

from which the hand-eye translation $T_{end}^{target}$ is obtained.
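Steps 2.1)–2.7) admit the same kind of numerical sketch (all values hypothetical; the blocks of $A$ and $b$ would come from the PnP results in practice). Six pure rotations about the terminal origin, two about each axis, make $A^T A$ invertible and equation (28) recovers the translation:

```python
import numpy as np

def rot(axis, angle):
    """Rodrigues formula: rotation of `angle` radians about a unit axis."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

# Hypothetical ground truth: hand-eye transform end -> target and the initial
# end pose expressed in the fixed camera frame.
R_et, T_et = rot([2, 1, 1], 0.5), np.array([0.04, -0.03, 0.08])
R_ec0, T_ec0 = rot([0, 1, 2], 0.6), np.array([0.4, 0.2, 0.9])

# Plate pose observed at the initial position (target -> camera).
R_tc0 = R_ec0 @ R_et.T
T_tc0 = T_ec0 - R_tc0 @ T_et

# Six pure rotations about the end-frame origin, as in steps 2.2)-2.4).
A_rows, b_rows = [], []
for axis in ([1, 0, 0], [0, 1, 0], [0, 0, 1]):
    for angle in (0.2, 0.4):
        R_e0i = rot(axis, angle)        # end motion, zero translation
        R_eci = R_ec0 @ R_e0i.T         # end pose after the rotation
        R_tci = R_eci @ R_et.T          # observed plate rotation
        T_tci = T_ec0 - R_tci @ T_et    # observed plate translation
        A_rows.append(R_tc0 - R_tci)    # one 3x3 block of A in equation (27)
        b_rows.append(T_tci - T_tc0)    # matching 3x1 block of b

A = np.vstack(A_rows)        # 18 x 3
b = np.concatenate(b_rows)   # 18
# Equation (28): x = (A^T A)^{-1} A^T b  -- least-squares estimate of T_end^target.
T_est = np.linalg.inv(A.T @ A) @ (A.T @ b)
```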
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the present invention.

Claims (10)

1. A rapid hand-eye calibration method for a robot, characterized in that an eye-in-hand calibration method is adopted, the method comprising the following steps:

step 1), from the three-dimensional coordinates of a space point in the terminal coordinate system before the robot terminal moves, calculating its three-dimensional coordinates in the terminal coordinate system after the terminal moves; and calculating the three-dimensional coordinates of the space point in the camera coordinate system and in the calibration plate coordinate system, both before and after the robot terminal moves;

the calibration plate is fixedly placed within the robot's moving range and the camera's field of view, ensuring that the calibration plate is within the field of view and imaged clearly;
step 2), calibrating the rotation relationship of the hands and the eyes;
2.1), keeping the robot in a fixed state, and acquiring an image of a first pose by a camera;
2.2) operating the robot to translate for multiple times along three coordinate axis directions of the terminal coordinate system, and acquiring multiple images with different translation poses;
2.3) calculating the rotation relation of the hands and the eyes by utilizing a least square method according to the images of the plurality of translation poses;
step 3), calibrating the translation relation of the hands and the eyes;
3.1), operating the robot to return to the initial position;
3.2) operating the robot to rotate for multiple times around the origin of the terminal coordinate system, and acquiring multiple images with different rotation poses;
3.3) calculating the translation relation of the hands and the eyes by utilizing a least square method according to the images of the plurality of rotation poses;
and 4) completing the calibration of the hands and eyes of the robot.
2. The rapid hand-eye calibration method for the robot according to claim 1, wherein the step 1 specifically comprises:
1.1) calculating the three-dimensional coordinate $P_{end2}$ of the space point in the terminal coordinate system after the robot terminal moves:

$$P_{end2} = R_{end1}^{end2} \cdot P_{end1} + T_{end1}^{end2} \quad (1)$$

wherein:

$P_{end1}$ is the three-dimensional coordinate of the space point in the terminal coordinate system before the robot terminal moves;

$R_{end1}^{end2}$ is the pose rotation matrix from the terminal coordinate system before the robot terminal moves to the terminal coordinate system after it moves;

$T_{end1}^{end2}$ is the pose translation matrix from the terminal coordinate system before the robot terminal moves to the terminal coordinate system after it moves;
1.2) calculating the three-dimensional coordinate $P_{camera1}$ of the space point in the camera coordinate system before the robot terminal moves:

$$P_{camera1} = R_{end}^{camera} \cdot P_{end1} + T_{end}^{camera} \quad (2)$$

wherein:

$R_{end}^{camera}$ is the pose rotation matrix from the terminal coordinate system to the camera coordinate system;

$T_{end}^{camera}$ is the pose translation matrix from the terminal coordinate system to the camera coordinate system;
1.3) calculating the three-dimensional coordinate $P_{camera2}$ of the space point in the camera coordinate system after the robot terminal moves:

$$P_{camera2} = R_{end}^{camera} \cdot P_{end2} + T_{end}^{camera} \quad (3)$$
1.4) calculating the three-dimensional coordinate $P_{target1}$ of the space point in the calibration plate coordinate system before the robot terminal moves:

$$P_{target1} = R_{camera1}^{target} \cdot P_{camera1} + T_{camera1}^{target} \quad (4)$$

wherein:

$R_{camera1}^{target}$ is the pose rotation matrix from the camera coordinate system to the calibration plate coordinate system before the robot terminal moves;

$T_{camera1}^{target}$ is the pose translation matrix from the camera coordinate system to the calibration plate coordinate system before the robot terminal moves;
1.5) calculating the three-dimensional coordinate $P_{target2}$ of the space point in the calibration plate coordinate system after the robot terminal moves:

$$P_{target2} = R_{camera2}^{target} \cdot P_{camera2} + T_{camera2}^{target} \quad (5)$$

wherein:

$R_{camera2}^{target}$ is the pose rotation matrix from the camera coordinate system to the calibration plate coordinate system after the robot terminal moves;

$T_{camera2}^{target}$ is the pose translation matrix from the camera coordinate system to the calibration plate coordinate system after the robot terminal moves.
3. The rapid hand-eye calibration method for a robot according to claim 2, characterized in that:
the coordinates of the space point in the coordinate system of the calibration plate before and after the tail end of the robot moves are unchanged, P target1 =P target2 According to formula 4 and formula 5:
Figure FDA0003576588890000034
defining the space point before the robot end moves to the origin of the end coordinate system, P end1 =(0,0,0) T The coordinate relationship is obtained according to equation 6:
Figure FDA0003576588890000035
and calibrating the hand and the eye by utilizing the coordinate relation.
4. The rapid hand-eye calibration method for the robot according to claim 3, wherein step 2.3) is specifically: according to the images of the plurality of translation poses, the rotation transformation matrices and translation transformation matrices between the corresponding camera coordinate system and calibration plate coordinate system are obtained through image processing and the PnP algorithm;

when the robot tail end undergoes no relative rotation before and after the movement, $R_{camera1}^{target} = R_{camera2}^{target}$, and the hand-eye rotation relationship is obtained from formula 7:

$$R_{end}^{camera} \cdot T_{end1}^{end2} = \left( R_{camera2}^{target} \right)^{-1} \cdot \left( T_{camera1}^{target} - T_{camera2}^{target} \right) \quad (8)$$

the robot tail end translates several times along the x, y and z axes of the tail end coordinate system; $T_{end1}^{end2}$ and $\left( R_{camera2}^{target} \right)^{-1} \cdot \left( T_{camera1}^{target} - T_{camera2}^{target} \right)$ are regarded as two vectors, and their unit vectors are taken for the calculation;

the hand-eye rotation relationship $R_{end}^{camera}$ is obtained by the least square method through formula 8.
5. The rapid hand-eye calibration method for the robot according to claim 3, wherein step 3.3) is specifically: according to the images of the plurality of rotation poses, the rotation transformation matrices and translation transformation matrices between the corresponding camera coordinate system and calibration plate coordinate system are obtained through image processing and the PnP algorithm;

when the robot tail end origin does not translate before and after the movement, $T_{end1}^{end2} = (0,0,0)^T$, and the hand-eye translation relationship is obtained from formula 7:

$$\left( R_{camera1}^{target} - R_{camera2}^{target} \right) \cdot T_{end}^{camera} = T_{camera2}^{target} - T_{camera1}^{target} \quad (9)$$

the hand-eye translation relationship $T_{end}^{camera}$ is obtained by the least square method through formula 9.
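The two degenerate cases used in claims 4 and 5 (pure translation for formula 8, pure rotation about the terminal origin for formula 9) can be verified on synthetic eye-in-hand transforms; all names and numeric values below are hypothetical, chosen only for illustration:

```python
import numpy as np

def rot(axis, angle):
    """Rodrigues formula: rotation of `angle` radians about a unit axis."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

# Hypothetical ground truth: hand-eye transform end -> camera, and the initial
# end pose expressed in the fixed calibration-plate (target) frame.
R_ec, T_ec = rot([1, 1, 0], 0.3), np.array([0.01, 0.02, 0.05])
R_et1, T_et1 = rot([0, 2, 1], 0.8), np.array([0.5, 0.1, 0.6])
R_c1t = R_et1 @ R_ec.T                 # camera -> target before the move
T_c1t = T_et1 - R_c1t @ T_ec

# Case 1 (formula 8): pure translation of the end, no relative rotation.
T_e12 = np.array([0.03, -0.01, 0.02])
R_et2, T_et2 = R_et1, T_et1 - R_et1 @ T_e12
R_c2t = R_et2 @ R_ec.T
T_c2t = T_et2 - R_c2t @ T_ec
lhs8 = R_ec @ T_e12
rhs8 = np.linalg.inv(R_c2t) @ (T_c1t - T_c2t)

# Case 2 (formula 9): pure rotation about the end-frame origin, zero translation.
R_e12 = rot([1, 0, 2], 0.25)
R_et2b = R_et1 @ R_e12.T               # translation of the end pose is unchanged
R_c2tb = R_et2b @ R_ec.T
T_c2tb = T_et1 - R_c2tb @ T_ec
lhs9 = (R_c1t - R_c2tb) @ T_ec
rhs9 = T_c2tb - T_c1t
```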
6. A rapid hand-eye calibration method for a robot, characterized in that an eye-to-hand calibration method is adopted, the method comprising the following steps:

step 1), from the three-dimensional coordinates of a space point in the terminal coordinate system before the robot terminal moves, calculating its three-dimensional coordinates in the terminal coordinate system after the terminal moves; and calculating the three-dimensional coordinates of the space point in the camera coordinate system and in the calibration plate coordinate system, both before and after the robot terminal moves;

the calibration plate is fixedly arranged within the robot's moving range and the camera's field of view, ensuring that the calibration plate is within the field of view and imaged clearly;
Step 2), calibrating the rotation relationship of the hands and the eyes;
2.1), keeping the robot in a fixed state, and acquiring an image of a first pose by a camera;
2.2) operating the robot to translate for multiple times along three coordinate axis directions of the terminal coordinate system, and acquiring multiple images with different translation poses;
2.3) calculating the rotation relation of the hands and the eyes by utilizing a least square method according to the images of the plurality of translation poses;
step 3), calibrating the translation relation of the hands and the eyes;
3.1), operating the robot to return to the initial position;
3.2) operating the robot to rotate for multiple times around the origin of the terminal coordinate system, and acquiring multiple images with different rotation poses;
3.3) calculating the translation relation of the hands and the eyes by utilizing a least square method according to the images of the plurality of rotation poses;
and 4) completing the calibration of the hands and eyes of the robot.
7. The rapid hand-eye calibration method for a robot according to claim 6, wherein:
when the eye-to-hand calibration is adopted, step 1 specifically comprises:

1.1) calculating the three-dimensional coordinate $P_{end2}$ of the space point in the terminal coordinate system after the robot terminal moves:

$$P_{end2} = R_{end1}^{end2} \cdot P_{end1} + T_{end1}^{end2} \quad (10)$$

wherein:

$P_{end1}$ is the three-dimensional coordinate of the space point in the terminal coordinate system before the robot terminal moves;

$R_{end1}^{end2}$ is the pose rotation matrix from the terminal coordinate system before the robot terminal moves to the terminal coordinate system after it moves;

$T_{end1}^{end2}$ is the pose translation matrix from the terminal coordinate system before the robot terminal moves to the terminal coordinate system after it moves;
1.2) calculating the three-dimensional coordinate $P_{camera1}$ of the space point in the camera coordinate system before the robot terminal moves:

$$P_{camera1} = R_{target1}^{camera} \cdot P_{target1} + T_{target1}^{camera} \quad (11)$$

wherein:

$R_{target1}^{camera}$ is the pose rotation matrix from the calibration plate coordinate system to the camera coordinate system before the robot terminal moves;

$T_{target1}^{camera}$ is the pose translation matrix from the calibration plate coordinate system to the camera coordinate system before the robot terminal moves;
1.3) calculating the three-dimensional coordinate $P_{camera2}$ of the space point in the camera coordinate system after the robot terminal moves:

$$P_{camera2} = R_{target2}^{camera} \cdot P_{target2} + T_{target2}^{camera} \quad (12)$$

wherein:

$R_{target2}^{camera}$ is the pose rotation matrix from the calibration plate coordinate system to the camera coordinate system after the robot terminal moves;

$T_{target2}^{camera}$ is the pose translation matrix from the calibration plate coordinate system to the camera coordinate system after the robot terminal moves;
1.4) calculating the three-dimensional coordinate $P_{target1}$ of the space point in the calibration plate coordinate system before the robot terminal moves:

$$P_{target1} = R_{end}^{target} \cdot P_{end1} + T_{end}^{target} \quad (13)$$

wherein:

$R_{end}^{target}$ is the pose rotation matrix from the terminal coordinate system to the calibration plate coordinate system;

$T_{end}^{target}$ is the pose translation matrix from the terminal coordinate system to the calibration plate coordinate system;
1.5) calculating the three-dimensional coordinate $P_{target2}$ of the space point in the calibration plate coordinate system after the robot terminal moves:

$$P_{target2} = R_{end}^{target} \cdot P_{end2} + T_{end}^{target} \quad (14)$$
8. The rapid hand-eye calibration method for a robot according to claim 7, characterized in that:
when the eye-to-hand calibration is adopted, the coordinates of the space point in the camera coordinate system are unchanged before and after the robot terminal moves, i.e., $P_{camera1} = P_{camera2}$; according to formula 11 and formula 12:

$$R_{target1}^{camera} \cdot P_{target1} + T_{target1}^{camera} = R_{target2}^{camera} \cdot P_{target2} + T_{target2}^{camera} \quad (15)$$

defining the space point before the robot terminal moves as the origin of the terminal coordinate system, $P_{end1} = (0,0,0)^T$, the following coordinate relationship is obtained from equation 15:

$$R_{target1}^{camera} \cdot T_{end}^{target} + T_{target1}^{camera} = R_{target2}^{camera} \cdot \left( R_{end}^{target} \cdot T_{end1}^{end2} + T_{end}^{target} \right) + T_{target2}^{camera} \quad (16)$$

and the hand-eye calibration is carried out using this coordinate relationship.
9. The rapid hand-eye calibration method for the robot according to claim 8, wherein step 2.3) is specifically: according to the images of the plurality of translation poses, the rotation transformation matrices and translation transformation matrices between the corresponding camera coordinate system and calibration plate coordinate system are obtained through image processing and the PnP algorithm;

when the robot tail end undergoes no relative rotation before and after the movement, $R_{target1}^{camera} = R_{target2}^{camera}$, and the hand-eye rotation relationship is obtained from formula 16:

$$R_{end}^{target} \cdot T_{end1}^{end2} = \left( R_{target2}^{camera} \right)^{-1} \cdot \left( T_{target1}^{camera} - T_{target2}^{camera} \right) \quad (17)$$

the robot tail end translates several times along the x, y and z axes of the tail end coordinate system; $T_{end1}^{end2}$ and $\left( R_{target2}^{camera} \right)^{-1} \cdot \left( T_{target1}^{camera} - T_{target2}^{camera} \right)$ are regarded as two vectors, and their unit vectors are taken for the calculation;

the hand-eye rotation relationship $R_{end}^{target}$ is obtained by the least square method through formula 17.
10. The rapid hand-eye calibration method for the robot according to claim 8, wherein step 3.3) is specifically: according to the images of the plurality of rotation poses, the rotation transformation matrices and translation transformation matrices between the corresponding camera coordinate system and calibration plate coordinate system are obtained through image processing and the PnP algorithm;

when the robot tail end origin does not translate before and after the movement, $T_{end1}^{end2} = (0,0,0)^T$, and the hand-eye translation relationship is obtained from formula 16:

$$\left( R_{target1}^{camera} - R_{target2}^{camera} \right) \cdot T_{end}^{target} = T_{target2}^{camera} - T_{target1}^{camera} \quad (18)$$

the hand-eye translation relationship $T_{end}^{target}$ is obtained by the least square method through formula 18.
CN202210346172.9A 2022-03-31 2022-03-31 Rapid hand-eye calibration method for robot Active CN114833822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210346172.9A CN114833822B (en) 2022-03-31 2022-03-31 Rapid hand-eye calibration method for robot

Publications (2)

Publication Number Publication Date
CN114833822A true CN114833822A (en) 2022-08-02
CN114833822B CN114833822B (en) 2023-09-19

Family

ID=82563810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210346172.9A Active CN114833822B (en) 2022-03-31 2022-03-31 Rapid hand-eye calibration method for robot

Country Status (1)

Country Link
CN (1) CN114833822B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116160454A (en) * 2023-03-28 2023-05-26 重庆智能机器人研究院 Robot tail end plane vision hand-eye calibration algorithm model

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110103217A (en) * 2019-05-09 2019-08-09 电子科技大学 Industrial robot hand and eye calibrating method
CN111791227A (en) * 2019-12-31 2020-10-20 深圳市豪恩声学股份有限公司 Robot hand-eye calibration method and device and robot
CN114147728A (en) * 2022-02-07 2022-03-08 杭州灵西机器人智能科技有限公司 Universal robot eye on-hand calibration method and system
CN114227700A (en) * 2022-02-23 2022-03-25 杭州灵西机器人智能科技有限公司 Hand-eye calibration method and system for robot
WO2022061673A1 (en) * 2020-09-24 2022-03-31 西门子(中国)有限公司 Calibration method and device for robot


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LAN Hao; ZHANG Xi; SHANG Jihui: "A hand-eye calibration technique based on a line-scan camera", Metrology & Measurement Technique, no. 05 *


Also Published As

Publication number Publication date
CN114833822B (en) 2023-09-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220812

Address after: No. 108, Aerospace West Road, National Civil Aerospace Industry Base, Xi'an City, Shaanxi Province 710100

Applicant after: Xi'an Aerospace Times precision electromechanical Co.,Ltd.

Address before: 710100 No. 106, Hangtian West Road, aerospace base, Xi'an, Shaanxi Province

Applicant before: XI'AN AEROSPACE PRECISION ELECTROMECHANICAL INSTITUTE

GR01 Patent grant