CN111872922B - Three-degree-of-freedom parallel robot hand-eye calibration method based on 3D vision sensor - Google Patents

Publication number: CN111872922B (application CN202010741078.4A; published earlier as CN111872922A)
Authority: CN (China)
Legal status: Active (granted)
Inventors: 潘云, 兰智菲, 张倩, 张英, 姜慧灵, 潘丰, 程麒
Applicant and current assignee: Guizhou Power Grid Co Ltd
Other languages: Chinese (zh)
Prior art keywords: coordinate system, vector, point cloud, point, axis

Classifications

    • B25J 9/003 — Programme-controlled manipulators having parallel kinematics
    • B25J 9/1664 — Programme controls characterised by programming, planning systems for manipulators; motion, path, trajectory planning
    • B25J 9/1697 — Programme controls using sensors other than normal servo-feedback; vision controlled systems

Abstract

The invention discloses a three-degree-of-freedom parallel robot hand-eye calibration method based on a 3D vision sensor, belonging to the field of robot vision. First, a mathematical model of the three-degree-of-freedom parallel robot vision system is established. The actuating mechanism of the robot is then controlled to perform two mutually perpendicular translational motions, yielding two translation vectors of the robot tool coordinate system and two translation vectors of the point cloud coordinate system. From these two pairs of translation vectors, an orthonormal basis for the tool coordinate system and an orthonormal basis for the point cloud coordinate system are constructed, from which the rotation part of the transformation between the two coordinate systems is solved. Finally, the actuating mechanism performs a contact measurement of a marker point on a calibration plate, and the translation part is solved by combining it with the previously obtained rotation part. The method suits a vision system composed of a three-degree-of-freedom parallel robot and a 3D sensor and achieves high accuracy.

Description

Three-degree-of-freedom parallel robot hand-eye calibration method based on 3D vision sensor
Technical Field
The invention belongs to the field of robot vision, and relates to a three-degree-of-freedom parallel robot hand-eye calibration method based on a 3D vision sensor.
Background
Hand-eye calibration is a key technical link in realizing vision-guided positioning for robots. For robots with different working modes and different types of vision sensors, developing suitable, feasible hand-eye calibration methods with high accuracy and good stability is of great significance to robot vision systems.
Compared with a common six-degree-of-freedom robot, a three-degree-of-freedom parallel robot has a larger working range and higher positioning accuracy, but its actuating mechanism possesses only three translational degrees of freedom in mutually orthogonal directions and no rotational degree of freedom. Most traditional hand-eye calibration methods require operating the robot actuating mechanism to acquire data of a calibration object under multiple attitudes, which cannot be realized on a three-degree-of-freedom parallel robot.
The vision sensor is the "eye" of the robot, allowing it to acquire the position of a measured object without contact. With the development of vision technology, 3D vision sensors have replaced traditional planar cameras in robot vision systems. A structured-light 3D vision sensor can rapidly acquire the three-dimensional coordinates of objects within the camera's field of view and generate real-time point cloud data. Analyzing and processing the point cloud data to extract the required calibration information is therefore an essential step in hand-eye calibration based on a 3D vision sensor.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a three-degree-of-freedom parallel robot hand-eye calibration method based on a 3D vision sensor. Because the controllable degrees of freedom of the end actuating mechanism of a three-degree-of-freedom parallel robot are limited, a method is designed that solves the hand-eye relationship uniquely without any attitude change of the actuating mechanism or the sensor. On this basis, to obtain calibration information from the point cloud data acquired by the 3D vision sensor, a method of locating points in the cloud assisted by the two-dimensional image is designed and correspondingly optimized.
The technical scheme of the invention is as follows:
a three-degree-of-freedom parallel robot hand-eye calibration method based on a 3D vision sensor comprises the following processes:
1) establishing the coordinate-transformation mathematical model of the three-degree-of-freedom parallel robot vision system:

    X_Cam = T_Tool^Cam · T_Base^Tool · X_Base    (1)

where X_Cam = [x_C  y_C  z_C  1]^T is the homogeneous coordinate of the object under test in the point cloud coordinate system of the 3D vision sensor, with x_C, y_C, z_C its coordinates along the X-, Y- and Z-axes of the point cloud coordinate system; X_Tool = [x_T  y_T  z_T  1]^T is the homogeneous coordinate of the object in the tool coordinate system of the robot, with x_T, y_T, z_T its coordinates along the X-, Y- and Z-axes of the tool coordinate system; X_Base = [x_B  y_B  z_B  1]^T is the homogeneous coordinate of the object in the base coordinate system of the robot, with x_B, y_B, z_B its coordinates along the X-, Y- and Z-axes of the robot base coordinate system. The matrix T_Base^Tool represents the transformation from the base coordinate system to the tool coordinate system and is constructed as:

    T_Base^Tool = | E   −x_TCP |
                  | 0        1 |    (2)

where E denotes the 3×3 identity matrix, 0 = [0 0 0], and x_TCP = [x_P  y_P  z_P]^T is the three-dimensional coordinate of the tool center point of the actuating mechanism in the robot base coordinate system; x_P, y_P, z_P are its coordinates along the X-, Y- and Z-axes of the base coordinate system and can be read directly from the robot motion-control parameters. The matrix T_Tool^Cam is the transformation from the tool coordinate system to the point cloud coordinate system, i.e. the hand-eye relation matrix to be solved, and is constructed as:

    T_Tool^Cam = | R_Tool^Cam   t_Tool^Cam |
                 | 0                     1 |    (3)

where the matrix R_Tool^Cam represents the rotation part of the hand-eye relation and t_Tool^Cam represents the translation part of the hand-eye relation;
2) finding the rotation part R_Tool^Cam of the hand-eye relation:

    R_Tool^Cam = M_C · M_T^(−1)    (4)

where the matrix M_T is an orthonormal basis constructed from the tool coordinate system and the matrix M_C is an orthonormal basis constructed from the point cloud coordinate system. To construct M_T and M_C, the actuating mechanism of the three-degree-of-freedom parallel robot is controlled to perform two mutually perpendicular translational motions while point cloud data of the calibration plate are acquired. The first translation goes from position a to position b and yields the translation vector T_ab^Tool = [x_Tab  y_Tab  z_Tab]^T of the tool coordinate system and the translation vector T_ab^Cam = [x_Cab  y_Cab  z_Cab]^T of the point cloud coordinate system, where x_Tab, y_Tab, z_Tab are the components of T_ab^Tool along the X-, Y- and Z-axes of the tool coordinate system and x_Cab, y_Cab, z_Cab are the components of T_ab^Cam along the X-, Y- and Z-axes of the point cloud coordinate system. The second translation goes from position b to position c and yields the translation vector T_bc^Tool = [x_Tbc  y_Tbc  z_Tbc]^T of the tool coordinate system and the translation vector T_bc^Cam = [x_Cbc  y_Cbc  z_Cbc]^T of the point cloud coordinate system, whose components are defined analogously. The line ab is perpendicular to the line bc. Unitizing the four translation vectors gives the four unit vectors:

    p_T1 = T_ab^Tool / ‖T_ab^Tool‖ = [X_T1  Y_T1  Z_T1]^T
    p_T2 = T_bc^Tool / ‖T_bc^Tool‖ = [X_T2  Y_T2  Z_T2]^T
    p_C1 = T_ab^Cam / ‖T_ab^Cam‖ = [X_C1  Y_C1  Z_C1]^T
    p_C2 = T_bc^Cam / ‖T_bc^Cam‖ = [X_C2  Y_C2  Z_C2]^T    (5)

where X_T1, Y_T1, Z_T1 and X_T2, Y_T2, Z_T2 are the components of p_T1 and p_T2 along the X-, Y- and Z-axes of the tool coordinate system, and X_C1, Y_C1, Z_C1 and X_C2, Y_C2, Z_C2 are the components of p_C1 and p_C2 along the X-, Y- and Z-axes of the point cloud coordinate system. The orthonormal bases M_T and M_C are then constructed as:

    M_T = [p_T1  p_T2  p_T1 × p_T2],   M_C = [p_C1  p_C2  p_C1 × p_C2]    (6)
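The construction of the bases in formulas (5)-(6) and the solution of formula (4) can be sketched in a few lines of NumPy; the translation-vector values below are hypothetical examples, not measured data:

```python
import numpy as np

def orthonormal_basis(v1, v2):
    """Formulas (5)-(6): unitize two (ideally perpendicular) translation
    vectors and complete the basis with their cross product."""
    p1 = np.array(v1, dtype=float)
    p2 = np.array(v2, dtype=float)
    p1 /= np.linalg.norm(p1)
    p2 /= np.linalg.norm(p2)
    return np.column_stack((p1, p2, np.cross(p1, p2)))

def rotation_tool_to_cam(t_tool_ab, t_tool_bc, t_cam_ab, t_cam_bc):
    """Formula (4): R = M_C * M_T^-1. M_T is orthogonal, so its
    inverse equals its transpose."""
    M_T = orthonormal_basis(t_tool_ab, t_tool_bc)
    M_C = orthonormal_basis(t_cam_ab, t_cam_bc)
    return M_C @ M_T.T

# Hypothetical data: the point cloud frame is rotated 90 degrees about Z
# relative to the tool frame, so a +X tool move is observed as a +Y move.
R = rotation_tool_to_cam([50, 0, 0], [0, 50, 0],
                         [0, 50, 0], [-50, 0, 0])
```

With these inputs R comes out as the expected 90-degree rotation about the Z-axis, and R·R^T = E confirms it is a proper orthogonal matrix.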
3) finding the translation part t_Tool^Cam of the hand-eye relation:

    t_Tool^Cam = x_Cam0 − R_Tool^Cam · (x_Base0 − x_TCP0)    (7)

where R_Tool^Cam is the value obtained from formula (4). One point is selected as the marker point; x_Base0 = [x_B0  y_B0  z_B0]^T is the three-dimensional coordinate of the marker point in the base coordinate system, measured by contact with the tool center point of the robot, with x_B0, y_B0, z_B0 its coordinates along the X-, Y- and Z-axes of the robot base coordinate system; x_Cam0 = [x_C0  y_C0  z_C0]^T is the three-dimensional coordinate of the marker point in the point cloud coordinate system at the shooting position, with x_C0, y_C0, z_C0 its coordinates along the X-, Y- and Z-axes of the point cloud coordinate system; x_TCP0 = [x_P0  y_P0  z_P0]^T is the three-dimensional coordinate of the tool center point at the shooting position in the robot base coordinate system, with x_P0, y_P0, z_P0 its coordinates along the X-, Y- and Z-axes of the robot base coordinate system;
4) obtaining x_Cam0 in formula (7) by accurately locating the marker point of the calibration plate in the point cloud coordinate system:

Step 1: acquire a picture of the calibration plate and the corresponding point cloud with the 3D vision sensor. The picture has Rows rows and Cols columns; the point cloud is an ordered set of three-dimensional coordinates, whose number is:

    N = Rows · Cols    (8)

Each three-dimensional coordinate x_i (i = 0, 1, …, N−1) in the point cloud corresponds to its serial number IN_i in the three-dimensional coordinate set of the point cloud and to an integer pixel coordinate P_i = (u_i, v_i) in the pixel coordinate system of the picture:

    IN_i = u_i · Cols + v_i    (9)

Step 2: locate the marker point in the picture of the calibration plate to obtain its sub-pixel coordinate P_c = (u_c, v_c) in the pixel coordinate system, where u_c ∈ (0, Rows−1) is the row coordinate and v_c ∈ (0, Cols−1) is the column coordinate of the marker point in the pixel coordinate system. Compute the four integer pixel coordinates P_c0, P_c1, P_c2, P_c3 surrounding the sub-pixel coordinate P_c:

    P_c0 = (u_0, v_0),  P_c1 = (u_0, v_0 + 1),  P_c2 = (u_0 + 1, v_0),  P_c3 = (u_0 + 1, v_0 + 1)    (10)

where c0, c1, c2, c3 ∈ (0, 1, … N−1); u_0 = ⌊u_c⌋ and v_0 = ⌊v_c⌋ are the integer parts of u_c and v_c, respectively.

Step 3: compute the serial numbers IN_c0, IN_c1, IN_c2, IN_c3 of the three-dimensional coordinates corresponding to the four integer pixel coordinates P_c0, P_c1, P_c2, P_c3 in the three-dimensional coordinate set of the point cloud:

    IN_c0 = u_0·Cols + v_0,  IN_c1 = u_0·Cols + v_0 + 1,
    IN_c2 = (u_0 + 1)·Cols + v_0,  IN_c3 = (u_0 + 1)·Cols + v_0 + 1    (11)

and look up the three-dimensional coordinates x_c0, x_c1, x_c2, x_c3 corresponding to the serial numbers IN_c0, IN_c1, IN_c2, IN_c3 in the three-dimensional coordinate set of the point cloud.

Step 4: with P_c as the center, divide the square enclosed by the four integer pixel coordinates into four small rectangular regions and compute the areas S_0, S_1, S_2, S_3 of the four rectangular regions:

    S_0 = (u_c − u_0)(v_c − v_0),      S_1 = (u_c − u_0)(v_0 + 1 − v_c),
    S_2 = (u_0 + 1 − u_c)(v_c − v_0),  S_3 = (u_0 + 1 − u_c)(v_0 + 1 − v_c)

The interpolation weights of the four integer pixel points are obtained from the reciprocals of the areas:

    W(k) = (1/S_k) / Σ_{j=0}^{3} (1/S_j),  k = 0, 1, 2, 3    (12)

Step 5: interpolate the four three-dimensional coordinates x_c0, x_c1, x_c2, x_c3 with the weights of formula (12) to obtain the three-dimensional coordinate x_c:

    x_c = W(0)·x_c0 + W(1)·x_c1 + W(2)·x_c2 + W(3)·x_c3    (13)

The three-dimensional coordinate x_c is the accurate localization result of the marker point in the point cloud coordinate system, i.e. x_Cam0 = x_c.
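Steps 2-5 above can be sketched as follows. This is an illustrative sketch on a synthetic ordered cloud; the row-major index mapping of formula (9) and the pairing of each area S_k with its adjacent corner are assumptions, since the original figures are not reproduced here:

```python
import numpy as np

def locate_marker(cloud, cols, uc, vc):
    """Inverse-area interpolation of an ordered point cloud at the
    sub-pixel marker position (uc, vc), following formulas (9)-(13).
    `cloud` is the (Rows*Cols, 3) coordinate set, assumed row-major so
    that pixel (u, v) maps to serial number u*cols + v."""
    u0, v0 = int(np.floor(uc)), int(np.floor(vc))
    corners = [(u0, v0), (u0, v0 + 1), (u0 + 1, v0), (u0 + 1, v0 + 1)]
    du, dv = uc - u0, vc - v0
    # Area of the sub-rectangle adjacent to each corner (cf. Fig. 5).
    areas = np.array([du * dv, du * (1 - dv),
                      (1 - du) * dv, (1 - du) * (1 - dv)])
    w = 1.0 / np.maximum(areas, 1e-12)   # formula (12): reciprocal weights
    w /= w.sum()
    pts = np.array([cloud[u * cols + v] for u, v in corners])
    return w @ pts                        # formula (13)

# Synthetic ordered cloud: pixel (u, v) maps to (0.3*u, 0.3*v, 5.0) mm,
# mimicking the 0.2-0.4 mm point spacing mentioned in the description.
rows, cols = 4, 5
cloud = np.array([[0.3 * u, 0.3 * v, 5.0]
                  for u in range(rows) for v in range(cols)])
xc = locate_marker(cloud, cols, 1.5, 2.5)
```

At the cell center (1.5, 2.5) the four areas are equal, so the result is the mean of the four corner points, (0.45, 0.75, 5.0).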
5) obtaining the rotation part R_Tool^Cam of the hand-eye matrix from formula (4) and the three-dimensional coordinate x_Cam0 of the marker point in the point cloud coordinate system from formula (13); substituting x_Cam0 into formula (7) to obtain the translation part t_Tool^Cam of the hand-eye matrix; and substituting R_Tool^Cam and t_Tool^Cam into formula (3) to obtain the complete hand-eye matrix T_Tool^Cam, completing the hand-eye calibration of the three-degree-of-freedom parallel robot vision system.
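Once R_Tool^Cam and t_Tool^Cam are known, formulas (1)-(3) chain two homogeneous transforms to map any base-frame point into the point cloud frame. A minimal sketch with hypothetical values (identity hand-eye rotation, a point located at the TCP itself):

```python
import numpy as np

def homogeneous(R, t):
    """Pack a rotation R and translation t into a 4x4 matrix, as in formula (3)."""
    H = np.eye(4)
    H[:3, :3] = np.asarray(R, dtype=float)
    H[:3, 3] = np.asarray(t, dtype=float)
    return H

def base_to_cam(x_base, x_tcp, R_tc, t_tc):
    """Formula (1): X_Cam = T_Tool^Cam . T_Base^Tool . X_Base, where
    T_Base^Tool has identity rotation and translation -x_TCP (formula (2))."""
    T_base_tool = homogeneous(np.eye(3), -np.asarray(x_tcp, dtype=float))
    T_tool_cam = homogeneous(R_tc, t_tc)
    X = np.append(np.asarray(x_base, dtype=float), 1.0)  # homogeneous coordinate
    return (T_tool_cam @ T_base_tool @ X)[:3]

# Sanity check: a point coinciding with the TCP maps to the hand-eye
# translation offset in the point cloud frame.
x_cam = base_to_cam([100, 50, 25], [100, 50, 25], np.eye(3), [1, 2, 3])
```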
The invention has the following beneficial effects. The hand-eye rotation part is solved with only two translational motions, matching the working mode of the three-degree-of-freedom parallel robot. Solving the translation part requires only operating the robot to contact-measure the three-dimensional coordinate of at least one marker point in the base coordinate system, which simplifies the calibration process. The marker-point localization method designed for the pictures and point cloud data collected by the 3D sensor has practical reference value for similar problems. The calibration plate used is simple to manufacture and low in cost. The accuracy of the calibration result reaches ±0.2 mm, meeting the process requirements of robot welding, grasping and the like.
Drawings
Fig. 1 is a flow chart of three-degree-of-freedom parallel robot hand-eye calibration.
Fig. 2 is a schematic diagram of a three-degree-of-freedom parallel robot hand-eye vision system.
Fig. 3 is a diagram of the positioning effect of the center point of the calibration plate in the picture.
FIG. 4 is a diagram of the positioning effect of the center point of the calibration plate in the point cloud.
FIG. 5 is a schematic diagram of sub-pixel interpolation region division.
FIG. 6 compares the absolute errors at 16 marker points when the calibration result is obtained by interpolation and by rounding, respectively.
Detailed Description
The following further describes the embodiments of the present invention with reference to the drawings.
Referring to Fig. 1, the three-degree-of-freedom parallel robot hand-eye calibration method based on a 3D vision sensor comprises two operation steps:
Step 1: operate the robot actuating mechanism to perform two mutually perpendicular translational motions.
In practice, two strictly perpendicular translational motions are obtained simply by moving the actuating mechanism once along the x-axis and once along the y-axis of the base coordinate system of the three-degree-of-freedom parallel robot. The three-dimensional coordinates x_TCP0, x_TCP1, x_TCP2 of the tool center point (TCP) of the actuating mechanism at the three positions before and after the two motions are recorded, together with the calibration plate pictures and point cloud data acquired by the 3D vision sensor at the three positions, from which the three-dimensional coordinates x_Cam0, x_Cam1, x_Cam2 of the marker point at the three positions are extracted. The translation vectors of the tool coordinate system follow from the change of the TCP coordinates, and the translation vectors of the point cloud coordinate system from the change of the marker-point coordinates:

    T_T1 = x_TCP1 − x_TCP0,  T_T2 = x_TCP2 − x_TCP1    (14)
    T_C1 = x_Cam1 − x_Cam0,  T_C2 = x_Cam2 − x_Cam1    (15)

where T_T1 and T_T2 correspond to T_ab^Tool and T_bc^Tool of part 2) of the technical scheme and represent the translation vectors of the two motions of the tool coordinate system, and T_C1 and T_C2 correspond to T_ab^Cam and T_bc^Cam and represent the translation vectors of the two motions of the point cloud coordinate system. The rotation part R_Tool^Cam of the hand-eye relation is then obtained by the method of part 2).
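Step 1 amounts to two vector subtractions per coordinate system; a sketch with hypothetical recorded coordinates (not measured data) would be:

```python
import numpy as np

# Hypothetical recordings at the three poses of Step 1: TCP coordinates
# from the motion controller, marker coordinates from the point cloud.
x_tcp = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [50.0, 50.0, 0.0]])
x_cam = np.array([[12.0, 8.0, 40.0], [12.0, 58.0, 40.0], [-38.0, 58.0, 40.0]])

# Formulas (14)-(15): translation vectors from consecutive coordinate changes.
T_T1, T_T2 = x_tcp[1] - x_tcp[0], x_tcp[2] - x_tcp[1]
T_C1, T_C2 = x_cam[1] - x_cam[0], x_cam[2] - x_cam[1]

# Moving along the base x- and then y-axis makes ab exactly perpendicular to bc.
perpendicular = abs(np.dot(T_T1, T_T2)) < 1e-9
```

The two vector pairs (T_T1, T_T2) and (T_C1, T_C2) are exactly the inputs to the basis construction of part 2).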
Step 2: operate the robot to make a TCP contact measurement of the marker point.
The actuating mechanism of the robot is moved slowly toward the calibration plate until the tool center point coincides with the marker point on the plate; the coordinate of the tool center point in the base coordinate system at that moment equals the three-dimensional coordinate x_Base0 of the marker point in the robot base coordinate system. With the rotation part R_Tool^Cam already determined, the translation part t_Tool^Cam of the hand-eye relation is computed from formula (7).
Part 1) of the technical scheme is supplemented with reference to Fig. 2:
The three-degree-of-freedom parallel robot establishes its base coordinate system O_Base with the zero point of the tool center point as origin and the three orthogonal motion directions of the actuating mechanism as coordinate-axis directions. Because the actuating mechanism of the three-degree-of-freedom parallel robot has no rotational degree of freedom, the relative attitude between the tool coordinate system O_Tool and the base coordinate system O_Base is fixed; the coordinate axes of the tool coordinate system and the base coordinate system therefore point in the same directions, and the rotation part of the transformation between the two coordinate systems can be regarded as the identity matrix E. Since the origin of the tool coordinate system is always the tool center point, the translation part of the transformation from the base coordinate system O_Base to the tool coordinate system O_Tool is determined by the three-dimensional coordinate of the tool center point at the current position, i.e. it equals −x_TCP, consistent with formula (2).
Part 4) of the technical scheme is supplemented with reference to Figs. 3 and 4:
The method uses an asymmetric circular planar calibration plate whose circle centers serve as marker points. The circles on the plate are white and the remainder is black, so that the circle centers can be extracted accurately from the picture while avoiding errors in the circle regions of the point cloud caused by absorption of the projected light. The localization of a calibration-plate circle center in the picture is shown in Fig. 3, and its localization in the point cloud in Fig. 4.
Part 2) of the technical scheme obtains the translation vectors T_ab^Cam and T_bc^Cam of the point cloud coordinate system from the coordinate changes of the marker point, and part 3) computes the hand-eye translation part t_Tool^Cam directly from the three-dimensional coordinate of the marker point in the point cloud coordinate system. The spacing between adjacent points of the point cloud generated by the 3D sensor used here is 0.2-0.4 mm, which means that if the sub-pixel coordinate of the marker point were simply rounded before looking up its coordinate in the point cloud, the marker-point localization could be off by up to ±0.4 mm. Accurately locating the three-dimensional coordinate of the marker point in the point cloud coordinate system by interpolation therefore improves the calibration accuracy.
Example:
The method was applied in calibration experiments. The repeat positioning accuracy of the three-degree-of-freedom parallel robot used in the experiments is ±0.01 mm; the image resolution of the 3D sensor is 728 × 544, a single acquisition produces 396032 point cloud points, and the standard working distance is 44 mm. Sixteen test points were selected; the three-dimensional coordinate x′_Base of each test point in the base coordinate system was measured by contact, and the three-dimensional coordinate x″_Base of the test point in the base coordinate system was then computed by the hand-eye vision system via formula (1). The absolute error e of a test point is computed as:

    e = ‖x″_Base − x′_Base‖    (16)

With the calibration result of the method described here, the average absolute error of the 16 test points is 0.119 mm and the maximum absolute error is 0.174 mm. For comparison, rounding the sub-pixel coordinate to the nearest integer was also tested, giving an average absolute error of 0.248 mm and a maximum absolute error of 0.347 mm. The experimental results are shown in Fig. 6.

Claims (1)

1. A three-degree-of-freedom parallel robot hand-eye calibration method based on a 3D vision sensor, characterized by comprising the following processes:
1) establishing the coordinate-transformation mathematical model of the three-degree-of-freedom parallel robot vision system:

    X_Cam = T_Tool^Cam · T_Base^Tool · X_Base    (1)

where X_Cam = [x_C  y_C  z_C  1]^T is the homogeneous coordinate of the object under test in the point cloud coordinate system of the 3D vision sensor, with x_C, y_C, z_C its coordinates along the X-, Y- and Z-axes of the point cloud coordinate system; X_Tool = [x_T  y_T  z_T  1]^T is the homogeneous coordinate of the object in the tool coordinate system of the robot, with x_T, y_T, z_T its coordinates along the X-, Y- and Z-axes of the tool coordinate system; X_Base = [x_B  y_B  z_B  1]^T is the homogeneous coordinate of the object in the base coordinate system of the robot, with x_B, y_B, z_B its coordinates along the X-, Y- and Z-axes of the robot base coordinate system. The matrix T_Base^Tool represents the transformation from the base coordinate system to the tool coordinate system and is constructed as:

    T_Base^Tool = | E   −x_TCP |
                  | 0        1 |    (2)

where E denotes the 3×3 identity matrix, 0 = [0 0 0], and x_TCP = [x_P  y_P  z_P]^T is the three-dimensional coordinate of the tool center point of the actuating mechanism in the robot base coordinate system; x_P, y_P, z_P are its coordinates along the X-, Y- and Z-axes of the base coordinate system and can be read directly from the robot motion-control parameters. The matrix T_Tool^Cam is the transformation from the tool coordinate system to the point cloud coordinate system, i.e. the hand-eye relation matrix to be solved, and is constructed as:

    T_Tool^Cam = | R_Tool^Cam   t_Tool^Cam |
                 | 0                     1 |    (3)

where the matrix R_Tool^Cam represents the rotation part of the hand-eye relation and t_Tool^Cam represents the translation part of the hand-eye relation;
2) finding the rotation part R_Tool^Cam of the hand-eye relation:

    R_Tool^Cam = M_C · M_T^(−1)    (4)

where the matrix M_T is an orthonormal basis constructed from the tool coordinate system and the matrix M_C is an orthonormal basis constructed from the point cloud coordinate system. To construct M_T and M_C, the actuating mechanism of the three-degree-of-freedom parallel robot is controlled to perform two mutually perpendicular translational motions while point cloud data of the calibration plate are acquired. The first translation goes from position a to position b and yields the translation vector T_ab^Tool = [x_Tab  y_Tab  z_Tab]^T of the tool coordinate system and the translation vector T_ab^Cam = [x_Cab  y_Cab  z_Cab]^T of the point cloud coordinate system, where x_Tab, y_Tab, z_Tab are the components of T_ab^Tool along the X-, Y- and Z-axes of the tool coordinate system and x_Cab, y_Cab, z_Cab are the components of T_ab^Cam along the X-, Y- and Z-axes of the point cloud coordinate system. The second translation goes from position b to position c and yields the translation vector T_bc^Tool = [x_Tbc  y_Tbc  z_Tbc]^T of the tool coordinate system and the translation vector T_bc^Cam = [x_Cbc  y_Cbc  z_Cbc]^T of the point cloud coordinate system, whose components are defined analogously. The line ab is perpendicular to the line bc. Unitizing the four translation vectors gives the four unit vectors:

    p_T1 = T_ab^Tool / ‖T_ab^Tool‖ = [X_T1  Y_T1  Z_T1]^T
    p_T2 = T_bc^Tool / ‖T_bc^Tool‖ = [X_T2  Y_T2  Z_T2]^T
    p_C1 = T_ab^Cam / ‖T_ab^Cam‖ = [X_C1  Y_C1  Z_C1]^T
    p_C2 = T_bc^Cam / ‖T_bc^Cam‖ = [X_C2  Y_C2  Z_C2]^T    (5)

where X_T1, Y_T1, Z_T1 and X_T2, Y_T2, Z_T2 are the components of p_T1 and p_T2 along the X-, Y- and Z-axes of the tool coordinate system, and X_C1, Y_C1, Z_C1 and X_C2, Y_C2, Z_C2 are the components of p_C1 and p_C2 along the X-, Y- and Z-axes of the point cloud coordinate system. The orthonormal bases M_T and M_C are constructed as:

    M_T = [p_T1  p_T2  p_T1 × p_T2],   M_C = [p_C1  p_C2  p_C1 × p_C2]    (6)
3) finding the translation part t_Tool^Cam of the hand-eye relation:

    t_Tool^Cam = x_Cam0 − R_Tool^Cam · (x_Base0 − x_TCP0)    (7)

where R_Tool^Cam is the value obtained from formula (4). One point is selected as the marker point; x_Base0 = [x_B0  y_B0  z_B0]^T is the three-dimensional coordinate of the marker point in the base coordinate system, measured by contact with the tool center point of the robot, with x_B0, y_B0, z_B0 its coordinates along the X-, Y- and Z-axes of the robot base coordinate system; x_Cam0 = [x_C0  y_C0  z_C0]^T is the three-dimensional coordinate of the marker point in the point cloud coordinate system at the shooting position, with x_C0, y_C0, z_C0 its coordinates along the X-, Y- and Z-axes of the point cloud coordinate system; x_TCP0 = [x_P0  y_P0  z_P0]^T is the three-dimensional coordinate of the tool center point at the shooting position in the robot base coordinate system, with x_P0, y_P0, z_P0 its coordinates along the X-, Y- and Z-axes of the robot base coordinate system;
4) Obtaining x_Cam0 in formula (7) by accurately locating the marker point on the calibration plate in the point cloud coordinate system, which comprises:
Step 1: acquiring a picture of the calibration plate and the corresponding point cloud with the 3D vision sensor, wherein the picture has Rows rows and Cols columns; the point cloud is an ordered set of three-dimensional coordinates whose number is:
N = Rows*Cols (8)
Each three-dimensional coordinate x_i in the point cloud corresponds to a serial number IN_i in the three-dimensional coordinate set of the point cloud and to an integer pixel coordinate P_i in the pixel coordinate system of the picture:
[formula image FDA0002606766610000031]
IN_i ∈ (0, 1, …, N-1),
[formula image FDA0002606766610000032]
i = 0, 1, …, N-1;
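The correspondence between serial numbers and pixel coordinates is given only as formula images above. A short sketch of one plausible reading, assuming the ordered point cloud is stored row-major so that a pixel at row u, column v maps to serial number u*Cols + v; the resolution values are hypothetical.

```python
Rows, Cols = 480, 640   # hypothetical sensor resolution
N = Rows * Cols         # formula (8)

def index_of(u, v, cols=Cols):
    """Serial number of pixel (row u, column v) in a row-major ordered cloud."""
    return u * cols + v

def pixel_of(idx, cols=Cols):
    """Inverse mapping: serial number back to (row, column)."""
    return divmod(idx, cols)

# Round-trip check: the two mappings are inverses of each other.
u, v = pixel_of(100000)
assert index_of(u, v) == 100000
```

A column-major sensor would swap the roles of Rows and Cols; the rest of the method is unchanged as long as one consistent ordering is used.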
Step 2: locating the marker point in the picture of the calibration plate to obtain the sub-pixel coordinate P_c = (u_c, v_c) of the marker point in the pixel coordinate system, wherein u_c is the row coordinate of the marker point, u_c ∈ (0, Rows-1), and v_c is the column coordinate of the marker point, v_c ∈ (0, Cols-1); and calculating the four integer pixel coordinates P_c0, P_c1, P_c2 and P_c3 surrounding the sub-pixel coordinate P_c as:
[formula (9), image FDA0002606766610000033]
wherein c0, c1, c2, c3 ∈ (0, 1, …, N-1); u_0 is the integer part of u_c, i.e.
[formula image FDA0002606766610000034]
v_0 is the integer part of v_c, i.e.
[formula image FDA0002606766610000035]
and
[formula image FDA0002606766610000036]
denotes the set of integers;
Step 3: calculating the serial numbers IN_c0, IN_c1, IN_c2 and IN_c3 of the three-dimensional coordinates in the three-dimensional coordinate set of the point cloud that correspond to the four integer pixel coordinates P_c0, P_c1, P_c2 and P_c3:
[formula (10), image FDA0002606766610000037]
and finding, in the three-dimensional coordinate set of the point cloud, the three-dimensional coordinates corresponding to the serial numbers IN_c0, IN_c1, IN_c2 and IN_c3:
[formula images FDA0002606766610000038, FDA0002606766610000039]
Step 4: with P_c as the center, dividing the square enclosed by the four integer pixel coordinates into four small rectangular regions, and calculating the areas S_0, S_1, S_2 and S_3 of the four rectangular regions as:
[formula (11), image FDA00026067666100000310]
and obtaining the interpolation weights of the four integer pixel points from the reciprocals of the areas as:
[formula (12), image FDA00026067666100000311]
Step 5: according to the interpolation weights obtained from formula (12), performing interpolation on the four three-dimensional coordinates x_c0, x_c1, x_c2 and x_c3 to obtain the three-dimensional coordinate
[formula image FDA0002606766610000041]
which is calculated as:
x_c = W(0)*x_c0 + W(1)*x_c1 + W(2)*x_c2 + W(3)*x_c3 (13)
The three-dimensional coordinate x_c is the precise localization result of the marker point in the point cloud coordinate system, i.e. x_Cam0 = x_c;
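Steps 2 to 5 together amount to interpolating an ordered point cloud at a sub-pixel location. A self-contained sketch follows, assuming row-major cloud ordering and the standard corner labeling; note that normalizing the reciprocals of the four sub-rectangle areas (formula (12)) is algebraically equivalent to weighting each corner by the area of the diagonally opposite rectangle, which is what the code uses. The tiny synthetic cloud is hypothetical test data.

```python
import numpy as np

Rows, Cols = 4, 5
# Synthetic ordered point cloud: the point at pixel (row u, col v) lies at
# 3D position (v, u, 0), so interpolation results are easy to verify.
cloud = np.array([[v, u, 0.0] for u in range(Rows) for v in range(Cols)])

def locate_marker_3d(cloud, cols, uc, vc):
    """Interpolate the 3D coordinate of a sub-pixel marker location from the
    four surrounding entries of an ordered point cloud (steps 2-5)."""
    u0, v0 = int(uc), int(vc)                       # integer parts of (uc, vc)
    corners = [(u0, v0), (u0, v0 + 1), (u0 + 1, v0), (u0 + 1, v0 + 1)]
    idx = [u * cols + v for u, v in corners]        # step 3, row-major assumed
    du, dv = uc - u0, vc - v0
    # Step 4: each corner gets the area of the opposite sub-rectangle, which
    # is what the normalized reciprocal-area weights of formula (12) reduce to.
    w = np.array([(1 - du) * (1 - dv), (1 - du) * dv,
                  du * (1 - dv), du * dv])
    # Step 5: weighted sum of the four 3D points (formula (13)).
    return (w[:, None] * cloud[idx]).sum(axis=0)

xc = locate_marker_3d(cloud, Cols, uc=1.25, vc=2.5)
print(xc)
```

On this synthetic cloud the marker at sub-pixel (1.25, 2.5) interpolates to the 3D point (2.5, 1.25, 0), matching the cloud's construction.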
5) Obtaining the rotation part
[formula image FDA0002606766610000042]
of the hand-eye matrix from formula (4); obtaining the three-dimensional coordinate x_Cam0 of the marker point in the point cloud coordinate system from formula (13); substituting x_Cam0 into formula (7) to obtain the translation part
[formula image FDA0002606766610000043]
of the hand-eye matrix; and substituting
[formula image FDA0002606766610000044]
and
[formula image FDA0002606766610000045]
into formula (3) to obtain the complete hand-eye matrix
[formula image FDA0002606766610000046]
thereby completing the hand-eye calibration of the three-degree-of-freedom parallel robot vision system.
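Formula (3) is likewise only an image; the assembly step can be sketched under the standard assumption that it stacks the rotation and translation parts into a 4x4 homogeneous transform mapping point-cloud coordinates into robot base coordinates. The R, t and test point below are hypothetical placeholders.

```python
import numpy as np

R = np.eye(3)                        # rotation part from formula (4), placeholder
t = np.array([10.0, -5.0, 300.0])    # translation part from formula (7), placeholder

# Assumed reading of formula (3): homogeneous hand-eye matrix [R | t; 0 0 0 1].
H = np.eye(4)
H[:3, :3] = R
H[:3, 3] = t

# Map a point from the point cloud frame into the robot base frame.
p_cam = np.array([1.0, 2.0, 3.0, 1.0])  # homogeneous point-cloud coordinate
p_base = H @ p_cam
print(p_base[:3])
```

Once H is known, every point the 3D sensor measures can be handed to the parallel robot directly in its base frame, which is the purpose of the calibration.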
CN202010741078.4A 2020-07-29 2020-07-29 Three-degree-of-freedom parallel robot hand-eye calibration method based on 3D vision sensor Active CN111872922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010741078.4A CN111872922B (en) 2020-07-29 2020-07-29 Three-degree-of-freedom parallel robot hand-eye calibration method based on 3D vision sensor


Publications (2)

Publication Number Publication Date
CN111872922A CN111872922A (en) 2020-11-03
CN111872922B true CN111872922B (en) 2021-09-03

Family

ID=73201890


Country Status (1)

Country Link
CN (1) CN111872922B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112720469B (en) * 2020-12-18 2022-09-09 北京工业大学 Zero point calibration method for three-axis translational motion system by microscopic stereo vision
CN112819899B (en) * 2021-02-08 2022-11-01 燕山大学 Camera automatic calibration system based on series-parallel mechanism and camera automatic calibration method thereof
CN113500584B (en) * 2021-07-15 2022-06-28 西北工业大学 Tail end error correction system and method of three-degree-of-freedom parallel robot
CN113465548B (en) * 2021-08-23 2022-06-07 广东维正科技有限公司 Calibration and precision evaluation method of fringe projection three-dimensional measurement system
CN114152190B (en) * 2021-11-15 2023-10-24 苏州铸正机器人有限公司 Industrial camera precision and working space test platform

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19749557A1 (en) * 1997-11-10 1999-05-12 Jung Guenther Prof Dr Appts for parallel chemical reactions
CN102278963A (en) * 2011-06-30 2011-12-14 燕山大学 Self-calibration method of parallel robot
CN105945909A (en) * 2016-05-13 2016-09-21 大族激光科技产业集团股份有限公司 Error correction method and system for three-degree-of-freedom parallel robot
CN106041937A (en) * 2016-08-16 2016-10-26 河南埃尔森智能科技有限公司 Control method of manipulator grabbing control system based on binocular stereoscopic vision
CN108436909A (en) * 2018-03-13 2018-08-24 南京理工大学 A kind of hand and eye calibrating method of camera and robot based on ROS
CN108818535A (en) * 2018-07-05 2018-11-16 杭州汉振科技有限公司 Robot 3D vision hand and eye calibrating method
CN109702738A (en) * 2018-11-06 2019-05-03 深圳大学 A kind of mechanical arm hand and eye calibrating method and device based on Three-dimension object recognition
CN208867184U (en) * 2018-10-21 2019-05-17 西北农林科技大学 A kind of fruit sorting machine people's control system
CN110000790A (en) * 2019-04-19 2019-07-12 深圳科瑞技术股份有限公司 A kind of scaling method of SCARA robot eye-to-hand hand-eye system
CN110276806A (en) * 2019-05-27 2019-09-24 江苏大学 Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system
CN110405731A (en) * 2019-07-19 2019-11-05 南京理工大学 A kind of quick double mechanical arms basis coordinates system scaling method
CN110509280A (en) * 2019-09-11 2019-11-29 哈尔滨工程大学 A kind of multi-freedom parallel connection crawl robot control system and its control method
KR102093165B1 (en) * 2016-09-28 2020-03-25 코그넥스코오포레이션 Simultaneous kinematic and hand-eye calibration


Also Published As

Publication number Publication date
CN111872922A (en) 2020-11-03

Similar Documents

Publication Publication Date Title
CN111872922B (en) Three-degree-of-freedom parallel robot hand-eye calibration method based on 3D vision sensor
CN109859275B (en) Monocular vision hand-eye calibration method of rehabilitation mechanical arm based on S-R-S structure
CN110276806B (en) Online hand-eye calibration and grabbing pose calculation method for four-degree-of-freedom parallel robot stereoscopic vision hand-eye system
CN109483516A (en) A kind of mechanical arm hand and eye calibrating method based on space length and epipolar-line constraint
CN108648237B (en) Space positioning method based on vision
CN108582076A (en) A kind of Robotic Hand-Eye Calibration method and device based on standard ball
CN111775146A (en) Visual alignment method under industrial mechanical arm multi-station operation
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
CN108098762A (en) A kind of robotic positioning device and method based on novel visual guiding
CN108253939B (en) Variable visual axis monocular stereo vision measuring method
CN111536902A (en) Galvanometer scanning system calibration method based on double checkerboards
CN110666798A (en) Robot vision calibration method based on perspective transformation model
CN109087355B (en) Monocular camera pose measuring device and method based on iterative updating
US10928191B2 (en) Marker, and posture estimation method and position and posture estimation method using marker
CN110238820A (en) Hand and eye calibrating method based on characteristic point
CN106940894A (en) A kind of hand-eye system self-calibrating method based on active vision
CN110202560A (en) A kind of hand and eye calibrating method based on single feature point
CN114310901B (en) Coordinate system calibration method, device, system and medium for robot
CN108180834A (en) A kind of industrial robot is the same as three-dimensional imaging instrument position orientation relation scene real-time calibration method
TWI708667B (en) Method and device and system for calibrating position and orientation of a motion manipulator
CN115139283A (en) Robot hand-eye calibration method based on random mark dot matrix
CN115284292A (en) Mechanical arm hand-eye calibration method and device based on laser camera
CN109900251A (en) A kind of robotic positioning device and method of view-based access control model technology
CN112958960B (en) Robot hand-eye calibration device based on optical target
Zexiao et al. A novel approach for the field calibration of line structured-light sensors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Pan Yun

Inventor after: LAN Zhifei

Inventor after: Zhang Qian

Inventor after: Zhang Ying

Inventor after: Jiang Huiling

Inventor after: Pan Feng

Inventor after: Cheng Qi

Inventor before: Pan Feng

Inventor before: Cheng Qi

TA01 Transfer of patent application right

Effective date of registration: 20210817

Address after: No.17 Binhe Road, Nanming District, Guiyang City, Guizhou Province

Applicant after: GUIZHOU POWER GRID Corp.

Address before: No. 1800 Li Lake Avenue, Wuxi, Jiangsu Province, 214122

Applicant before: Jiangnan University

GR01 Patent grant