CN105157725B - A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot - Google Patents
A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot

- Publication number: CN105157725B (application CN201510460526.2A)
- Authority: CN (China)
- Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abstract
The invention discloses a hand-eye calibration method for a two-dimensional laser vision sensor and a robot, comprising the following steps: Step A, establishing the mathematical model of the calibration algorithm; Step B, formulating the implementation procedure of the hand-eye calibration. Step A derives the mathematical model of the hand-eye calibration algorithm; Step B specifies the concrete operating procedure for performing hand-eye calibration with a two-dimensional laser sensor and a robot. The method has the advantages of simplicity, practicality, flexibility and good precision.
Description
Technical Field
The invention relates to hand-eye calibration technology for a laser sensor and a robot, in particular to a hand-eye calibration method for a two-dimensional laser vision sensor and a robot.
Background
Because vision systems offer good detection and positioning performance, robot vision has become a hotspot and focus of robotics research. Visual sensing is receiving increasing attention because of its rich information, high sensitivity, high precision and non-contact nature.
At present, the images collected by visual sensing include images based on natural or artificial ordinary light and structured-light images that use a laser as an active light source. In some special industrial environments, for example welding sites with strong arc light, dust, smoke and other adverse interference factors, the performance of a conventional CCD camera is severely degraded; it cannot perform its task well in such environments and has poor practicability. In contrast, the two-dimensional laser sensor is based on the principle of triangulation: it measures the cross-section profile of an object with a linear laser beam, filters out all stray light (including arc light) with an optical filter matched to the laser wavelength, and the optical receiving device and CMOS (complementary metal oxide semiconductor) plane detector integrated in the sensor receive and image only the laser stripe. The two-dimensional laser sensor has the advantages of containing no moving parts, being firm and durable, and being immune to interference from arc light, smoke, dust, spatter and the like.
As an active light source, the laser has high energy, high brightness, good monochromaticity and strong anti-interference capability, so the two-dimensional laser vision sensor has great development prospects. The CCD camera is an area-array visual sensor, while the two-dimensional laser sensor is a linear-array visual sensor. Machine vision is one of the core technologies in the fields of detection and artificial intelligence, and it can undoubtedly improve the flexibility and efficiency of robot operation. The mapping relationship between the vision coordinate system and the robot end-joint coordinate system is obtained through hand-eye calibration, and the calibration precision largely determines the operating precision of the robot; calibrating the vision system so that it achieves higher precision is therefore the technical problem to be solved.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a hand-eye calibration method of a two-dimensional laser vision sensor and a robot.
The purpose of the invention is realized by the following technical scheme: a hand-eye calibration method for a two-dimensional laser vision sensor and a robot, which employs a two-dimensional laser sensor with an integrated optical receiving device and CMOS plane detector, and a robot (comprising a robot controller and a teach pendant); the two-dimensional laser sensor is fixedly mounted on the end flange of the robot through a bracket to form an Eye-in-hand system. The robot base coordinate system is denoted {Base}, the end flange coordinate system of the robot's sixth joint is denoted {End}, and the measurement coordinate system of the two-dimensional laser sensor is denoted {M}; the purpose of hand-eye calibration is to solve the transformation matrix of the coordinate system {M} relative to the coordinate system {End}, namely the hand-eye matrix $^{E}_{M}T$;
When the linear laser beam emitted by the two-dimensional laser sensor adopted by the method is projected onto the surface of a measured object, the beam forms an image consistent with the surface profile of the object; a series of P continuous, uniformly distributed laser sampling points lie on the laser line, and the sensor returns, for each of the P sampling points, its $Z_M$-axis and $X_M$-axis coordinate values in the sensor measurement coordinate system {M};
the method also employs a computer to acquire the measurement data of the two-dimensional laser sensor, input the calibration data and run the calibration algorithm; in addition, the method requires a few other components, such as a calibration plate.
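The eye-in-hand chain described above maps a point measured in the sensor frame {M} into the robot base frame {Base} through the end-flange pose and the hand-eye matrix. A minimal sketch of this composition, with made-up example transforms (all numeric values below are hypothetical, not from the patent):

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Example end-flange pose as it might be read from the robot controller (assumed values).
base_T_end = to_homogeneous(np.eye(3), np.array([400.0, 0.0, 600.0]))
# Example hand-eye matrix {M} -> {End} (assumed values).
end_T_M = to_homogeneous(np.eye(3), np.array([0.0, 50.0, 120.0]))

# A sensor point ^M P = [x, 0, z, 1]^T; the sensor's Y coordinate is always 0.
M_P = np.array([10.0, 0.0, 30.0, 1.0])

# Chain the transforms: ^B P = base_T_end @ end_T_M @ ^M P
B_P = base_T_end @ end_T_M @ M_P
print(B_P)  # point expressed in {Base}
```

With rotation parts set to identity, the result is simply the sum of the three translations, which makes the composition easy to check by hand.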
The two-dimensional laser vision sensor and the hand-eye calibration method of the robot comprise the following steps:
step A, establishing a mathematical model of a calibration algorithm;
Step B, formulating the implementation procedure of the hand-eye calibration;
the step A comprises the following steps:
A1) Obtain the coordinates of a spatial point $P_1$ in the robot base coordinate system {Base}, denoted $^{B}P_1 = [{}^{B}x_1, {}^{B}y_1, {}^{B}z_1, 1]^T$, and its coordinates in the two-dimensional laser sensor measurement coordinate system {M}, denoted $^{M}P_1 = [{}^{M}x_1, {}^{M}y_1, {}^{M}z_1, 1]^T$. The hand-eye matrix, i.e. the transformation matrix of the coordinate system {M} relative to the robot end flange coordinate system {End}, is denoted $^{E}_{M}T$; the transformation of the coordinate system {End} relative to the coordinate system {Base} is denoted $^{B}_{E}T$.
A2) From the transformation relationship between the coordinate systems:

$$^{B}P_1 = {}^{B}_{E}T \; {}^{E}_{M}T \; {}^{M}P_1 \qquad (1)$$

Left-multiplying both ends of the equation by the inverse of the matrix $^{B}_{E}T$ yields:

$$({}^{B}_{E}T)^{-1} \, {}^{B}P_1 = {}^{E}_{M}T \; {}^{M}P_1$$

where $({}^{B}_{E}T)^{-1}$ is the inverse of the transformation matrix of {End} relative to {Base}, $^{B}P_1$ is the coordinate of the spatial point $P_1$ in the robot base coordinate system {Base}, $^{E}_{M}T$ is the transformation matrix of {M} relative to the robot end flange coordinate system {End}, and $^{M}P_1$ is the coordinate of $P_1$ in the two-dimensional laser sensor measurement coordinate system {M}.
Denoting $T = ({}^{B}_{E}T)^{-1} \, {}^{B}P_1$ gives:

$$T = {}^{E}_{M}T \; {}^{M}P_1 \qquad (2)$$

where $T$ is a column matrix, and the remaining symbols have the meanings given above.
Expanding formula (2) gives:

$$\begin{bmatrix} T_{x1} \\ T_{y1} \\ T_{z1} \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & \Delta x \\ r_{21} & r_{22} & r_{23} & \Delta y \\ r_{31} & r_{32} & r_{33} & \Delta z \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} {}^{M}x_1 \\ {}^{M}y_1 \\ {}^{M}z_1 \\ 1 \end{bmatrix} \qquad (3)$$

where the left side is the column matrix $T$, the $4\times4$ matrix is the specific form of the transformation matrix $^{E}_{M}T$ of {M} relative to the robot end flange coordinate system {End}, and the right-hand column matrix is the coordinate of point $P_1$ in the sensor measurement coordinate system {M}.
Further expanding the formula (3) to obtain:
the spatial point P can be known according to the principle and characteristics of the two-dimensional laser sensor1Y-axis coordinate value in two-dimensional laser sensor measurement coordinate system { M }My1Constant at 0, thus giving formula (4).
The meanings of the variables in the formula are shown in the formula (3);
A3) In the same way as for the point $P_1$, taking two further spatial points $P_2$ and $P_3$ gives:

$$\begin{cases} T_{x2} = r_{11}\,{}^{M}x_2 + r_{13}\,{}^{M}z_2 + \Delta x \\ T_{y2} = r_{21}\,{}^{M}x_2 + r_{23}\,{}^{M}z_2 + \Delta y \\ T_{z2} = r_{31}\,{}^{M}x_2 + r_{33}\,{}^{M}z_2 + \Delta z \end{cases} \qquad (5)$$

$$\begin{cases} T_{x3} = r_{11}\,{}^{M}x_3 + r_{13}\,{}^{M}z_3 + \Delta x \\ T_{y3} = r_{21}\,{}^{M}x_3 + r_{23}\,{}^{M}z_3 + \Delta y \\ T_{z3} = r_{31}\,{}^{M}x_3 + r_{33}\,{}^{M}z_3 + \Delta z \end{cases} \qquad (6)$$

The symbols in formulas (5) and (6) have the meanings given for formula (4);
Taking the X-component equations of formulas (4), (5) and (6) gives:

$$\begin{cases} T_{x1} = r_{11}\,{}^{M}x_1 + r_{13}\,{}^{M}z_1 + \Delta x \\ T_{x2} = r_{11}\,{}^{M}x_2 + r_{13}\,{}^{M}z_2 + \Delta x \\ T_{x3} = r_{11}\,{}^{M}x_3 + r_{13}\,{}^{M}z_3 + \Delta x \end{cases} \qquad (7)$$

The symbols have the meanings given for formulas (3) to (6);
Formula (7) can be written in matrix form:

$$\begin{bmatrix} T_{x1} \\ T_{x2} \\ T_{x3} \end{bmatrix} = \begin{bmatrix} {}^{M}x_1 & {}^{M}z_1 & 1 \\ {}^{M}x_2 & {}^{M}z_2 & 1 \\ {}^{M}x_3 & {}^{M}z_3 & 1 \end{bmatrix} \begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} = A \begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} \qquad (8)$$

which is further transformed, by left-multiplying both ends by the inverse of the coefficient matrix $A$, into:

$$\begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} = A^{-1} \begin{bmatrix} T_{x1} \\ T_{x2} \\ T_{x3} \end{bmatrix} \qquad (9)$$

The symbols in formulas (8) and (9) have the meanings given for formula (7). With this, $r_{11}$, $r_{13}$ and $\Delta x$ are solved;
Similarly, from the Y- and Z-component equations of formulas (4), (5) and (6):

$$\begin{bmatrix} r_{21} \\ r_{23} \\ \Delta y \end{bmatrix} = A^{-1} \begin{bmatrix} T_{y1} \\ T_{y2} \\ T_{y3} \end{bmatrix} \qquad (10)$$

$$\begin{bmatrix} r_{31} \\ r_{33} \\ \Delta z \end{bmatrix} = A^{-1} \begin{bmatrix} T_{z1} \\ T_{z2} \\ T_{z3} \end{bmatrix} \qquad (11)$$

where $A^{-1}$ is the inverse of the coefficient matrix $A$ in formula (8). With this, $r_{11}$, $r_{13}$, $\Delta x$; $r_{21}$, $r_{23}$, $\Delta y$; and $r_{31}$, $r_{33}$, $\Delta z$ are all solved;
A4) To completely determine the hand-eye relationship matrix $^{E}_{M}T$, $r_{12}$, $r_{22}$ and $r_{32}$ must still be solved. From the properties of the attitude matrix, the column vectors of its rotation part,

$$\vec r_1 = [r_{11}, r_{21}, r_{31}]^T, \quad \vec r_2 = [r_{12}, r_{22}, r_{32}]^T, \quad \vec r_3 = [r_{13}, r_{23}, r_{33}]^T \qquad (12)$$

are unit vectors and are pairwise orthogonal, where the $r_{ij}$ are elements of the transformation matrix of the coordinate system {M} relative to the robot end flange coordinate system {End}.
Therefore:

$$\vec r_2 = \vec r_3 \times \vec r_1 \qquad (13)$$

From formula (13), $r_{12}$, $r_{22}$ and $r_{32}$ can be solved; the solved vector $\vec r_2$ is unitized to obtain a normalized $\vec r_2$. With this, the hand-eye relation matrix $^{E}_{M}T$ is solved.
A5) The following coordinate pose transformation relation is obtained:

$$^{B}P' = {}^{B}_{E}T \; {}^{E}_{M}T \; {}^{M}P \qquad (14)$$

where $^{B}P'$ is the theoretical coordinate value in the coordinate system {Base} of a spatial point $P$, derived from the calibration result, and $^{M}P$ is the coordinate of that point in the coordinate system {M}. Comparing $^{B}P'$ with the measured $^{B}P$ checks the precision of the hand-eye calibration. Using only the spatial points $P_1$, $P_2$, $P_3$, the calibration result may have a large error; to improve the accuracy and fault tolerance of the calibration algorithm, N (N > 3) spatial points are selected, 3 points are chosen from them at a time, giving $C_N^3$ combinations in total, and the combination with the smallest error is selected as the calibration result.
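The algebra of steps A2) to A4) can be sketched as follows. This is a minimal illustration, assuming the column vectors $T_i = ({}^{B}_{E}T_i)^{-1}\,{}^{B}P_i$ have already been computed for three spatial points; the function name `solve_hand_eye` and the argument layout are this sketch's own conventions, not from the patent.

```python
import numpy as np

def solve_hand_eye(M_pts, T_pts):
    """Recover the hand-eye matrix from three points.

    M_pts: (3, 2) array-like of sensor-frame (x, z) coordinates (^M y is always 0).
    T_pts: (3, 3) array-like; row i holds (T_xi, T_yi, T_zi) for point i.
    """
    M_pts = np.asarray(M_pts, float)
    T_pts = np.asarray(T_pts, float)
    # Coefficient matrix A = [[Mx_i, Mz_i, 1]], as in eq. (8).
    A = np.column_stack([M_pts[:, 0], M_pts[:, 1], np.ones(3)])
    A_inv = np.linalg.inv(A)
    row_x = A_inv @ T_pts[:, 0]   # r11, r13, dx   -- eq. (9)
    row_y = A_inv @ T_pts[:, 1]   # r21, r23, dy   -- eq. (10)
    row_z = A_inv @ T_pts[:, 2]   # r31, r33, dz   -- eq. (11)
    col1 = np.array([row_x[0], row_y[0], row_z[0]])   # first rotation column
    col3 = np.array([row_x[1], row_y[1], row_z[1]])   # third rotation column
    col2 = np.cross(col3, col1)                       # eq. (13)
    col2 /= np.linalg.norm(col2)                      # unitize
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = col1, col2, col3
    T[:3, 3] = [row_x[2], row_y[2], row_z[2]]         # dx, dy, dz
    return T
```

Each rotation row is obtained from one 3x3 linear solve because the sensor's Y coordinate vanishes, and the middle rotation column follows from orthonormality.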
In the step A2), formula (1) is obtained from the transformation relation between the coordinate systems; left-multiplying both ends of the equation by the inverse of the matrix $^{B}_{E}T$ gives $({}^{B}_{E}T)^{-1}\,{}^{B}P_1 = {}^{E}_{M}T\,{}^{M}P_1$; denoting $T = ({}^{B}_{E}T)^{-1}\,{}^{B}P_1$ gives formula (2); expanding formula (2) gives formula (3); further expanding formula (3), and using the fact that by the principle and characteristics of the two-dimensional laser sensor the Y-axis coordinate $^{M}y_1$ of the spatial point $P_1$ in the sensor measurement coordinate system {M} is always 0, gives formula (4). The symbols have the meanings given above.
In said step A3), in the same way as for the point $P_1$, two further spatial points $P_2$ and $P_3$ are taken, giving formulas (5) and (6); from formulas (4), (5) and (6), formula (7) is obtained; formula (7) is written in the matrix form (8) and further transformed into formula (9), whereby $r_{11}$, $r_{13}$ and $\Delta x$ are solved; similarly, formulas (10) and (11) are obtained from formulas (4), (5) and (6), where $A^{-1}$ is the inverse of the coefficient matrix in formula (8). With this, $r_{11}$, $r_{13}$, $\Delta x$, $r_{21}$, $r_{23}$, $\Delta y$, $r_{31}$, $r_{33}$ and $\Delta z$ are all solved.
The step B comprises the following steps:
B1) A calibration plate is placed at a suitable position in space; the robot is operated so that the laser line emitted by the two-dimensional laser sensor mounted on the robot end flange is projected onto a point on the calibration plate, recorded as point P, and the coordinate value $^{M}P$ of point P in the laser sensor measurement coordinate system {M} is read according to the geometric relationship between the calibration plate and the laser sensor; keeping the robot still, the pose $^{B}_{E}T$ of the robot end flange tool coordinate system {End} relative to the robot base coordinate system {Base} at this moment is read and recorded;
B2) The robot is operated so that the TCP of another established tool coordinate system reaches point P, and the coordinate $^{B}P$ of point P in the robot base coordinate system {Base} is read. The obtained $^{M}P$ and $^{B}P$ are taken as one group of calibration data;
B3) repeating the steps B1) and B2) to obtain N groups of different calibration data;
B4) The N groups of calibration data obtained in step B3) are input into the calibration program, and the hand-eye calibration result $^{E}_{M}T$ is obtained through computer calculation.
Compared with the prior art, the invention has the following advantages and effects:
1. the invention has strong practicability, flexible and simple use and high hand-eye calibration precision.
2. The invention is suitable for a hand-eye system formed by a two-dimensional laser vision sensor and a robot; the laser vision sensor replaces traditional CCD vision well and meets the application requirements of robots in special occasions.
Drawings
Fig. 1a is a schematic diagram of coordinates of a space calibration point in a sensor measurement coordinate system and the current posture of a robot according to the present invention.
Fig. 1b is a partial enlarged view of the laser sensor.
Fig. 2 is a schematic diagram of coordinates of a space calibration point in a robot base coordinate system according to the invention.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but the present invention is not limited thereto.
Examples
A two-dimensional laser sensor is arranged on a flange at the tail end of a robot (comprising a robot controller and a teaching box) through a mounting bracket, the sensor is communicated with a computer, and the computer receives measurement data returned by the sensor.
When the laser beam emitted by the two-dimensional laser sensor is projected onto the surface of a measured object, the beam forms an image consistent with the surface profile of the object; a series of P continuous, uniformly distributed laser sampling points lie on the laser line, and the sensor returns, for each of the P sampling points, its Z-axis and X-axis coordinate values in the sensor measurement coordinate system.
Data required for the hand-eye calibration are acquired with the laser sensor and the robot in combination with a calibration plate. A computer obtains the measurement data of the two-dimensional laser sensor, inputs the calibration data, and runs the calibration algorithm; the hand-eye relationship is then solved from these data.
This embodiment also employs a calibration plate as an auxiliary component.
As shown in fig. 1a and 1b, the step a (establishing a mathematical model of the calibration algorithm) includes the following steps:
A1) Obtain the coordinates of a spatial point $P_1$ in the robot base coordinate system {Base} 1, denoted $^{B}P_1 = [{}^{B}x_1, {}^{B}y_1, {}^{B}z_1, 1]^T$, and its coordinates in the measurement coordinate system {M} 2 of the two-dimensional laser sensor, denoted $^{M}P_1 = [{}^{M}x_1, {}^{M}y_1, {}^{M}z_1, 1]^T$. The hand-eye matrix, i.e. the transformation matrix of {M} relative to the robot end flange coordinate system {End} 3, is denoted $^{E}_{M}T$; the transformation of {End} relative to {Base} is denoted $^{B}_{E}T$.
The hand-eye matrix has the specific form:

$$^{E}_{M}T = \begin{bmatrix} r_{11} & r_{12} & r_{13} & \Delta x \\ r_{21} & r_{22} & r_{23} & \Delta y \\ r_{31} & r_{32} & r_{33} & \Delta z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

in which the twelve variables are unknown. $^{B}_{E}T$ is also a $4\times4$ homogeneous transformation matrix; its value is known and can be read directly from the robot teach pendant 4.
A2) From the transformation relationship between the coordinate systems:

$$^{B}P_1 = {}^{B}_{E}T \; {}^{E}_{M}T \; {}^{M}P_1 \qquad (1)$$

Left-multiplying both ends of the equation by the inverse of the matrix $^{B}_{E}T$ yields:

$$({}^{B}_{E}T)^{-1} \, {}^{B}P_1 = {}^{E}_{M}T \; {}^{M}P_1$$

Denoting $T = ({}^{B}_{E}T)^{-1}\,{}^{B}P_1$ gives:

$$T = {}^{E}_{M}T \; {}^{M}P_1 \qquad (2)$$

where $T$ is a column matrix. Expanding formula (2) gives:

$$\begin{bmatrix} T_{x1} \\ T_{y1} \\ T_{z1} \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & \Delta x \\ r_{21} & r_{22} & r_{23} & \Delta y \\ r_{31} & r_{32} & r_{33} & \Delta z \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} {}^{M}x_1 \\ {}^{M}y_1 \\ {}^{M}z_1 \\ 1 \end{bmatrix} \qquad (3)$$

Further expanding formula (3), and noting that by the principle and characteristics of the two-dimensional laser sensor the Y-axis coordinate $^{M}y_1$ of the spatial point $P_1$ in the sensor measurement coordinate system {M} is always 0, gives:

$$\begin{cases} T_{x1} = r_{11}\,{}^{M}x_1 + r_{13}\,{}^{M}z_1 + \Delta x \\ T_{y1} = r_{21}\,{}^{M}x_1 + r_{23}\,{}^{M}z_1 + \Delta y \\ T_{z1} = r_{31}\,{}^{M}x_1 + r_{33}\,{}^{M}z_1 + \Delta z \end{cases} \qquad (4)$$

The variables have the meanings given for formula (3);
A3) In the same way as for the point $P_1$, taking two further spatial points $P_2$ and $P_3$ gives:

$$\begin{cases} T_{x2} = r_{11}\,{}^{M}x_2 + r_{13}\,{}^{M}z_2 + \Delta x \\ T_{y2} = r_{21}\,{}^{M}x_2 + r_{23}\,{}^{M}z_2 + \Delta y \\ T_{z2} = r_{31}\,{}^{M}x_2 + r_{33}\,{}^{M}z_2 + \Delta z \end{cases} \qquad (5)$$

$$\begin{cases} T_{x3} = r_{11}\,{}^{M}x_3 + r_{13}\,{}^{M}z_3 + \Delta x \\ T_{y3} = r_{21}\,{}^{M}x_3 + r_{23}\,{}^{M}z_3 + \Delta y \\ T_{z3} = r_{31}\,{}^{M}x_3 + r_{33}\,{}^{M}z_3 + \Delta z \end{cases} \qquad (6)$$

Taking the X-component equations of formulas (4), (5) and (6) gives formula (7), written in matrix form as:

$$\begin{bmatrix} T_{x1} \\ T_{x2} \\ T_{x3} \end{bmatrix} = \begin{bmatrix} {}^{M}x_1 & {}^{M}z_1 & 1 \\ {}^{M}x_2 & {}^{M}z_2 & 1 \\ {}^{M}x_3 & {}^{M}z_3 & 1 \end{bmatrix} \begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} = A \begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} \qquad (8)$$

Left-multiplying both ends of the equation by the inverse of the matrix $A$ gives:

$$\begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} = A^{-1} \begin{bmatrix} T_{x1} \\ T_{x2} \\ T_{x3} \end{bmatrix} \qquad (9)$$

With this, $r_{11}$, $r_{13}$ and $\Delta x$ are solved. Similarly, from the Y- and Z-component equations of formulas (4), (5) and (6):

$$\begin{bmatrix} r_{21} \\ r_{23} \\ \Delta y \end{bmatrix} = A^{-1} \begin{bmatrix} T_{y1} \\ T_{y2} \\ T_{y3} \end{bmatrix} \qquad (10), \qquad \begin{bmatrix} r_{31} \\ r_{33} \\ \Delta z \end{bmatrix} = A^{-1} \begin{bmatrix} T_{z1} \\ T_{z2} \\ T_{z3} \end{bmatrix} \qquad (11)$$

where $A^{-1}$ is the inverse of the coefficient matrix $A$ in formula (8). With this, $r_{11}$, $r_{13}$, $\Delta x$; $r_{21}$, $r_{23}$, $\Delta y$; and $r_{31}$, $r_{33}$, $\Delta z$ are all solved;
A4) To completely determine the hand-eye relationship matrix $^{E}_{M}T$, $r_{12}$, $r_{22}$ and $r_{32}$ must still be solved. From the properties of the attitude matrix, the column vectors $\vec r_1 = [r_{11}, r_{21}, r_{31}]^T$, $\vec r_2 = [r_{12}, r_{22}, r_{32}]^T$ and $\vec r_3 = [r_{13}, r_{23}, r_{33}]^T$ of its rotation part (12) are unit vectors and are pairwise orthogonal, the $r_{ij}$ being elements of the transformation matrix of {M} relative to the robot end flange coordinate system {End}. Therefore:

$$\vec r_2 = \vec r_3 \times \vec r_1 \qquad (13)$$

From formula (13), $r_{12}$, $r_{22}$ and $r_{32}$ are solved according to $\vec r_3$ and $\vec r_1$; the solved vector $\vec r_2$ is unitized to obtain a normalized $\vec r_2$, and with this the hand-eye relation matrix $^{E}_{M}T$ is solved.
A5) The following coordinate pose transformation relation is obtained:

$$^{B}P' = {}^{B}_{E}T \; {}^{E}_{M}T \; {}^{M}P \qquad (14)$$

where $^{B}P'$ is the theoretical coordinate value in the coordinate system {Base} of a spatial point $P$, derived from the calibration result, and $^{M}P$ is the coordinate of that point in the coordinate system {M}. Comparing $^{B}P'$ with the measured $^{B}P$ checks the precision of the hand-eye calibration. Using only the spatial points $P_1$, $P_2$, $P_3$, the result of the hand-eye calibration may have a large error; to improve the accuracy and fault tolerance of the calibration algorithm, 5 spatial points are selected in this embodiment, 3 points are chosen from them at a time, giving $C_5^3 = 10$ combinations in total, and the combination with the smallest error is selected as the calibration result.
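The error-minimizing search of step A5) can be sketched generically: every 3-point subset of the calibration points yields a candidate hand-eye matrix, and the candidate whose predicted points deviate least from the measured ones is kept. The callables `solve_from_triplet` and `reprojection_error` are placeholders for this sketch, not names from the patent.

```python
import itertools

def best_calibration(points, solve_from_triplet, reprojection_error):
    """Try every 3-point combination and keep the lowest-error candidate.

    points: list of calibration samples (each sample pairs ^M P with ^B P).
    solve_from_triplet: builds a candidate hand-eye matrix from 3 samples.
    reprojection_error: compares predicted ^B P' against measured ^B P over all samples.
    """
    best, best_err = None, float("inf")
    for triplet in itertools.combinations(range(len(points)), 3):
        candidate = solve_from_triplet([points[i] for i in triplet])
        err = reprojection_error(candidate, points)
        if err < best_err:
            best, best_err = candidate, err
    return best, best_err
```

With 5 points this loop evaluates exactly C(5,3) = 10 candidates, matching the count in the text.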
The step B (implementation operation step for making the hand-eye calibration) comprises the following steps:
B1) As shown in Fig. 1, a calibration plate 5 is placed at a suitable position in space, and the robot is operated through the robot teach pendant 4 so that the laser line emitted by the two-dimensional laser sensor 2 mounted on the robot end flange 3 is projected onto a point P on the calibration plate 5, with the calibration plate 5 inside the measuring range of the sensor. From the geometric relationship in Fig. 1, the minimum $Z_M$-axis value $Z_{min}$ in the current sampling data returned by the sensor 2 is the Z-axis coordinate of point P in the sensor measurement coordinate system {M} 2, and the $X_M$-axis coordinate value corresponding to $Z_{min}$ is the $X_M$-axis coordinate of point P in {M}; thus the coordinate value $^{M}P$ of point P in the laser sensor measurement coordinate system {M} 2 is read. While the robot is kept still, the pose $^{B}_{E}T$ of the robot end flange tool coordinate system {End} 3 relative to the robot base coordinate system {Base} 1 at this moment (the Euler angles α, β, γ and the offsets $\Delta f_x$, $\Delta f_y$, $\Delta f_z$) is read from the robot teach pendant 4 and recorded. $^{B}_{E}T$ can be obtained by the following formula:
$$^{B}_{E}T = \begin{bmatrix} R_{3\times3}(\alpha, \beta, \gamma) & \Delta f \\ 0\ \ 0\ \ 0 & 1 \end{bmatrix}, \qquad \Delta f = [\Delta f_x, \Delta f_y, \Delta f_z]^T \qquad (15)$$

where $^{B}_{E}T$ is the pose matrix of the tool coordinate system {End} relative to the robot base coordinate system {Base}, and $R_{3\times3}(\alpha, \beta, \gamma)$ is the rotation matrix composed from the Euler angles; $s\alpha$ and $c\alpha$ denote $\sin\alpha$ and $\cos\alpha$ respectively, and the remaining trigonometric shorthands follow by analogy;
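Building the flange pose from the teach-pendant readout of step B1) can be sketched as follows. The patent does not spell out the Euler convention behind its sα/cα shorthand, so a Z-Y-X (yaw-pitch-roll) order is assumed here; substitute the convention your controller actually uses.

```python
import numpy as np

def pose_from_euler(alpha, beta, gamma, dfx, dfy, dfz):
    """Compose base_T_end from Euler angles (Z-Y-X order assumed) and offsets."""
    ca, sa = np.cos(alpha), np.sin(alpha)   # c-alpha, s-alpha as in the text
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx                # rotation part of eq. (15)
    T[:3, 3] = [dfx, dfy, dfz]              # translation column of eq. (15)
    return T
```

With all angles zero the rotation part reduces to the identity, which is a quick sanity check for the chosen convention.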
B2) As shown in Fig. 2, the robot is operated so that the TCP of another established tool coordinate system {Tool} 6 reaches point P, and the coordinate $^{B}P$ of point P in the robot base coordinate system {Base} 1 is read. The obtained $^{M}P$ and $^{B}P$ are taken as one group of calibration data;
B3) repeating the steps B1) and B2) to obtain N groups of different calibration data;
B4) The N groups of calibration data obtained in step B3) are input into the calibration program, and the hand-eye calibration result $^{E}_{M}T$ is obtained through the operation of the computer 7 according to the mathematical model of the calibration algorithm.
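The point-extraction rule of step B1), taking the sample with the smallest $Z_M$ in one sensor profile as point P, can be sketched with hypothetical profile data (the numbers below are made up for illustration):

```python
import numpy as np

# One sensor profile: rows of (X_M, Z_M) sampling points (hypothetical values).
profile = np.array([[-2.0, 31.5],
                    [-1.0, 30.2],
                    [ 0.0, 29.8],   # smallest Z_M: this sample is point P
                    [ 1.0, 30.4],
                    [ 2.0, 31.9]])

idx = np.argmin(profile[:, 1])      # index of Z_min in the Z_M column
x_p, z_p = profile[idx]             # ^M P = (x_p, 0, z_p): sensor Y is always 0
print(x_p, z_p)
```

In practice the profile would come from the sensor's returned sampling data rather than a literal array.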
The above embodiment is a preferred embodiment of the present invention, but the present invention is not limited thereto; any change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the present invention shall be regarded as an equivalent replacement and is included in the protection scope of the present invention.
Claims (1)
1. A hand-eye calibration method for a two-dimensional laser vision sensor and a robot is characterized by comprising the following steps:
step A, establishing a mathematical model of a calibration algorithm;
Step B, formulating the implementation procedure of the hand-eye calibration;
the step B comprises the following steps:
B1) A calibration plate is placed at a suitable position in space, and the robot is operated so that the laser line emitted by the two-dimensional laser sensor mounted on the robot end flange is projected onto a point P on the calibration plate; the coordinate value $^{M}P$ of point P in the laser sensor measurement coordinate system {M} is read according to the geometric relationship between the calibration plate and the laser sensor; keeping the robot still, the pose $^{B}_{E}T$ of the robot end flange tool coordinate system {End} relative to the robot base coordinate system {Base} at this moment is read and recorded;
B2) the robot is operated so that the TCP of another established tool coordinate system reaches point P, and the coordinate $^{B}P$ of point P in the robot base coordinate system {Base} is read; the obtained $^{M}P$ and $^{B}P$ are taken as one group of calibration data;
B3) repeating the steps B1) and B2) until the set N groups of different calibration data are obtained;
B4) the N groups of calibration data obtained in step B3) are input into the calibration program, and the hand-eye calibration result $^{E}_{M}T$ is obtained through computer calculation;
The step A comprises the following steps:
A1) obtain the coordinates of a spatial point $P_1$ in the robot base coordinate system {Base}, denoted $^{B}P_1 = [{}^{B}x_1, {}^{B}y_1, {}^{B}z_1, 1]^T$, and its coordinates in the two-dimensional laser sensor measurement coordinate system {M}, denoted $^{M}P_1 = [{}^{M}x_1, {}^{M}y_1, {}^{M}z_1, 1]^T$; the hand-eye matrix, i.e. the transformation matrix of the coordinate system {M} relative to the robot end flange coordinate system {End}, is denoted $^{E}_{M}T$, and the transformation of {End} relative to {Base} is denoted $^{B}_{E}T$;
A2) according to the transformation relation between the coordinate systems, the following is obtained:

$$^{B}P_1 = {}^{B}_{E}T \; {}^{E}_{M}T \; {}^{M}P_1 \qquad (1)$$

left-multiplying both ends of the equation by the inverse of the matrix $^{B}_{E}T$ yields:

$$({}^{B}_{E}T)^{-1} \, {}^{B}P_1 = {}^{E}_{M}T \; {}^{M}P_1$$

where $({}^{B}_{E}T)^{-1}$ is the inverse of the transformation matrix of the coordinate system {End} relative to the coordinate system {Base}, $^{B}P_1$ is the coordinate of the spatial point $P_1$ in the robot base coordinate system {Base}, $^{E}_{M}T$ is the transformation matrix of the coordinate system {M} relative to the robot end flange coordinate system {End}, and $^{M}P_1$ is the coordinate of $P_1$ in the two-dimensional laser sensor measurement coordinate system {M};
denoting $T = ({}^{B}_{E}T)^{-1}\,{}^{B}P_1$ gives:

$$T = {}^{E}_{M}T \; {}^{M}P_1 \qquad (2)$$

where $T$ is a column matrix and the remaining symbols have the meanings given above;
expanding the formula (2) gives:

$$\begin{bmatrix} T_{x1} \\ T_{y1} \\ T_{z1} \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & \Delta x \\ r_{21} & r_{22} & r_{23} & \Delta y \\ r_{31} & r_{32} & r_{33} & \Delta z \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} {}^{M}x_1 \\ {}^{M}y_1 \\ {}^{M}z_1 \\ 1 \end{bmatrix} \qquad (3)$$

where the left side is the column matrix $T$, the $4\times4$ matrix is the specific form of the transformation matrix of {M} relative to the robot end flange coordinate system {End}, and the right-hand column matrix is the coordinate of point $P_1$ in the sensor measurement coordinate system {M};
further expanding the formula (3), and noting that by the principle and characteristics of the two-dimensional laser sensor the Y-axis coordinate $^{M}y_1$ of the spatial point $P_1$ in the sensor measurement coordinate system {M} is always 0, gives:

$$\begin{cases} T_{x1} = r_{11}\,{}^{M}x_1 + r_{13}\,{}^{M}z_1 + \Delta x \\ T_{y1} = r_{21}\,{}^{M}x_1 + r_{23}\,{}^{M}z_1 + \Delta y \\ T_{z1} = r_{31}\,{}^{M}x_1 + r_{33}\,{}^{M}z_1 + \Delta z \end{cases} \qquad (4)$$
A3) likewise, for two further spatial points $P_2$ and $P_3$:

$$\begin{cases} T_{x2} = r_{11}\,{}^{M}x_2 + r_{13}\,{}^{M}z_2 + \Delta x \\ T_{y2} = r_{21}\,{}^{M}x_2 + r_{23}\,{}^{M}z_2 + \Delta y \\ T_{z2} = r_{31}\,{}^{M}x_2 + r_{33}\,{}^{M}z_2 + \Delta z \end{cases} \qquad (5)$$

$$\begin{cases} T_{x3} = r_{11}\,{}^{M}x_3 + r_{13}\,{}^{M}z_3 + \Delta x \\ T_{y3} = r_{21}\,{}^{M}x_3 + r_{23}\,{}^{M}z_3 + \Delta y \\ T_{z3} = r_{31}\,{}^{M}x_3 + r_{33}\,{}^{M}z_3 + \Delta z \end{cases} \qquad (6)$$
taking the X-component equations of formulas (4), (5) and (6) gives formula (7), which is written in matrix form as:

$$\begin{bmatrix} T_{x1} \\ T_{x2} \\ T_{x3} \end{bmatrix} = \begin{bmatrix} {}^{M}x_1 & {}^{M}z_1 & 1 \\ {}^{M}x_2 & {}^{M}z_2 & 1 \\ {}^{M}x_3 & {}^{M}z_3 & 1 \end{bmatrix} \begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} = A \begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} \qquad (8)$$

formula (8) is further transformed into:

$$\begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix} = A^{-1} \begin{bmatrix} T_{x1} \\ T_{x2} \\ T_{x3} \end{bmatrix} \qquad (9)$$

with this, $r_{11}$, $r_{13}$ and $\Delta x$ are solved;
similarly, from the Y- and Z-component equations of formulas (4), (5) and (6):

$$\begin{bmatrix} r_{21} \\ r_{23} \\ \Delta y \end{bmatrix} = A^{-1} \begin{bmatrix} T_{y1} \\ T_{y2} \\ T_{y3} \end{bmatrix} \qquad (10), \qquad \begin{bmatrix} r_{31} \\ r_{33} \\ \Delta z \end{bmatrix} = A^{-1} \begin{bmatrix} T_{z1} \\ T_{z2} \\ T_{z3} \end{bmatrix} \qquad (11)$$

where $A^{-1}$ is the inverse of the coefficient matrix $A$ in formula (8); with this, $r_{11}$, $r_{13}$, $\Delta x$, $r_{21}$, $r_{23}$, $\Delta y$, $r_{31}$, $r_{33}$ and $\Delta z$ are all solved;
A4) to determine the hand-eye relationship matrix $^{E}_{M}T$, $r_{12}$, $r_{22}$ and $r_{32}$ are solved from the properties of the attitude matrix: the vectors

$$\vec r_1 = [r_{11}, r_{21}, r_{31}]^T, \quad \vec r_2 = [r_{12}, r_{22}, r_{32}]^T, \quad \vec r_3 = [r_{13}, r_{23}, r_{33}]^T \qquad (12)$$

are unit vectors and are orthogonal in pairs, the $r_{ij}$ being elements of the transformation matrix of the coordinate system {M} relative to the robot end flange coordinate system {End}, thus yielding:

$$\vec r_2 = \vec r_3 \times \vec r_1 \qquad (13)$$
from formula (13), $r_{12}$, $r_{22}$ and $r_{32}$ are solved according to $\vec r_3$ and $\vec r_1$; the solved vector $\vec r_2$ is unitized to obtain a normalized $\vec r_2$; with this, the hand-eye relation matrix $^{E}_{M}T$ is solved;
A5) from the coordinate pose transformation relationship:

$$^{B}P' = {}^{B}_{E}T \; {}^{E}_{M}T \; {}^{M}P \qquad (14)$$

where $^{B}P'$ is the theoretical coordinate value, in the coordinate system {Base}, of a spatial point P derived according to the calibration result, and $^{M}P$ is the coordinate of that point in the coordinate system {M}; comparing $^{B}P'$ with $^{B}P$ checks the precision of the hand-eye calibration; using only the spatial points $P_1$, $P_2$, $P_3$, the result of the hand-eye calibration may have a large error; to improve the accuracy and fault tolerance of the calibration algorithm, N spatial points (N > 3) are selected, 3 points are chosen from them at a time, giving $C_N^3$ combinations in total, and the combination with the smallest error among these combinations is taken as the calibration result.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510460526.2A | 2015-07-29 | 2015-07-29 | A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot |

Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN105157725A | 2015-12-16 |
| CN105157725B | 2018-06-29 |

Family: ID=54798664

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201510460526.2A | Expired - Fee Related CN105157725B | 2015-07-29 | 2015-07-29 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN105157725B (en) |
Families Citing this family (28)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10076842B2 * | 2016-09-28 | 2018-09-18 | Cognex Corporation | Simultaneous kinematic and hand-eye calibration |
| CN106426172B * | 2016-10-27 | 2019-04-16 | 深圳元启智能技术有限公司 | A kind of scaling method and system of industrial robot tool coordinates system |
| CN106335061A * | 2016-11-11 | 2017-01-18 | 福州大学 | Hand-eye relation calibration method based on four-freedom-degree robot |
| CN106643805B * | 2016-12-30 | 2020-07-14 | 上海交通大学 | Method for calibrating position of laser positioning sensor in AGV |
| CN106839979B * | 2016-12-30 | 2019-08-23 | 上海交通大学 | The hand and eye calibrating method of line structured laser sensor |
| CN107256567B * | 2017-01-22 | 2020-08-07 | 梅卡曼德(北京)机器人科技有限公司 | Automatic calibration device and calibration method for hand-eye camera of industrial robot |
| CN107253190B * | 2017-01-23 | 2020-09-01 | 梅卡曼德(北京)机器人科技有限公司 | High-precision robot hand-eye camera automatic calibration device and use method thereof |
| TWI712473B * | 2017-05-22 | 2020-12-11 | 達明機器人股份有限公司 | Method for calibrating coordinate of robot arm |
| CN107152911A * | 2017-06-01 | 2017-09-12 | 无锡中车时代智能装备有限公司 | Based on the PSD dot laser sensors fed back and the scaling method of robot relative position |
| CN107560563B * | 2017-07-28 | 2019-10-18 | 华南理工大学 | A kind of calibration of line laser three-dimensional measuring apparatus and error compensating method |
| CN109528274A * | 2017-09-22 | 2019-03-29 | 清华大学深圳研究生院 | A kind of method for registering and device |
| CN108406777B * | 2018-05-10 | 2023-06-20 | 华南理工大学 | Electronic component hand-eye coordination plug-in mechanism based on robot |
| CN108972544A * | 2018-06-21 | 2018-12-11 | 华南理工大学 | A kind of vision laser sensor is fixed on the hand and eye calibrating method of robot |
| CN109048893A * | 2018-07-27 | 2018-12-21 | 浙江工业大学 | A kind of mechanical arm localization method based on monocular RGB camera |
CN108972559B (en) * | 2018-08-20 | 2021-08-03 | 上海嘉奥信息科技发展有限公司 | Hand-eye calibration method based on infrared stereoscopic vision positioning system and mechanical arm |
CN111070199A (en) * | 2018-10-18 | 2020-04-28 | 杭州海康威视数字技术股份有限公司 | Hand-eye calibration assessment method and robot |
CN109470138A (en) * | 2018-10-22 | 2019-03-15 | 江苏集萃微纳自动化系统与装备技术研究所有限公司 | The On-line Measuring Method of part |
CN109615662A (en) * | 2018-12-04 | 2019-04-12 | 中冶赛迪工程技术股份有限公司 | A kind of coordinate system scaling method, system, computer readable storage medium and equipment |
CN109623206B (en) * | 2018-12-19 | 2020-05-19 | 清华大学 | Method for optimizing off-line planning welding gun pose in robot pipeline welding |
CN110000790B (en) * | 2019-04-19 | 2021-11-16 | 深圳市科瑞软件技术有限公司 | Calibration method of eye-to-hand system of SCARA robot |
CN110136208B (en) * | 2019-05-20 | 2020-03-17 | 北京无远弗届科技有限公司 | Joint automatic calibration method and device for robot vision servo system |
CN110355464A (en) * | 2019-07-05 | 2019-10-22 | 上海交通大学 | Visual Matching Method, system and the medium of laser processing |
WO2021012124A1 (en) * | 2019-07-19 | 2021-01-28 | 西门子(中国)有限公司 | Robot hand-eye calibration method and apparatus, computing device, medium and product |
CN110345869A (en) * | 2019-08-08 | 2019-10-18 | 江苏汇博机器人技术股份有限公司 | A kind of Robotic Hand-Eye Calibration accuracy evaluation system for Technique Authentication real training |
CN110480642A (en) * | 2019-10-16 | 2019-11-22 | 遨博(江苏)机器人有限公司 | Industrial robot and its method for utilizing vision calibration user coordinate system |
CN110974421B (en) * | 2019-12-13 | 2021-05-11 | 杭州三坛医疗科技有限公司 | Calibration method and system for TCP of surgical robot and storage medium |
CN111956329B (en) * | 2020-08-12 | 2022-04-26 | 中国科学院深圳先进技术研究院 | Calibration method, system, terminal and storage medium for double-arm robot |
CN113759384B (en) * | 2020-09-22 | 2024-04-05 | 北京京东乾石科技有限公司 | Method, device, equipment and medium for determining pose conversion relation of sensor |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102927908A (en) * | 2012-11-06 | 2013-02-13 | 中国科学院自动化研究所 | Robot eye-on-hand system structured light plane parameter calibration device and method |
CN103558850A (en) * | 2013-07-26 | 2014-02-05 | 无锡信捷电气股份有限公司 | Laser vision guided welding robot full-automatic movement self-calibration method |
CN103991006A (en) * | 2014-04-01 | 2014-08-20 | 浙江大学 | Calibration method and device for robot hole forming platform vision measurement system |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080088165A (en) * | 2007-03-29 | 2008-10-02 | 삼성중공업 주식회사 | Robot calibration method |
CN101630409B (en) * | 2009-08-17 | 2011-07-27 | 北京航空航天大学 | Hand-eye vision calibration method for robot hole boring system |
KR20130029883A (en) * | 2011-09-16 | 2013-03-26 | 대우조선해양 주식회사 | Laser vision system calibration method using working joint |
CN102794763B (en) * | 2012-08-31 | 2014-09-24 | 江南大学 | Systematic calibration method of welding robot guided by line structured light vision sensor |
IN2015DN02064A (en) * | 2012-10-05 | 2015-08-14 | Beckman Coulter Inc |
- 2015-07-29: CN CN201510460526.2A patent/CN105157725B/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102927908A (en) * | 2012-11-06 | 2013-02-13 | 中国科学院自动化研究所 | Robot eye-on-hand system structured light plane parameter calibration device and method |
CN103558850A (en) * | 2013-07-26 | 2014-02-05 | 无锡信捷电气股份有限公司 | Laser vision guided welding robot full-automatic movement self-calibration method |
CN103991006A (en) * | 2014-04-01 | 2014-08-20 | 浙江大学 | Calibration method and device for robot hole forming platform vision measurement system |
Non-Patent Citations (1)
Title |
---|
Robot hand-eye calibration method with fixed-point variable poses; Wang Shenghua et al.; Journal of Tsinghua University (Science and Technology); 2007-02-28; Vol. 47, No. 2; pp. 165-168 *
Also Published As
Publication number | Publication date |
---|---|
CN105157725A (en) | 2015-12-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105157725B (en) | A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot | |
Zhou et al. | Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations | |
Rahman et al. | An efficient camera calibration technique offering robustness and accuracy over a wide range of lens distortion | |
JP5612916B2 (en) | Position / orientation measuring apparatus, processing method thereof, program, robot system | |
CN108717715A (en) | A kind of line-structured light vision system automatic calibration method for arc welding robot | |
WO2018201677A1 (en) | Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system | |
CN107428009A (en) | Method, the industrial robot system using this method and control system for industrial robot debugging | |
CN104802173A (en) | Data generation device for vision sensor and detection simulation system | |
JP2010528318A (en) | 3D assembly inspection with 2D images | |
Grenzdörffer et al. | YCB-M: A multi-camera RGB-D dataset for object recognition and 6DoF pose estimation | |
CN115546289A (en) | Robot-based three-dimensional shape measurement method for complex structural part | |
Luo et al. | Automated tool coordinate calibration system of an industrial robot | |
Wan et al. | Flange-based hand-eye calibration using a 3d camera with high resolution, accuracy, and frame rate | |
WO2017009615A1 (en) | Method for measuring an artefact | |
CN113916128A (en) | Method for improving precision based on optical pen type vision measurement system | |
Chen et al. | A novel hand-eye calibration method using double-layer optimization and outlier sample screening for monocular vision robots | |
TW200841981A (en) | Laser array measurement system for testing three dimensional positioning performance, measuring three dimensional orbit and straightness of arbitrary axis | |
Zhou et al. | A segmental calibration method for a miniature serial-link coordinate measuring machine using a compound calibration artefact | |
Rüther et al. | The narcissistic robot: Robot calibration using a mirror | |
Antonelli et al. | Training by demonstration for welding robots by optical trajectory tracking | |
Jian et al. | Task-Specific Near-Field Photometric Stereo for Measuring Metal Surface Texture | |
CN113624371B (en) | High-resolution visual touch sensor based on binocular vision and point cloud generation method | |
CN214200141U (en) | Robot repeated positioning precision measuring system based on vision | |
CN113048949B (en) | Cylindrical object pose detection method based on line structure optical vision | |
Kana et al. | Robot-sensor calibration for a 3D vision assisted drawing robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20180629 |
|
CF01 | Termination of patent right due to non-payment of annual fee |