CN110260786B - Robot vision measurement system based on external tracking and calibration method thereof - Google Patents

Robot vision measurement system based on external tracking and calibration method thereof

Info

Publication number
CN110260786B
Authority
CN
China
Prior art keywords
calibration
area array
coordinate system
array scanner
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910557739.5A
Other languages
Chinese (zh)
Other versions
CN110260786A (en)
Inventor
李文龙
彭泽龙
蒋诚
王刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201910557739.5A priority Critical patent/CN110260786B/en
Publication of CN110260786A publication Critical patent/CN110260786A/en
Application granted granted Critical
Publication of CN110260786B publication Critical patent/CN110260786B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

The invention belongs to the field of robot vision measurement and discloses a robot vision measurement system based on external tracking, together with a calibration method for it. The vision measurement system comprises an external tracking device, a three-dimensional ball cage target, an area array scanner, an industrial robot and a calibration device; the area array scanner is mounted at the end of the industrial robot and the three-dimensional ball cage target is mounted on the area array scanner. The invention also discloses a calibration method for the measurement system: the external tracking device and the area array scanner respectively observe the three-dimensional ball cage target and the calibration device; two groups of target data are acquired, one with the posture held constant while the position changes and one with the posture changing; the conversion relations between the coordinate systems are then established; and finally the two groups of data are used to calculate the conversion matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system. The method avoids the problem that processing the same data by different people, or at different times, may yield different solutions, and achieves stable and consistent calibration results.

Description

Robot vision measurement system based on external tracking and calibration method thereof
Technical Field
The invention belongs to the field of robot vision measurement, and particularly relates to a robot vision measurement system based on external tracking and a calibration method thereof.
Background
Measurement of large parts is widely used in digital measurement, digital machining, digital assembly and other aspects of industrial manufacturing, and is one of the technical directions for making industrial manufacturing automated and intelligent.
Existing measurement of large parts is either contact or non-contact. Contact measurement includes the three-coordinate measuring machine and the binocular three-coordinate measuring system; non-contact measurement mainly includes photogrammetry and area array scanner or line laser scanner schemes based on marker-point stitching. Contact measurement is generally highly accurate, but it is point-wise, so data acquisition efficiency and data volume are low. Once the marker points have been arranged, a photogrammetric system can acquire measurement points in batches, which improves acquisition efficiency and data volume to some extent, but it is mostly limited to measuring key dimensions. Area array scanner or line laser scanner schemes based on marker-point stitching can acquire point clouds efficiently and in large volume, but arranging and cleaning up the marker points is time-consuming and laborious. All of the above schemes that rely on arranged marker points are severely limited under working conditions where marker points cannot be used or are inconvenient to arrange. In the present scheme, an industrial robot carrying an area array scanner acquires data efficiently, accurately, flexibly and over a large range, and marker-free high-precision stitching is achieved through the external tracking device, thereby avoiding many of the problems of the existing schemes.
Disclosure of Invention
Aiming at the defects or the improvement requirements of the prior art, the invention provides a robot vision measuring system based on external tracking and a calibration method thereof.
To achieve the above object, according to the present invention, there is provided a robot vision measuring system based on external tracking, the vision measuring system including an external tracking device, a three-dimensional ball cage target, an area array scanner, an industrial robot and a calibration device, wherein:
the area array scanner is mounted at the end of the industrial robot, the three-dimensional ball cage target is mounted on the area array scanner, the external tracking device is arranged facing the industrial robot and is used for acquiring pose data of the three-dimensional ball cage target, the calibration device is arranged between the external tracking device and the industrial robot, the area array scanner is used for scanning and acquiring point cloud data of the calibration device, and the industrial robot is used for adjusting the position and posture of the area array scanner.
Further preferably, the calibration device comprises a black substrate and a plurality of white matte ceramic calibration balls arranged on the substrate, and each calibration ball is uniquely identified by its three-dimensional center coordinates and radius.
According to another aspect of the present invention, there is provided a calibration method using the external tracking-based robot vision measuring system, the method comprising the steps of:
(a) setting the pose of the area array scanner, acquiring a group of target data through the external tracking device and the area array scanner at the set pose, keeping the posture of the area array scanner unchanged while changing its position n-1 times, and acquiring n-1 groups of target data at the different positions through the external tracking device and the area array scanner, so as to acquire n groups of target data at the set posture;
(b) changing the pose of the area array scanner m times and obtaining m groups of target data through the external tracking device and the area array scanner, so as to obtain n + m groups of target data in total, wherein each group of target data comprises the pose of the three-dimensional ball cage target acquired by the external tracking device and the point cloud data of the calibration device scanned by the area array scanner;
(c) processing point cloud data of the calibration device scanned by the area array scanner in the n + m groups of target data to obtain a spherical center coordinate and a radius of each calibration ball in each group of target data in the area array scanner coordinate system, and numbering each calibration ball in the calibration device according to the spherical center coordinate and the radius to obtain a number of each calibration ball;
(d) for the n + m groups of target data, constructing a relational expression (I) of the pose of the three-dimensional ball cage target in an external tracking device, the center coordinates of a calibration ball under the area array scanner coordinate system and a conversion matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, and calculating according to the relational expression (I) by using the n + m groups of target data to obtain a conversion matrix E between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system;
(e) for the m groups of target data, establishing a relational expression (II) of poses of the three-dimensional ball cage target coordinate system in an external tracking device coordinate system, a transformation matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, a transformation matrix between the calibration device coordinate system and the external tracking device coordinate system and a transformation matrix between the area array scanner coordinate system and the calibration device coordinate system, and calculating to obtain a plurality of transformation matrices E' between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system by using the m groups of target data according to the relational expression (II);
(f) comparing the plurality of conversion matrices E' obtained in step (e) with the conversion matrix E obtained in step (d) to determine the finally required conversion matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system.
Further preferably, in step (c), the point cloud data of the calibration device is processed according to the following steps:
(c1) automatically segmenting the point cloud data and identifying the balls to obtain a plurality of spherical-cap point sets;
(c2) performing a secondary sphere fit on each spherical-cap point set to obtain the calibration ball together with its radius and center coordinates, thereby obtaining the radius and center coordinates of every calibration ball in the point cloud data.
Further preferably, in step (d), relation (one) takes the following expression:

G1⁻¹·G2·E = E·X1·X2⁺

wherein X1 is the center coordinate of calibration ball i in the area array scanner coordinate system at area array scanner pose j, X2 is the center coordinate of calibration ball i at area array scanner pose h, h ≠ j, G1 is the pose of the three-dimensional ball cage target in the external tracking device at area array scanner pose j, G2 is the pose of the three-dimensional ball cage target in the external tracking device at area array scanner pose h, and E is the transformation matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system.
Further preferably, in step (e), relation (two) takes the following expression:

G·E = Z·B

wherein G is the pose of the three-dimensional ball cage target in the external tracking device, Z is the conversion matrix between the calibration device coordinate system and the external tracking device coordinate system, and B is the conversion matrix between the area array scanner coordinate system and the calibration device coordinate system.
Further preferably, in step (e), the coordinate system of the calibration device is established as follows: any three calibration balls that are not collinear are selected from the calibration device; the plane formed by their three centers is the XOY plane; the normal vector of the XOY plane gives the Z-axis direction; and the X direction is the line connecting the centers of two of the calibration balls. The three-dimensional ball cage target, the external tracking device and the area array scanner each use their own built-in coordinate systems, which are known quantities.
Further preferably, in step (f), the plurality of conversion matrices E' are compared with the conversion matrix E as follows: the difference between each conversion matrix E' and the conversion matrix E is calculated; the conversion matrices E' whose difference exceeds a preset threshold are discarded and those below the threshold are retained; the retained conversion matrices E' are summed together with the conversion matrix E and averaged, and the resulting average is used as the final conversion matrix.
In general, compared with the prior art, the above technical solution contemplated by the present invention can achieve the following beneficial effects:
1. The invention adopts a self-designed standard matte ceramic ball group calibration device, so that a single measurement at one measurement pose yields multiple groups of data, which improves data acquisition efficiency; during acquisition the data can be distributed over as much of the area array scanner's single-shot measurement volume as possible, which allows the measurement accuracy of the area array scanner at different measurement depths of field to be better evaluated;
2. The method automatically segments the point cloud data, identifies the balls and fits them with high precision, eliminating manual point cloud processing; this keeps data processing efficient, avoids the different solutions that can arise when the same data is processed by different people or at different times, and makes the calibration solution stable and consistent;
3. By exploiting the robot's repeat positioning accuracy to keep the posture of the area array scanner unchanged in the external tracking coordinate system, the method decouples the E matrix to be calibrated; the result of this decoupled solution is stable and, although its accuracy is not high, it can serve as an initial value for the calibration of the E matrix.
Drawings
FIG. 1 is a flow chart of a method for calibrating a vision measurement system constructed in accordance with a preferred embodiment of the present invention;
FIG. 2 is a schematic view of a vision measurement system constructed in accordance with a preferred embodiment of the present invention;
FIG. 3 is a schematic diagram of a calibration arrangement for a vision measurement system constructed in accordance with a preferred embodiment of the present invention;
FIG. 4 shows point cloud data of the calibration device from the n groups of target data acquired by a vision measurement system constructed in accordance with a preferred embodiment of the present invention;
FIG. 5 shows point cloud data of the calibration device from the m groups of target data acquired by a vision measurement system constructed in accordance with a preferred embodiment of the present invention;
FIG. 6 is a diagram illustrating the effect of stitching calibration device point cloud data after calibration is completed in a vision measurement system constructed in accordance with a preferred embodiment of the present invention;
fig. 7 is a dimension chain transformation diagram of the vision measurement system relation (two) constructed in accordance with a preferred embodiment of the present invention.
The same reference numbers will be used throughout the drawings to refer to the same or like elements or structures, wherein:
the method comprises the following steps of 1-an external tracking device, 2-a three-dimensional ball cage target, 3-an area array scanner, 4-a six-degree-of-freedom industrial robot and 5-a calibration device.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 2, a robot vision measuring system based on external tracking is characterized in that the vision measuring system comprises an external tracking device 1, a three-dimensional ball cage target 2, an area array scanner 3, an industrial robot 4 and a calibration device 5, wherein:
The area array scanner 3 is mounted at the end of the industrial robot, the three-dimensional ball cage target 2 is mounted on the area array scanner, the external tracking device 1 is arranged facing the industrial robot 4 and is used for acquiring pose data of the three-dimensional ball cage target 2, the calibration device 5 is arranged between the external tracking device 1 and the industrial robot 4, the area array scanner 3 is used for scanning and acquiring point cloud data of the calibration device 5, and the industrial robot 4 is used for adjusting the position and posture of the area array scanner 3. A pose, as described herein, includes both a position and a posture.
As shown in fig. 3, the calibration device includes a black substrate and a plurality of white matte ceramic calibration balls arranged on the substrate; each calibration ball is uniquely identified by its three-dimensional center coordinates and radius.
As shown in fig. 1, a calibration method for a three-dimensional measurement system of a robot based on external tracking is characterized by comprising the following steps:
(a) setting the pose of the area array scanner, obtaining a group of target data through the external tracking device and the area array scanner under the set pose, keeping the pose of the area array scanner unchanged, changing the position of the area array scanner for n-1 times, and obtaining n-1 groups of target data under different positions through the external tracking device and the area array scanner so as to obtain n groups of target data under the set pose;
(b) changing the pose of the area array scanner m times, as shown in fig. 5, obtaining m sets of target data through the external tracking device and the area array scanner, so as to obtain n + m sets of target data, wherein each set of target data comprises the pose of the three-dimensional ball cage target obtained by the external tracking device and the point cloud data of the calibration device scanned by the area array scanner;
(c) as shown in fig. 6, processing the point cloud data of the calibration device scanned by the area array scanner in the n + m sets of target data to obtain a spherical center coordinate and a radius of each calibration ball in each set of target data in the area array scanner coordinate system, and numbering each calibration ball in the calibration device according to the spherical center coordinate and the radius to obtain a number of each calibration ball; and in the point cloud data in different groups of target data, the same calibration ball has the same number.
The point cloud data of the calibration device is processed according to the following steps:
(c1) automatically segmenting the point cloud data and identifying the balls to obtain a plurality of spherical-cap point sets;
(c2) performing a secondary sphere fit on each spherical-cap point set to obtain the calibration ball together with its radius and center coordinates, thereby obtaining the radius and center coordinates of every calibration ball in the point cloud data; a sketch of one possible fit is given below.
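As one way to realize the sphere fitting of step (c2), the sketch below (Python/NumPy) fits a sphere to a spherical-cap point set by linear least squares and then refits after rejecting points with large radial residuals. This is an illustration only: the rejection rule, the factor and the function names are assumptions, not the patent's exact "secondary sphere fitting" procedure.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit; returns (center, radius)."""
    P = np.asarray(points, dtype=float)
    # ||p - c||^2 = r^2  ->  2 p.c + (r^2 - ||c||^2) = ||p||^2, linear in (c, d).
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = float(np.sqrt(x[3] + center @ center))
    return center, radius

def fit_sphere_twice(points, reject_factor=3.0):
    """First fit, reject points with large radial residuals, then refit (the "secondary" fit)."""
    P = np.asarray(points, dtype=float)
    center, radius = fit_sphere(P)
    residuals = np.abs(np.linalg.norm(P - center, axis=1) - radius)
    keep = residuals < reject_factor * (residuals.std() + 1e-12)
    return fit_sphere(P[keep])
```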
(d) For the n + m groups of target data, constructing a relational expression (I) of the pose of the three-dimensional ball cage target in an external tracking device, the center coordinates of a calibration ball under the area array scanner coordinate system and a conversion matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, and calculating according to the relational expression (I) by using the n + m groups of target data to obtain a conversion matrix E between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system;
A coordinate transformation formula is established according to the closed dimension chain:

Xw = G·E·Xs

wherein Xs is the center coordinate of a calibration ball in the area array scanner coordinate system at the current scanning pose; E = [Re, te; 0, 1] is the transformation matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, i.e. the quantity to be solved; G = [R0, t0; 0, 1] is the pose of the three-dimensional ball cage target in the external tracking device at the current scanning pose; and Xw is the corresponding ball center coordinate in the external tracking device coordinate system, which is a constant. R0 is the attitude of the three-dimensional ball cage target coordinate system in the external tracking device coordinate system and t0 is its position; Re is the attitude of the area array scanner coordinate system in the three-dimensional ball cage target coordinate system and te is its position.
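To make the closed dimension chain concrete, here is a minimal sketch (with made-up numbers) of composing Xw = G·E·Xs from 4 × 4 homogeneous matrices; only the structure, not the values, reflects the method.

```python
import numpy as np

def homogeneous(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, dtype=float)
    T[:3, 3] = np.asarray(t, dtype=float)
    return T

# Illustrative numbers only (identity attitudes, arbitrary offsets, units of mm).
G = homogeneous(np.eye(3), [1000.0, 0.0, 500.0])   # target pose in the tracker: (R0, t0)
E = homogeneous(np.eye(3), [0.0, 50.0, 120.0])     # scanner frame in the target frame: (Re, te)
Xs = np.array([10.0, 20.0, 300.0, 1.0])            # ball center in the scanner frame (homogeneous)
Xw = G @ E @ Xs                                    # the same ball center in the tracker frame
```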
Taking any two of the acquired groups of target data, the following equations are obtained:
Xw = G1·E·Xs1
Xw = G2·E·Xs2

Since the left-hand sides are identical, equating the right-hand sides and simplifying yields relation (one), which preferably takes the following expression:

G1⁻¹·G2·E = E·X1·X2⁺

wherein X1 is the center coordinate of calibration ball i in the area array scanner coordinate system at area array scanner pose j, X2 is the center coordinate of calibration ball i at area array scanner pose h, h ≠ j; h and j may both be taken from the n groups of target data, both from the m groups, or one from each; G1 is the pose of the three-dimensional ball cage target in the external tracking device at area array scanner pose j, G2 is the pose of the three-dimensional ball cage target in the external tracking device at area array scanner pose h, and E is the transformation matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, with

G1 = [R01, t01; 0, 1],  G2 = [R02, t02; 0, 1]

wherein R01 is the attitude of the three-dimensional ball cage target in the external tracking device at area array scanner pose j and t01 is its position, R02 is the attitude of the three-dimensional ball cage target in the external tracking device at area array scanner pose h and t02 is its position, Re is the attitude of the area array scanner coordinate system in the three-dimensional ball cage target coordinate system and te is its position, Xs1 is the center coordinate of calibration ball i at area array scanner pose j, and Xs2 is the center coordinate of calibration ball i at area array scanner pose h.
The above relation (one) can be solved as a mathematical model of the form AX = XB. If h and j are both taken from the n groups of data, relation (one) can be used to solve for Re; if h and j are taken from the m groups of data, relation (one) can be used to solve for te.
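To illustrate how the constant-posture (n-group) data decouple the rotation Re: with the scanner posture fixed, each ball center is stationary in the tracker frame, so from Xw = G·E·Xs it follows that Re·(Xs1 − Xs2) = R0^T·(t02 − t01) for every ball and every pair of positions. A minimal sketch of recovering Re from such vector pairs with the SVD-based orthogonal Procrustes method follows; the data layout and function name are illustrative assumptions, not the patent's exact solver.

```python
import numpy as np

def solve_Re(scanner_diffs, tracker_diffs, R0):
    """Orthogonal Procrustes estimate of Re from the constant-posture (n-group) data.

    scanner_diffs: (k, 3) ball-center differences Xs1 - Xs2 in the scanner frame
    tracker_diffs: (k, 3) corresponding tracker translation differences t02 - t01
    R0:            3x3 attitude of the ball cage target in the tracker (constant here)
    Returns the rotation Re with Re @ a ~= R0.T @ b for each pair (a, b)."""
    A = np.asarray(scanner_diffs, dtype=float)
    B = np.asarray(tracker_diffs, dtype=float) @ R0   # row-wise R0.T @ b
    H = A.T @ B                                       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T
```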
(e) For the m groups of target data, relation (II) is constructed, linking the pose of the three-dimensional ball cage target coordinate system in the external tracking device coordinate system, the transformation matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, the transformation matrix between the calibration device coordinate system and the external tracking device coordinate system, and the transformation matrix between the area array scanner coordinate system and the calibration device coordinate system, as shown in fig. 7. In step (e), the coordinate system of the calibration device is established as follows: any three calibration balls that are not collinear are selected from the calibration device; the plane formed by their three centers is the XOY plane; the normal vector of the XOY plane gives the Z-axis direction; the X-axis direction is the line connecting the centers of two of the calibration balls; and the Y axis follows from the right-hand rule. The three-dimensional ball cage target, the external tracking device and the area array scanner each use their own built-in coordinate systems, which are known quantities. Using the m groups of target data and calculating according to relation (II), a plurality of transformation matrices E' between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system are obtained;
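A minimal sketch of constructing the calibration device coordinate system just described from three non-collinear ball centers (XOY through the three centers, Z along the plane normal, X along one center-to-center line, Y by the right-hand rule). The choice of origin at the first center and the function name are assumptions; the passage above does not fix the origin.

```python
import numpy as np

def calibration_device_frame(c1, c2, c3):
    """Right-handed frame from three non-collinear calibration ball centers.
    X along c1->c2, Z along the normal of plane (c1, c2, c3), Y = Z x X.
    Returns a 4x4 homogeneous transform with the origin placed at c1."""
    c1, c2, c3 = (np.asarray(c, dtype=float) for c in (c1, c2, c3))
    x = (c2 - c1) / np.linalg.norm(c2 - c1)
    z = np.cross(c2 - c1, c3 - c1)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                     # right-hand rule
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, c1
    return T
```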
by closing the dimension chain as shown in FIG. 4, a coordinate transformation equation is established
G·E·Xs = Z·B·Xs

Since this holds for every calibration ball center Xs, it simplifies to relation (II), expressed as:

G·E = Z·B

wherein G is the pose of the three-dimensional ball cage target in the external tracking device, Z is the transformation matrix between the calibration device coordinate system and the external tracking device coordinate system and is an unknown constant, and B is the transformation matrix between the area array scanner coordinate system and the calibration device coordinate system, with

G = [R0, t0; 0, 1],  E = [Re, te; 0, 1],  Z = [RZ, tZ; 0, 1],  B = [RB, tB; 0, 1]

Expanding relation (II) gives

R0·Re = RZ·RB
R0·te + t0 = RZ·tB + tZ
wherein R0 is the attitude of the three-dimensional ball cage target coordinate system in the external tracking coordinate system and t0 is its position; RB is the attitude of the area array scanner coordinate system in the calibration device coordinate system and tB is its position; RZ is the attitude of the calibration device coordinate system in the tracker coordinate system and tZ is its position.
According to the transformation relation between an orthogonal unit matrix and a unit quaternion, R0 can be equivalently transformed into the unit quaternion q0 = [a0, a^T]^T, where a0 is the real part and a is the imaginary part; that is, R0·Re = RZ·RB is equivalent to q0*qe = qZ*qB.
For the t-th measurement pose,

q0t*qe = Q(q0t)·qe,  qZ*qBt = W(qBt)·qZ

wherein q0t is the quaternion representation of the attitude of the three-dimensional ball cage target coordinate system in the external tracking coordinate system when the area array scanner is at pose t, and qBt is the quaternion representation of the attitude of the area array scanner coordinate system in the calibration device coordinate system when the area array scanner is at pose t.
From R0·Re = RZ·RB we obtain Q(q0t)·qe − W(qBt)·qZ = 0,

wherein Q(q) and W(q) are respectively the left- and right-multiplication matrices of the quaternion q = [a0, a^T]^T:

Q(q) = [a0, −a^T; a, a0·I3 + [a]×],  W(q) = [a0, −a^T; a, a0·I3 − [a]×]

where [a]× is the skew-symmetric (cross-product) matrix of a and I3 is the 3 × 3 identity matrix.
For any unit quaternion q, Q(q)^T·Q(q) = q^T·q·I = I and W(q)^T·W(q) = q^T·q·I = I, where I is the 4 × 4 identity matrix, so

||Q(q0t)·qe − W(qBt)·qZ||^2 = (Q(q0t)·qe − W(qBt)·qZ)^T·(Q(q0t)·qe − W(qBt)·qZ) = v^T·St·v
Wherein v isT=[qe T,qZ T],
Figure BDA0002107342010000111
Ct=-Q(q0t)TW(qBt)T
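To make the quaternion notation above concrete, the following sketch builds the left- and right-multiplication matrices Q(q) and W(q) in the [real, imaginary] ordering used here and checks numerically that q1*q2 = Q(q1)·q2 = W(q2)·q1 and Q(q)^T·Q(q) = I; it is an illustrative aid, not code from the patent.

```python
import numpy as np

def skew(a):
    """Cross-product (skew-symmetric) matrix of a 3-vector a."""
    ax, ay, az = a
    return np.array([[0.0, -az, ay], [az, 0.0, -ax], [-ay, ax, 0.0]])

def Q(q):
    """Left multiplication matrix: q * p == Q(q) @ p for q = [a0, a]."""
    a0, a = q[0], np.asarray(q[1:], dtype=float)
    M = np.zeros((4, 4))
    M[0, 0], M[0, 1:] = a0, -a
    M[1:, 0], M[1:, 1:] = a, a0 * np.eye(3) + skew(a)
    return M

def W(q):
    """Right multiplication matrix: p * q == W(q) @ p for q = [a0, a]."""
    a0, a = q[0], np.asarray(q[1:], dtype=float)
    M = np.zeros((4, 4))
    M[0, 0], M[0, 1:] = a0, -a
    M[1:, 0], M[1:, 1:] = a, a0 * np.eye(3) - skew(a)
    return M

def qmul(q1, q2):
    """Hamilton product of quaternions in [real, imaginary] order."""
    a0, a = q1[0], np.asarray(q1[1:], dtype=float)
    b0, b = q2[0], np.asarray(q2[1:], dtype=float)
    return np.concatenate([[a0 * b0 - a @ b], a0 * b + b0 * a + np.cross(a, b)])

q1 = np.array([0.9, 0.1, -0.2, 0.3]); q1 /= np.linalg.norm(q1)
q2 = np.array([0.5, 0.4, 0.2, -0.1]); q2 /= np.linalg.norm(q2)
assert np.allclose(qmul(q1, q2), Q(q1) @ q2)
assert np.allclose(qmul(q1, q2), W(q2) @ q1)
assert np.allclose(Q(q1).T @ Q(q1), np.eye(4))   # property used when forming v.T @ St @ v
```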
Then the function to be optimized over all measurement poses is

f(qe, qZ) = Σt v^T·St·v = v^T·S·v

wherein

S = Σt St
Based on the quadratic constraints and Lagrange multipliers, the optimization function can be constructed as

L(qe, qZ, λ1, λ2) = v^T·S·v + λ1·(qe^T·qe − 1) + λ2·(qZ^T·qZ − 1)

wherein λ1 and λ2 are the Lagrange multipliers.
Finally, exploiting the closed structure of this mathematical model, the optimization function is solved analytically to obtain qe and qZ, i.e. Re and RZ. With Re known, te is then obtained from the m groups of data on the same principle as relation (one).
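The patent obtains qe and qZ analytically from the Lagrangian; that derivation is not reproduced here. As a hedged alternative, the sketch below solves the same constrained problem min v^T·S·v with ||qe|| = ||qZ|| = 1: since S = [m·I, C; C^T, m·I] with C = Σt Ct, the objective equals 2m + 2·qe^T·C·qZ, and the minimum is attained at the singular vectors of C belonging to its largest singular value, with opposite signs. Q() and W() are the multiplication matrices from the sketch above; the function name is an assumption.

```python
import numpy as np

def solve_rotation_quaternions(q0_list, qB_list):
    """Minimize sum_t ||Q(q0t) @ qe - W(qBt) @ qZ||^2 over unit quaternions qe, qZ.

    q0_list: quaternions of the target attitude in the tracker, one per measurement pose
    qB_list: quaternions of the scanner attitude in the calibration device frame
    Returns (qe, qZ); Re and RZ follow from any quaternion-to-rotation conversion."""
    C = sum(-Q(q0).T @ W(qB) for q0, qB in zip(q0_list, qB_list))   # C = sum_t Ct
    U, s, Vt = np.linalg.svd(C)
    qe = -U[:, 0]      # signs chosen so that qe.T @ C @ qZ = -s[0], the minimum cross term
    qZ = Vt[0]
    return qe, qZ
```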
In summary, the method converts the orthogonal unit matrices into unit quaternions, vectorizes the variables to be solved, constructs an optimization function, solves the objective function based on the quadratic Lagrange multipliers and, by exploiting the closed structure of the equations, obtains an analytic solution of the function to be optimized by mathematical analysis.
(f) The plurality of conversion matrices E' obtained in step (e) are compared with the conversion matrix E obtained in step (d) to determine the finally required conversion matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system. Specifically, the comparison is preferably performed as follows: the difference between each conversion matrix E' and the conversion matrix E is calculated; the conversion matrices E' whose difference exceeds a preset threshold are discarded and those below the threshold are retained; the retained conversion matrices E' are summed together with the conversion matrix E and averaged, and the resulting average is used as the final conversion matrix.
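A sketch of the comparison described in step (f): discard the candidates E' whose deviation from E exceeds a preset threshold and average the remainder together with E, as stated above. The Frobenius-norm criterion, the threshold value and the optional re-orthonormalization of the rotation block are assumptions added for illustration, not part of the patent text.

```python
import numpy as np

def fuse_transforms(E, E_candidates, threshold=1.0):
    """Keep the E' candidates close to E and average them together with E."""
    kept = [Ep for Ep in E_candidates if np.linalg.norm(Ep - E) < threshold]
    fused = np.mean([E, *kept], axis=0)
    # Optional (an assumption, not in the text): project the rotation block back onto SO(3).
    U, _, Vt = np.linalg.svd(fused[:3, :3])
    fused[:3, :3] = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return fused
```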
Further preferably, in step (c1), to guarantee point cloud processing speed, the point cloud data is down-sampled at equal intervals; the total measurable area of a single scan can be estimated from the number of balls and the diameter of each ball in the standard matte ceramic ball group calibration device, giving an estimate of the number of measurement points;
When the collected point cloud data contains large-area background noise, a random sample consensus (RANSAC) algorithm with a plane model as the reference is used to extract and remove the main background plane, yielding standard matte ceramic ball group measurement data containing only a small amount of background noise.
The data with the main background noise removed is then clustered based on Euclidean distance to obtain small, region-connected point cloud patches, and the spherical point cloud data is segmented and identified using a RANSAC algorithm with a sphere model as the reference.
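A sketch of the preprocessing pipeline described in the three paragraphs above, using the open-source Open3D library as one possible implementation (an assumption; the patent names no library): equal-interval down-sampling, plane-model RANSAC removal of the background, Euclidean (DBSCAN) clustering into connected patches, and per-cluster sphere fitting with fit_sphere_twice from the earlier sketch. The file name and all parameter values are placeholders.

```python
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan_of_calibration_device.ply")   # placeholder path
pcd = pcd.uniform_down_sample(every_k_points=4)                   # equal-interval down-sampling

# Remove the dominant background plane with a plane-model RANSAC.
_, plane_idx = pcd.segment_plane(distance_threshold=0.5, ransac_n=3, num_iterations=1000)
balls_only = pcd.select_by_index(plane_idx, invert=True)

# Euclidean clustering into region-connected patches (ideally one patch per visible ball).
labels = np.asarray(balls_only.cluster_dbscan(eps=2.0, min_points=80))
points = np.asarray(balls_only.points)

spheres = []
for lbl in range(labels.max() + 1):
    cluster = points[labels == lbl]
    center, radius = fit_sphere_twice(cluster)   # from the sphere-fitting sketch above
    spheres.append((center, radius))
```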
Further preferably, when the fitted balls are sorted from small to large by radius, because the standard matte ceramic ball group calibration device contains ceramic balls with the same theoretical radius, the sorted balls need to be clustered into segments, and the balls within each segment are sorted again from near to far according to the preset distance. This ensures that the balls in the standard matte ceramic ball group calibration device receive the same numbers when observed from different poses.
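A sketch of the numbering rule just described: sort the fitted balls by radius, cluster balls of (nearly) equal radius into segments, and re-sort each segment by distance to a reference point so that the same physical ball receives the same number from every pose. The radius tolerance and the use of the centroid of the detected centers as the reference are assumptions; the passage above only says "according to the preset distance".

```python
import numpy as np

def number_balls(centers, radii, radius_tol=0.5):
    """Pose-independent numbering: group by radius, then sort each group by distance
    to the centroid of all detected centers (assumed reference)."""
    centers = np.asarray(centers, dtype=float)
    radii = np.asarray(radii, dtype=float)
    reference = centers.mean(axis=0)
    order = np.argsort(radii)                       # small to large by radius
    numbered, group, last_r = [], [], None
    for i in order:
        if last_r is not None and radii[i] - last_r > radius_tol:   # new radius segment
            numbered += sorted(group, key=lambda j: np.linalg.norm(centers[j] - reference))
            group = []
        group.append(i)
        last_r = radii[i]
    numbered += sorted(group, key=lambda j: np.linalg.norm(centers[j] - reference))
    return {ball: num for num, ball in enumerate(numbered)}   # original index -> number
```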
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (8)

1. A calibration method of a robot vision measurement system based on external tracking is characterized by comprising the following steps:
(a) setting the pose of an area array scanner, obtaining a group of target data through an external tracking device and the area array scanner under the set pose, keeping the pose of the area array scanner unchanged, changing the position of the area array scanner for n-1 times, and obtaining n-1 groups of target data under different positions through the external tracking device and the area array scanner so as to obtain n groups of target data under the set pose;
(b) changing the pose of the area array scanner m times, and obtaining m groups of target data through the external tracking device and the area array scanner so as to obtain n + m groups of target data, wherein each group of target data comprises the pose of the three-dimensional ball cage target obtained by the external tracking device and point cloud data of a calibration device scanned by the area array scanner;
(c) processing point cloud data of the calibration device scanned by the area array scanner in the n + m groups of target data to obtain a spherical center coordinate and a radius of each calibration ball in each group of target data in the area array scanner coordinate system, and numbering each calibration ball in the calibration device according to the spherical center coordinate and the radius to obtain a number of each calibration ball;
(d) for the n + m groups of target data, constructing a relational expression (I) of the pose of the three-dimensional ball cage target in an external tracking device, the center coordinates of a calibration ball under the area array scanner coordinate system and a conversion matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, and calculating according to the relational expression (I) by using the n + m groups of target data to obtain a conversion matrix E between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system;
(e) for the m groups of target data, establishing a relational expression (II) of poses of the three-dimensional ball cage target coordinate system in an external tracking device coordinate system, a transformation matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system, a transformation matrix between the calibration device coordinate system and the external tracking device coordinate system and a transformation matrix between the area array scanner coordinate system and the calibration device coordinate system, and calculating to obtain a plurality of transformation matrices E' between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system by using the m groups of target data according to the relational expression (II);
(f) comparing the plurality of conversion matrices E' obtained in step (e) with the conversion matrix E obtained in step (d) to determine a finally required conversion matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system.
2. The calibration method according to claim 1, wherein in step (c), the processing of the point cloud data of the calibration device is performed according to the following steps:
(c1) automatically segmenting and identifying the ball for the point cloud data to obtain a plurality of cambered surface point sets;
(c2) and performing secondary sphere fitting on each cambered surface type point set to obtain a calibration sphere and the radius and the center coordinates of the calibration sphere, and thus obtaining the radius and the center coordinates of each calibration sphere in the point cloud data.
3. A calibration method according to claim 1 or 2, wherein in step (d), the relation (one) is performed according to the following expression:
G1⁻¹·G2·E = E·X1·X2⁺
wherein X1 is the center coordinate of calibration ball i at area array scanner pose j, X2 is the center coordinate of calibration ball i at area array scanner pose h, h ≠ j, G1 is the pose of the three-dimensional ball cage target in the external tracking device at area array scanner pose j, G2 is the pose of the three-dimensional ball cage target in the external tracking device at area array scanner pose h, and E is the transformation matrix between the area array scanner coordinate system and the three-dimensional ball cage target coordinate system.
4. A calibration method according to claim 1, wherein in step (e), the relation (two) is performed according to the following expression:
G·E=Z·B
g is the pose of the three-dimensional ball cage target in the external tracking device, Z is a conversion matrix between a calibration device coordinate system and an external tracking device coordinate system, and B is a conversion matrix between the area array scanner coordinate system and the calibration device coordinate system.
5. A calibration method according to claim 3, wherein in step (e), the coordinate system of the calibration device is established by selecting any three calibration balls which are not in a straight line in the calibration device, the plane formed by the centers of the three calibration balls is an XOY plane, the direction of the normal vector of the XOY plane is set as the Z-axis direction, the X-direction is the line connecting the centers of two of the calibration balls, and the coordinate systems of the three-dimensional ball cage target coordinate system, the external tracking device and the area array scanner are the respective coordinate systems of their own and are known quantities.
6. The calibration method according to claim 1, wherein in step (f), the comparing of the plurality of transformation matrices E ' with the transformation matrices E is performed in the following manner, the difference between the transformation matrices E ' and the transformation matrices E is calculated, the transformation matrices E ' larger than a preset threshold are discarded, the transformation matrices E ' smaller than the preset threshold are retained, the retained transformation matrices E ' are summed with the transformation matrices E and then averaged, and the obtained average value is used as the final transformation matrix.
7. An external tracking based robot vision measuring system corresponding to the calibration method of any one of claims 1-6, characterized in that the vision measuring system comprises an external tracking device (1), a three-dimensional ball cage target (2), an area array scanner (3), an industrial robot (4) and a calibration device (5), wherein:
the area array scanner (3) is arranged at the end of the industrial robot, the three-dimensional ball cage target (2) is arranged on the area array scanner, the external tracking device (1) and the industrial robot (4) are arranged facing each other, the calibration device (5) is arranged between the external tracking device (1) and the industrial robot (4), the area array scanner (3) is used for scanning and acquiring point cloud data of the calibration device, and the industrial robot (4) is used for adjusting the position and posture of the area array scanner (3).
8. The external tracking based robot vision measuring system of claim 7, wherein the calibration device comprises a black substrate and a plurality of white matte ceramic calibration balls disposed on the substrate, each calibration ball being uniquely identified by its three-dimensional center coordinates and radius.
CN201910557739.5A 2019-06-26 2019-06-26 Robot vision measurement system based on external tracking and calibration method thereof Active CN110260786B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910557739.5A CN110260786B (en) 2019-06-26 2019-06-26 Robot vision measurement system based on external tracking and calibration method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910557739.5A CN110260786B (en) 2019-06-26 2019-06-26 Robot vision measurement system based on external tracking and calibration method thereof

Publications (2)

Publication Number Publication Date
CN110260786A CN110260786A (en) 2019-09-20
CN110260786B true CN110260786B (en) 2020-07-10

Family

ID=67921516

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910557739.5A Active CN110260786B (en) 2019-06-26 2019-06-26 Robot vision measurement system based on external tracking and calibration method thereof

Country Status (1)

Country Link
CN (1) CN110260786B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111486867B (en) * 2020-03-19 2023-08-11 天津大学 Calibration device and method for installation parameters of vision and inertia mixed tracking assembly
CN111489399B (en) * 2020-03-19 2023-04-14 天津大学 Device and method for calibrating installation parameters of visual tracking assembly
CN111546328B (en) * 2020-04-02 2022-06-24 天津大学 Hand-eye calibration method based on three-dimensional vision measurement
CN111561868A (en) * 2020-05-21 2020-08-21 郑州辰维科技股份有限公司 Method for realizing non-contact measurement of antenna profile by utilizing optical tracking structure optical scanner
CN112146571B (en) * 2020-09-25 2022-06-14 浙江汉振智能技术有限公司 Non-contact three-dimensional measurement system for large-scale component and data splicing method
CN112504187B (en) * 2020-11-13 2022-02-11 复旦大学 Autonomous navigation system and method applied to mobile measurement
CN112880557B (en) * 2021-01-08 2022-12-09 武汉中观自动化科技有限公司 Multi-mode tracker system
CN112964196B (en) * 2021-02-05 2023-01-03 杭州思锐迪科技有限公司 Three-dimensional scanning method, system, electronic device and computer equipment
CN113295142B (en) * 2021-05-14 2023-02-21 上海大学 Terrain scanning analysis method and device based on FARO scanner and point cloud
CN114406985B (en) * 2021-10-18 2024-04-12 苏州迪凯尔医疗科技有限公司 Mechanical arm method, system, equipment and storage medium for target tracking

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
CN108180834A (en) * 2018-02-05 2018-06-19 中铁二十二局集团有限公司 A kind of industrial robot is the same as three-dimensional imaging instrument position orientation relation scene real-time calibration method
CN109079774A (en) * 2018-05-04 2018-12-25 南京航空航天大学 A kind of isotropism visual sensing three-dimensional spherical target and scaling method
CN109579766A (en) * 2018-12-24 2019-04-05 苏州瀚华智造智能技术有限公司 A kind of product shape automatic testing method and system
CN109636837A (en) * 2018-12-21 2019-04-16 浙江大学 A kind of evaluation method of monocular camera and ginseng calibration accuracy outside millimetre-wave radar

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9964398B2 (en) * 2015-05-06 2018-05-08 Faro Technologies, Inc. Three-dimensional measuring device removably coupled to robotic arm on motorized mobile platform
CN108994827A (en) * 2018-05-04 2018-12-14 武汉理工大学 A kind of robot measurement-system of processing scanner coordinate system automatic calibration method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
CN108180834A (en) * 2018-02-05 2018-06-19 中铁二十二局集团有限公司 A kind of industrial robot is the same as three-dimensional imaging instrument position orientation relation scene real-time calibration method
CN109079774A (en) * 2018-05-04 2018-12-25 南京航空航天大学 A kind of isotropism visual sensing three-dimensional spherical target and scaling method
CN109636837A (en) * 2018-12-21 2019-04-16 浙江大学 A kind of evaluation method of monocular camera and ginseng calibration accuracy outside millimetre-wave radar
CN109579766A (en) * 2018-12-24 2019-04-05 苏州瀚华智造智能技术有限公司 A kind of product shape automatic testing method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on 3D machining target detection and motion planning for a ceramic green-body grinding and polishing robot; Diao Shipu; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2019-02-15 (No. 2); pp. 31-35 *

Also Published As

Publication number Publication date
CN110260786A (en) 2019-09-20

Similar Documents

Publication Publication Date Title
CN110260786B (en) Robot vision measurement system based on external tracking and calibration method thereof
CN111080627B (en) 2D +3D large airplane appearance defect detection and analysis method based on deep learning
CN106651752B (en) Three-dimensional point cloud data registration method and splicing method
CN108555908B (en) Stacked workpiece posture recognition and pickup method based on RGBD camera
CN109360240B (en) Small unmanned aerial vehicle positioning method based on binocular vision
CN113436260B (en) Mobile robot pose estimation method and system based on multi-sensor tight coupling
CN101876532B (en) Camera on-field calibration method in measuring system
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN111551111B (en) Part feature robot rapid visual positioning method based on standard ball array
Luna et al. Calibration of line-scan cameras
CN110823252B (en) Automatic calibration method for multi-line laser radar and monocular vision
CN111640158B (en) End-to-end camera and laser radar external parameter calibration method based on corresponding mask
CN110910454A (en) Automatic calibration registration method of mobile livestock three-dimensional reconstruction equipment
CN110763204B (en) Planar coding target and pose measurement method thereof
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN112991460B (en) Binocular measurement system, method and device for obtaining size of automobile part
CN107917700A (en) The 3 d pose angle measuring method of target by a small margin based on deep learning
CN112102414A (en) Binocular telecentric lens calibration method based on improved genetic algorithm and neural network
CN109443200A (en) A kind of mapping method and device of overall Vision coordinate system and mechanical arm coordinate system
CN111583342A (en) Target rapid positioning method and device based on binocular vision
Tao et al. A convenient and high-accuracy multicamera calibration method based on imperfect spherical objects
CN111008602A (en) Two-dimensional and three-dimensional visual combined lineation feature extraction method for small-curvature thin-wall part
CN110838146A (en) Homonymy point matching method, system, device and medium for coplanar cross-ratio constraint
CN116909208B (en) Shell processing path optimization method and system based on artificial intelligence
CN107958468B (en) Method for calibrating central catadioptric camera by three balls with different spatial positions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20190920

Assignee: WUHAN POWER3D TECHNOLOGY Ltd.

Assignor: HUAZHONG University OF SCIENCE AND TECHNOLOGY

Contract record no.: X2022420000110

Denomination of invention: A robot vision measurement system based on external tracking and its calibration method

Granted publication date: 20200710

License type: Common License

Record date: 20220930