CN109827521B - Calibration method for rapid multi-line structured optical vision measurement system - Google Patents

Calibration method for rapid multi-line structured optical vision measurement system

Info

Publication number
CN109827521B
Authority
CN
China
Prior art keywords
light
plane
planes
light plane
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910202385.2A
Other languages
Chinese (zh)
Other versions
CN109827521A (en)
Inventor
武栓虎
李爱娟
辛睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Huaxing Zhizao Technology Co.,Ltd.
Original Assignee
Yantai University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yantai University filed Critical Yantai University
Priority to CN201910202385.2A priority Critical patent/CN109827521B/en
Publication of CN109827521A publication Critical patent/CN109827521A/en
Application granted granted Critical
Publication of CN109827521B publication Critical patent/CN109827521B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

A calibration method for a rapid multi-line structured light vision measurement system comprises the following steps: (1) calibrating the head and tail light planes (the first and the last light plane) respectively to obtain their unit normal vectors and distance parameters; (2) calculating the rotation angle and the rotation vector between the two light planes by using the normal vectors of the head and tail light planes obtained in step (1); (3) calculating the included angle between every two adjacent light planes by using the rotation angle calculated in step (2) together with the fact that the multi-line structured light planes are equidistantly parallel and intersect in a common straight line; (4) constructing the rotation matrices between the light planes from the included angles calculated in step (3) and the rotation vector calculated in step (2); (5) calculating the unit normal vector of each intermediate light plane from the rotation matrices obtained in step (4) and the unit normal vector of the first light plane; (6) calculating the common intersection line of the light planes from the head and tail light plane equations obtained in step (1), sampling several points on it, and calculating the distance parameter of each intermediate light plane equation.

Description

Calibration method for a rapid multi-line structured light vision measurement system
Technical Field
The invention relates to the calibration of structured-light-based vision measurement systems in the field of computer vision applications, and in particular to a rapid calibration method for a multi-line structured light vision measurement system whose intersection-line image is equidistantly parallel when the light is projected perpendicularly onto an imaging plane.
Background
Line structured light measurement projects a structured light plane onto a target surface and obtains the three-dimensional coordinates of the points on the intersection line between the plane and the surface to be measured. In practical applications, by adding motion information along a third axis, the intersection-line profiles can be stitched together to recover the three-dimensional surface profile of a complex target. Multi-line structured light is an extension of single-line structured light: it avoids the need for third-axis motion information and obtains the three-dimensional coordinates of the target surface in one shot, with the advantages of high speed, high precision and strong interference resistance, and it has been widely applied in recent years. A conventional multi-line structured light projector consists of a light source and a lens engraved with equidistant parallel lines; when it projects perpendicularly onto a planar target, the intersection-line image is parallel and equidistant (as shown in figure 1). Multi-line structured light projectors currently on the market project from 7 up to 81 lines (generally an odd number), but their calibration still follows the single-line strategy: the feature points on the intersection-line image of every light plane with the target are detected, indexed and estimated one by one to obtain each light plane equation. The steps are complicated, and the achievable calibration precision of the system is limited by the calibration error of every individual light plane.
Disclosure of Invention
To address these problems, and exploiting the equidistant parallel property of common multi-line structured light, the invention provides a calibration method for a rapid multi-line structured light vision measurement system. No matter how many light planes there are, only the head and tail light plane equations (the first and the last light plane equations) need to be calibrated; the parameters of all other light planes can then be calculated. The overall precision depends only on the calibration precision of the head and tail light planes, and the method is rapid, practical and of good value for popularization and application.
The invention is realized by the following technical scheme.
A calibration method of a rapid multi-line structured light vision measurement system comprises the following steps:
(1) calibrating the head and tail light planes (the first and the last light plane) respectively to obtain their unit normal vectors and distance parameters;
(2) calculating the rotation angle and the rotation vector between the two light planes by using the normal vectors of the head and tail light planes obtained in step (1);
(3) calculating the included angle between every two adjacent light planes by using the rotation angle calculated in step (2) together with the fact that the multi-line structured light planes are equidistantly parallel and intersect in a common straight line;
(4) constructing the rotation matrices between the light planes from the included angles calculated in step (3) and the rotation vector calculated in step (2);
(5) calculating the unit normal vector of each intermediate light plane from the rotation matrices obtained in step (4) and the unit normal vector of the first light plane;
(6) calculating the common intersection line of the light planes from the head and tail light plane equations obtained in step (1), sampling several points on it, and calculating the distance parameter of each intermediate light plane equation.
The calibration method of the rapid multi-line structured light vision measurement system is based on a multi-line structured light vision measurement system model; the calibration comprises camera parameter calibration and light plane calibration, and the camera model is as follows:
s·[u, v, 1]^T = K·[x_c, y_c, z_c]^T,   K = [f_x 0 C_x; 0 f_y C_y; 0 0 1]
wherein: s is a constant scale factor, (u, v) are the image coordinates of a point on the intersection line of the target and the corresponding structured light plane, K is the projection matrix, f_x and f_y are the focal length parameters of the camera in the x and y directions, C_x and C_y are the coordinates of the imaging principal point, and [x_c, y_c, z_c]^T are the three-dimensional coordinates of the point in the camera coordinate system;
the multi-line structured light plane model is as follows:
n_{i,x}·x_c + n_{i,y}·y_c + n_{i,z}·z_c = d_i,   i = 1, 2, ..., 2N+1
wherein: 2N+1 is the number of light planes, [n_{i,x}, n_{i,y}, n_{i,z}]^T is the unit normal vector of the i-th light plane, [x_c, y_c, z_c]^T are the three-dimensional coordinates of any point of the light plane in the camera coordinate system, and |d_i| is the perpendicular distance from the camera coordinate origin to the laser plane.
In the calibration method of the rapid multiline structured light vision measurement system, in the step (1), the head and tail single structured light planes are rapidly calibrated based on the camera projection matrix and the homography matrix, and the homography matrix is determined according to all the feature points and the corresponding image pixel coordinates of the target by the following formula:
s·[u, v, 1]^T = H·[x_w, y_w, 1]^T
wherein H is the homography matrix, [u, v]^T are the projection (pixel) coordinates and [x_w, y_w]^T are the corresponding world coordinates on the target plane; from the homography matrix, the world coordinates corresponding to any pixel on the intersection line of the light plane and the planar target can be obtained, and the three-dimensional coordinates of the feature points in the camera coordinate system are then:
[x_c, y_c, z_c]^T = [r_1, r_2, t]·[x_w, y_w, 1]^T, with H = K·[r_1, r_2, t], where r_1 and r_2 are the first two columns of the target's rotation matrix and t is its translation.
according to the calibration method of the rapid multiline structured light vision measurement system, in the calibration operation of the head and tail single structured light planes, each mobile plane target is twice, three-dimensional coordinates corresponding to pixel points on two straight lines are respectively obtained, and a head and tail light plane equation is obtained by utilizing least square fitting:
n_{1,x}·x + n_{1,y}·y + n_{1,z}·z = d_1
n_{2N+1,x}·x + n_{2N+1,y}·y + n_{2N+1,z}·z = d_{2N+1}
wherein: [n_{1,x}, n_{1,y}, n_{1,z}]^T and [n_{2N+1,x}, n_{2N+1,y}, n_{2N+1,z}]^T are the unit normal vectors of the head and tail light planes, and |d_1| and |d_{2N+1}| are the distances from the camera coordinate origin to the head and tail light planes respectively.
In the calibration method of the rapid multi-line structured light vision measurement system, in the step (2), the rotation angle θ between the head and tail light planes is obtained from the inner product of their unit normal vectors by the following formula:
cos θ = n_{1,x}·n_{2N+1,x} + n_{1,y}·n_{2N+1,y} + n_{1,z}·n_{2N+1,z}
the unit rotation vector is obtained by the following formula:
[r_x, r_y, r_z]^T = (n_1 × n_{2N+1}) / ‖n_1 × n_{2N+1}‖, where n_1 and n_{2N+1} are the unit normal vectors of the head and tail light planes;
the rotation matrix R of the rotation angle θ around the rotation vector is:
R = cos θ·I + (1 − cos θ)·r·r^T + sin θ·[r]_×, where I is the 3×3 identity matrix, r = [r_x, r_y, r_z]^T and [r]_× is its skew-symmetric cross-product matrix.
in the calibration method of the rapid multi-line structured light vision measuring system, in the step (3), an included angle between each two adjacent light planes is as follows:
θ_i = arctan((N+1−i)·tan(θ/2)/N) − arctan((N−i)·tan(θ/2)/N),   i = 1, 2, ..., N
and:
θ_{2N+1−i} = θ_i,   i = 1, 2, ..., N.
in the calibration method of the optical vision measuring system with the rapid multi-line structure, in the step (4), the rotation vector and the rotation matrix R between the optical planes are obtained according to the included angle between the adjacent optical planesi,i+1In the step (5), the unit normal vector of each middle light plane is obtained from the unit normal vector of the first light plane based on the following formula:
[n_{i+1,x}, n_{i+1,y}, n_{i+1,z}]^T = R_{i,i+1}·[n_{i,x}, n_{i,y}, n_{i,z}]^T,   i = 1, 2, ..., 2N
in the calibration method of the rapid multi-line structured light vision measurement system, in the step (6), the common intersection line of each light plane is obtained by a simultaneous head and tail light plane equation:
(x − x_0)/r_x = (y − y_0)/r_y = (z − z_0)/r_z
wherein:
[r_x, r_y, r_z]^T is the direction vector of the intersection line (the unit rotation vector described above), and [x_0, y_0, z_0] is any point on the line; by sampling M points [x_j, y_j, z_j], j = 1, 2, ..., M, on the line, the distance parameter d_i of each middle light plane is obtained as:
d_i = (1/M)·Σ_{j=1..M} (n_{i,x}·x_j + n_{i,y}·y_j + n_{i,z}·z_j).
the calibration method of the rapid multi-line structured light vision measurement system is applied to modeling of the four-wheel positioning tire of the non-contact automobile.
The invention has the beneficial effects that:
1. In a vision measurement system constructed by this method, the maximum error is determined by the errors of the first and last light planes; therefore, as long as the parameters of the first and last light planes are calibrated with high accuracy, the accuracy of the whole system is ensured;
2. Compared with the traditional line-by-line calibration algorithm, the method does not need to detect, process and index the light stripe images of the intermediate light planes, which greatly reduces the complexity and precision burden of image processing, and it is rapid, efficient and stable;
3. Experimental results show that the maximum error can be kept below 0.15 mm at a measurement distance of 500 mm, which fully meets the requirements of practical modeling applications;
4. The method has been well verified in applications such as tire modeling for non-contact automobile four-wheel alignment; it is convenient and rapid, has low calibration cost and good prospects for popularization.
Drawings
The aspects and advantages of the present application will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of parallel line structured light vertical projection imaging.
FIG. 2 is a schematic diagram of the operation of the parallel line structured light measurement system.
FIG. 3 is a schematic plane projection of a homography matrix.
FIG. 4 is a schematic diagram of experimental calibration operation.
FIG. 5 is a schematic view of a vertical projection cross section of parallel structured light.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings and the principles on which the invention is based.
Firstly, a multi-line vision measurement system model is introduced:
the line structured light vision measuring system mainly comprises a line laser emitter and a camera, wherein, in order to ensure the measuring precision, the included angle between the camera imaging direction and the light plane projection direction is preferably between 30 and 60. The laser sensor emits a laser plane and projects the laser plane on the surface of an object, and according to the image pixel coordinates of the intersection line of the laser formed on the surface of the object, the three-dimensional coordinates of all points on the intersection line of the target surface and the laser plane can be calculated, and a three-dimensional model is built for the measured target, as shown in fig. 2.
The calibration of the line structured light measurement system consists of two parts: camera parameter calibration and light plane calibration. Single-camera calibration is currently generally performed in advance with Zhang Zhengyou's algorithm; the main parameters are the camera projection matrix and the lens distortion parameters, and the camera projection measurement model is:
s·[u, v, 1]^T = K·[x_c, y_c, z_c]^T,   K = [f_x 0 C_x; 0 f_y C_y; 0 0 1]   (1)
in the above formula, s is a constant proportionality coefficient, (u, v) is an image coordinate of a point on an intersection line of the target and the corresponding structured light plane, K is a projection matrix, also called an internal reference matrix, where fx and fy are focal length parameters of the camera in x and y directions, respectively, and Cx and Cy are imaging principal points (intersection points of the optical axis and the imaging plane). [ x ] ofc,yc,zc]TIs the three-dimensional coordinates of a point in the camera coordinate system.
The multiline structured light plane model can be represented by:
n_{i,x}·x_c + n_{i,y}·y_c + n_{i,z}·z_c = d_i,   i = 1, 2, ..., 2N+1   (2)
where 2N+1 is the number of light planes, [n_{i,x}, n_{i,y}, n_{i,z}]^T is the unit normal vector of the i-th light plane, [x_c, y_c, z_c]^T are the three-dimensional coordinates of any point of the light plane in the camera coordinate system, and |d_i| is the perpendicular distance from the camera coordinate origin to the laser plane. In fact, d_i in the above formula is the projection (inner product) onto the unit normal vector of the corresponding plane of the three-dimensional vector formed by any point of that light plane in the camera coordinate system. Because the included angle between the camera and the laser sensor is fixed, d_i is a constant, called the distance parameter.
First, note that when calculating the three-dimensional coordinates of the intersection line of a light plane with the target, formula (1) contains a constant scale factor s, so a single pixel corresponds to infinitely many candidate three-dimensional points along its viewing ray; constraining these points with the corresponding light plane equation (see formula (2), in which d_i is constant) removes the scale factor and yields the true three-dimensional coordinates.
Therefore, the calibration of the line structured light measurement system is mainly the calibration of the light plane equations, which can be achieved by acquiring a group of at least three non-collinear three-dimensional points on each light plane.
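As a minimal, illustrative sketch of how the light plane constraint removes the scale factor of formula (1) (this is not part of the patent text; the function name and the use of Python/NumPy are assumptions), the viewing ray of a pixel can be intersected with a calibrated light plane as follows:

```python
import numpy as np

def pixel_to_point_on_plane(u, v, K, n, d):
    """Intersect the viewing ray of pixel (u, v) with the light plane n.X = d.

    K : 3x3 camera intrinsic (projection) matrix
    n : unit normal vector of the light plane in the camera frame
    d : distance parameter of the plane (n.X = d for every point X on the plane)
    Returns the 3-D point [x_c, y_c, z_c] in the camera coordinate system.
    """
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))  # back-projected direction K^-1 [u, v, 1]^T
    s = d / float(np.dot(n, ray))                    # scale factor fixed by the plane equation (2)
    return s * ray                                   # true 3-D coordinates of the stripe point
```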
The method for calibrating the light planes of equidistant parallel-line structured light is described as follows:
since the invention relates to calibrating the light plane of the head-to-tail single-line structure, a stable single-line structure light calibration method is introduced first, and then a rapid method for calculating the rest light plane according to the invention is given.
Single structured light plane calibration method
Generally, calibration of a single structured light plane requires acquiring a group of at least three non-collinear three-dimensional feature points. A planar target together with the cross-ratio invariance principle is usually adopted, but the steps are complicated, few feature points are acquired, and the stability depends on the acquisition precision of some of the target feature points. A fast and robust calibration method based on the camera projection matrix and the homography matrix is introduced here.
The homography matrix represents the projective mapping from one plane to another plane; that is, for a set of coplanar feature points, the corresponding imaged feature points are related to them by a homography transformation. Therefore, the three-dimensional coordinates of the imaged points can be recovered from the imaging points through the camera projection matrix and the homography matrix. As shown in fig. 3, points A, B, C, D on the checkerboard correspond to points a, b, c and d on the projected image respectively, and the correspondence can be expressed by the following formula:
s·[u, v, 1]^T = H·[x_w, y_w, 1]^T   (3)
where H is called the homography matrix, [u, v]^T are the projection (pixel) coordinates, and [x_w, y_w]^T are the corresponding world coordinates on the target plane.
Camera imaging model considering planar targets:
s·[u, v, 1]^T = K·[r_1, r_2, t]·[x_w, y_w, 1]^T   (4)
where K is the projection matrix, [r_1, r_2, r_3] and t are the rotation and translation of the target (r_1, r_2, r_3 being the columns of the rotation matrix; only r_1 and r_2 appear because z_w = 0 on the target plane), and [x_w, y_w]^T are the world coordinates of the corresponding points on the target. It is readily seen that H = K·[r_1, r_2, t] is the homography matrix from the planar target to the image plane. During actual calibration, the homography matrix H can be determined at once according to equation (3) from all the feature points on the target and the corresponding image pixel coordinates. After the homography matrix is obtained, the world coordinates corresponding to any pixel coordinates in the image can be calculated according to equation (4) as follows:
[x_w, y_w, 1]^T = s′·H^{-1}·[u, v, 1]^T, with s′ chosen so that the third component equals 1   (5)
the three-dimensional coordinates of the feature points in the corresponding camera coordinate system can be obtained by the following formula (4):
[x_c, y_c, z_c]^T = [r_1, r_2, t]·[x_w, y_w, 1]^T   (6)
the three-dimensional coordinates of any straight line intersecting the light plane can be estimated at one time according to the above algorithm, and in addition, the actual calibration operation of the single structured light plane is shown in fig. 4, the plane target is moved twice, the three-dimensional coordinates corresponding to the pixel points on the two straight lines (straight line L1 and straight line L2) are obtained according to the above algorithm, and the light plane equation can be obtained by using least square fitting.
Equidistant parallel-line structured light calibration method
Equidistant parallel-line structured light means that when the light source emitter projects perpendicularly onto an imaging plane, its intersection-line image consists of equidistant parallel lines and all light planes intersect in one straight line; this type is the most common in industrial vision measurement. The number of projected lines is generally odd, from 7 up to 81, and can be chosen according to the application. When the number of lines is large, the traditional single-structured-light calibration algorithm is inefficient: the multi-parallel-line structured light vision measurement system has a wide field of view, so a large, high-precision target must be manufactured at high cost, and the line image formed by the intersection of each structured light plane with the target must be detected and indexed, which makes processing complicated and precision hard to control.
The invention only requires calibrating the first and last structured light planes; the equations of the remaining light planes are then calculated from the inherent characteristics of the projector. The target does not need to be large (the head and tail structured light planes can be calibrated independently), the cost is low, the intersection-line image of each intermediate plane does not need to be detected, and the method is fast. Moreover, the overall calibration accuracy depends on the calibration accuracy of the head and tail light planes, and in principle the calibration error of the intermediate light planes is governed by the calibration error of the head light plane. The process of the invention is described below.
With the single-line structured light calibration method above, the head and tail light plane equations can be obtained:
n_{1,x}·x + n_{1,y}·y + n_{1,z}·z = d_1   (7)
n_{2N+1,x}·x + n_{2N+1,y}·y + n_{2N+1,z}·z = d_{2N+1}
wherein [n_{1,x}, n_{1,y}, n_{1,z}]^T and [n_{2N+1,x}, n_{2N+1,y}, n_{2N+1,z}]^T are the unit normal vectors of the head and tail light planes, and |d_1| and |d_{2N+1}| are the distances from the camera coordinate origin to the head and tail light planes respectively. In addition, to ensure that the included angle between the head and tail light planes is computed correctly, their unit normal vectors are kept consistent with the principal direction, i.e. the inner product satisfies n_{1,x}·n_{2N+1,x} + n_{1,y}·n_{2N+1,y} + n_{1,z}·n_{2N+1,z} > 0. The angle between the two light planes can be obtained from the inner product of the unit normal vectors:
cos θ = n_{1,x}·n_{2N+1,x} + n_{1,y}·n_{2N+1,y} + n_{1,z}·n_{2N+1,z}   (8)
It can be understood that θ (in radians) is the angle through which light plane 1 must be rotated to coincide with light plane 2N+1; the unit rotation vector can be obtained by the following formula:
[r_x, r_y, r_z]^T = (n_1 × n_{2N+1}) / ‖n_1 × n_{2N+1}‖   (9)
according to the Rodrigues transformation, a rotation matrix R of the rotation angle theta around the rotation vector is as follows:
R = cos θ·I + (1 − cos θ)·r·r^T + sin θ·[r]_×   (10)
where I is the 3×3 identity matrix, r = [r_x, r_y, r_z]^T and [r]_× = [0 −r_z r_y; r_z 0 −r_x; −r_y r_x 0] is its skew-symmetric cross-product matrix. The normal vectors of the head and tail light planes then satisfy:
[n_{2N+1,x}, n_{2N+1,y}, n_{2N+1,z}]^T = R·[n_{1,x}, n_{1,y}, n_{1,z}]^T   (11)
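A small numerical sketch of equations (8)–(11) (illustrative only; the function names and NumPy usage are my own, not the patent's):

```python
import numpy as np

def rotation_between_planes(n_first, n_last):
    """Rotation angle and unit axis taking the first light-plane normal onto the last one.

    Both normals are assumed unit length and consistently oriented (inner product > 0).
    """
    theta = np.arccos(np.clip(float(np.dot(n_first, n_last)), -1.0, 1.0))   # equation (8)
    axis = np.cross(n_first, n_last)
    axis /= np.linalg.norm(axis)                                            # equation (9)
    return theta, axis

def rodrigues_matrix(axis, theta):
    """Rotation matrix for a rotation by theta about a unit axis (Rodrigues formula, equation (10))."""
    kx, ky, kz = axis
    K = np.array([[0.0, -kz,  ky],
                  [ kz, 0.0, -kx],
                  [-ky,  kx, 0.0]])                   # skew-symmetric cross-product matrix [r]_x
    return np.cos(theta) * np.eye(3) + (1 - np.cos(theta)) * np.outer(axis, axis) + np.sin(theta) * K
```

With these helpers, rodrigues_matrix(axis, theta) @ n_first reproduces n_last, which is exactly the check expressed by equation (11).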
FIG. 5 is a schematic cross-sectional view of the image of multi-parallel-line structured light under perpendicular projection. Assume the number of light planes is 2N+1, the projection height is h, and the spacing between the projected lines is d; then d/h is constant, i.e. for a given equidistant parallel-line structured light the included angles between the light planes are fixed constants, and the angle between the head and tail light planes satisfies the following relationship:
tan(θ/2) = N·d/h   (12)
The angle θ_1 between light plane 1 and light plane 2 (refer to fig. 5) can be calculated by the following equation:
θ_1 = arctan(N·d/h) − arctan((N−1)·d/h)   (13)
namely:
θ_1 = θ/2 − arctan((N−1)·tan(θ/2)/N)   (14)
similarly, the angle between the ith light plane and the (i +1) th and light planes can be calculated by:
θ_i = arctan((N+1−i)·d/h) − arctan((N−i)·d/h),   i = 1, 2, ..., N   (15)
namely:
θ_i = arctan((N+1−i)·tan(θ/2)/N) − arctan((N−i)·tan(θ/2)/N),   i = 1, 2, ..., N   (16)
due to the symmetry of the light planes when projected perpendicularly, the included angles between the remaining half of the light planes can be obtained:
θ_{2N+1−i} = θ_i,   i = 1, 2, ..., N   (17)
Substituting each estimated θ_i, i = 1, 2, ..., 2N, for θ in formula (10) gives the rotation matrix R_{i,i+1} between adjacent light planes, and, similarly to equation (11), the following relationship holds between the normal vectors of adjacent light planes:
[n_{i+1,x}, n_{i+1,y}, n_{i+1,z}]^T = R_{i,i+1}·[n_{i,x}, n_{i,y}, n_{i,z}]^T,   i = 1, 2, ..., 2N   (18)
from equation (18), the unit normal vector of each light plane in the middle can be derived from the unit normal vector of the first light plane.
So far, the unit normal vector of each intermediate light plane can be calculated from the normal vector of the first light plane by formula (18); the distance parameters d_i, i = 2, 3, ..., 2N, i.e. the projections (inner products) of the light plane points onto the corresponding normal vectors, still need to be calculated. Since the light planes intersect in a common straight line, this line can be obtained by solving the equations (7) simultaneously:
(x − x_0)/r_x = (y − y_0)/r_y = (z − z_0)/r_z   (19)
where [r_x, r_y, r_z]^T is the direction vector of the intersection line (the unit rotation vector calculated by formula (9)) and [x_0, y_0, z_0] is any point on the line; x_0 can be fixed arbitrarily (for example x_0 = 0) and equation (7) solved simultaneously for y_0 and z_0. Sampling M points [x_j, y_j, z_j], j = 1, 2, ..., M, on the line, d_i can be estimated by:
d_i = (1/M)·Σ_{j=1..M} (n_{i,x}·x_j + n_{i,y}·y_j + n_{i,z}·z_j)   (20)
Each term in the above sum is the projection onto the normal vector of the i-th light plane of the vector formed by a sampled point in the camera coordinate system (in principle all terms are equal); averaging over the M points is done for numerical stability.
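A sketch of equations (19)–(20) follows (illustrative; the particular-point choice via a pseudo-inverse and the sampling range are my own conveniences, not prescribed by the patent, which fixes x_0 instead):

```python
import numpy as np

def intermediate_distance_params(n_first, d_first, n_last, d_last, normals_mid, num_samples=20):
    """Distance parameters d_i of the intermediate light planes from the common intersection line.

    n_first, d_first : unit normal and distance parameter of the head light plane (equation (7))
    n_last,  d_last  : unit normal and distance parameter of the tail light plane
    normals_mid      : unit normals of the intermediate planes, already computed via equation (18)
    """
    direction = np.cross(n_first, n_last)
    direction /= np.linalg.norm(direction)                      # line direction = unit rotation vector

    A = np.vstack([n_first, n_last])                            # the two plane equations of (7)
    p0 = np.linalg.pinv(A) @ np.array([d_first, d_last])        # one particular point of the line

    ts = np.linspace(-50.0, 50.0, num_samples)                  # arbitrary parameter range along the line
    samples = p0[None, :] + ts[:, None] * direction[None, :]    # M sampled points, equation (19)
    return [float(np.mean(samples @ n_i)) for n_i in normals_mid]  # averaged projections, equation (20)
```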
In summary, the overall calibration procedure is as follows:
Step 1: calibrate the head and tail light plane equations (the first and the last light plane equations) with the algorithm above (or any other algorithm);
Step 2: calculate the included angle θ (rotation angle) between the two light planes according to formula (8), using the unit normal vectors of the head and tail light planes obtained in step 1;
Step 3: using the rotation angle θ calculated in step 2 and the characteristics of the equidistant parallel structured light (refer to equation (12) and fig. 5), calculate the rotation angles θ_i, i = 1, 2, ..., 2N, between adjacent light planes according to equations (13) to (17);
Step 4: from the adjacent-plane rotation angles obtained in step 3 and the unit rotation vector given by formula (9), construct the rotation matrices R_{i,i+1} according to formula (10);
Step 5: obtain the unit normal vectors of the intermediate light planes from the unit normal vector of the first light plane and the rotation matrices calculated in step 4;
Step 6: using the fact that all light planes intersect in one straight line (whose direction is the rotation vector calculated above), compute the intersection line from the head and tail light planes (refer to formula (19)), sample M points on it, and calculate the distance parameters d_i, i = 2, 3, ..., 2N, of the intermediate light planes according to formula (20).
In this way, the parameters of all intermediate light planes are obtained from the calibration results of the first and last light plane equations.
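Putting steps 1–6 together, a compact end-to-end sketch (reusing rotation_between_planes, rodrigues_matrix, adjacent_plane_angles and intermediate_distance_params from the sketches above; again an illustration under the stated assumptions, not the patent's reference implementation) could look as follows:

```python
import numpy as np

def calibrate_all_planes(n_first, d_first, n_last, d_last, N):
    """All 2N+1 light plane equations from the calibrated head and tail planes (steps 2-6)."""
    theta, axis = rotation_between_planes(n_first, n_last)          # step 2: equations (8)-(9)
    angles = adjacent_plane_angles(theta, N)                        # step 3: equations (12)-(17)

    normals = [np.asarray(n_first, dtype=float)]
    for th in angles[:-1]:                                          # steps 4-5: equations (10) and (18)
        normals.append(rodrigues_matrix(axis, th) @ normals[-1])
    normals.append(np.asarray(n_last, dtype=float))                 # keep the calibrated tail normal

    d_mid = intermediate_distance_params(n_first, d_first,          # step 6: equations (19)-(20)
                                         n_last, d_last, normals[1:-1])
    return normals, [d_first] + d_mid + [d_last]
```

For 2N+1 = 7 planes, for example, calibrate_all_planes returns seven (normal, distance) pairs even though only the first and the last plane were measured directly.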
It should be noted that although the above derivation assumes an odd number (2N+1) of projected lines, the method applies equally, with a slight adjustment, to structured light with an even number (2N) of projected lines.
It can be seen that the maximum error of a vision measurement system constructed by the above method is determined by the errors of the first and last light planes; therefore, as long as the parameters of the first and last light planes are calibrated with high accuracy, the accuracy of the whole system is ensured. Compared with the traditional line-by-line calibration algorithm, the method does not need to detect, process and index the light stripe images of the intermediate light planes, which greatly reduces the complexity and precision burden of image processing; it is rapid, efficient and stable. Experimental results show that the maximum error can be kept below 0.15 mm at a measurement distance of 500 mm, which fully meets the requirements of practical modeling applications. In addition, the method has been verified in tire modeling for non-contact automobile four-wheel alignment; it is convenient and rapid, has low calibration cost and good prospects for popularization.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (9)

1. A calibration method for a rapid multi-line structured light vision measurement system is characterized by comprising the following steps:
(1) respectively calibrating the head light plane and the tail light plane to obtain a unit normal vector and a distance parameter of the light planes;
(2) calculating the rotation angles and the rotation vectors of the two light planes by using the normal vectors of the head light plane and the tail light plane obtained in the step (1);
(3) calculating the included angle between every two adjacent light planes by utilizing the characteristics that the multi-line structured light is equidistantly parallel and is intersected in a straight line and the rotation angle calculated in the step (2);
(4) constructing a rotation matrix between the light planes by using the included angle between the light planes calculated in the step (3) and the rotation vector calculated in the step (2);
(5) calculating a unit normal vector of each middle light plane according to the rotation matrix obtained in the step (4) and the unit normal vector of the first light plane;
(6) calculating a common intersection line of the light planes by using the head and tail light plane equations obtained in step (1), sampling a plurality of points, and calculating the distance parameter of each intermediate light plane equation respectively.
2. The method as claimed in claim 1, wherein the calibration of the vision measuring system includes calibration of camera parameters and calibration of a multi-line structured light plane, and the camera projection measurement model is:
s·[u, v, 1]^T = K·[x_c, y_c, z_c]^T,   K = [f_x 0 C_x; 0 f_y C_y; 0 0 1]
wherein: s is a constant scale factor, (u, v) are the image coordinates of a point on the intersection line of the target and the corresponding structured light plane, K is the projection matrix, f_x and f_y are the focal length parameters of the camera in the x and y directions, C_x and C_y are the coordinates of the imaging principal point, and [x_c, y_c, z_c]^T are the three-dimensional coordinates of the point in the camera coordinate system;
the multi-line structured light plane model is as follows:
n_{i,x}·x_c + n_{i,y}·y_c + n_{i,z}·z_c = d_i,   i = 1, 2, ..., 2N+1
wherein: 2N+1 is the number of light planes, [n_{i,x}, n_{i,y}, n_{i,z}]^T is the unit normal vector of the i-th light plane, [x_c, y_c, z_c]^T are the three-dimensional coordinates of any point of the light plane in the camera coordinate system, and |d_i| is the perpendicular distance from the camera coordinate origin to the laser plane.
3. The calibration method of the fast multi-line structured light vision measurement system according to claim 2, wherein the fast calibration of the head-to-tail single structured light plane in step (1) is performed based on a camera projection matrix and a homography matrix, and the homography matrix is determined according to the following formula based on all feature points and corresponding image pixel coordinates of the target:
s·[u, v, 1]^T = H·[x_w, y_w, 1]^T
wherein H is the homography matrix, [u, v]^T are the projection (pixel) coordinates and [x_w, y_w]^T are the corresponding world coordinates on the target plane; from the homography matrix, the world coordinates corresponding to any pixel on the intersection line of the light plane and the planar target can be obtained, and the three-dimensional coordinates of the feature points in the camera coordinate system are then:
[x_c, y_c, z_c]^T = [r_1, r_2, t]·[x_w, y_w, 1]^T, with H = K·[r_1, r_2, t], where r_1 and r_2 are the first two columns of the target's rotation matrix and t is its translation.
4. The calibration method of the rapid multi-line structured light vision measurement system as claimed in claim 3, wherein in the calibration operation of the head and tail single structured light planes, the planar target is moved twice, the three-dimensional coordinates corresponding to the pixel points on the two resulting straight lines are respectively obtained, and the head and tail light plane equations are obtained by least-squares fitting:
n_{1,x}·x + n_{1,y}·y + n_{1,z}·z = d_1
n_{2N+1,x}·x + n_{2N+1,y}·y + n_{2N+1,z}·z = d_{2N+1}
wherein: [n_{1,x}, n_{1,y}, n_{1,z}]^T and [n_{2N+1,x}, n_{2N+1,y}, n_{2N+1,z}]^T are the unit normal vectors of the head and tail light planes, and |d_1| and |d_{2N+1}| are the distances from the camera coordinate origin to the head and tail light planes respectively.
5. The method for calibrating a rapid multiline structured light vision measuring system as recited in claim 4, wherein the rotation angle θ between the head and tail light planes in the step (2) is obtained by the following formula based on the unit normal vector inner product:
cos θ = n_{1,x}·n_{2N+1,x} + n_{1,y}·n_{2N+1,y} + n_{1,z}·n_{2N+1,z}
the unit rotation vector is obtained by the following formula:
[r_x, r_y, r_z]^T = (n_1 × n_{2N+1}) / ‖n_1 × n_{2N+1}‖, where n_1 and n_{2N+1} are the unit normal vectors of the head and tail light planes;
the rotation matrix R of the rotation angle θ around the rotation vector is:
R = cos θ·I + (1 − cos θ)·r·r^T + sin θ·[r]_×, where I is the 3×3 identity matrix, r = [r_x, r_y, r_z]^T and [r]_× is its skew-symmetric cross-product matrix.
6. The calibration method of the rapid multi-line structured light vision measurement system as claimed in claim 5, wherein the included angle between adjacent light planes in the step (3) is:
θ_i = arctan((N+1−i)·tan(θ/2)/N) − arctan((N−i)·tan(θ/2)/N),   i = 1, 2, ..., N
and:
θ_{2N+1−i} = θ_i,   i = 1, 2, ..., N.
7. The calibration method of the rapid multi-line structured light vision measurement system as claimed in claim 6, wherein in the step (4), the rotation vector and the rotation matrices R_{i,i+1} between adjacent light planes are obtained from the included angles between them, and in the step (5), the unit normal vector of each middle light plane is obtained from the unit normal vector of the first light plane based on the following formula:
[n_{i+1,x}, n_{i+1,y}, n_{i+1,z}]^T = R_{i,i+1}·[n_{i,x}, n_{i,y}, n_{i,z}]^T,   i = 1, 2, ..., 2N
8. The calibration method of the rapid multi-line structured light vision measurement system according to claim 7, wherein in the step (6), the common intersection line of the light planes obtained by solving the head and tail light plane equations simultaneously is:
(x − x_0)/r_x = (y − y_0)/r_y = (z − z_0)/r_z
wherein:
[r_x, r_y, r_z]^T is the direction vector of the intersection line, i.e. the aforementioned unit rotation vector, and [x_0, y_0, z_0] is any point on the line; by sampling M points [x_j, y_j, z_j], j = 1, 2, ..., M, on the line, the distance parameter d_i of each middle light plane is obtained as:
d_i = (1/M)·Σ_{j=1..M} (n_{i,x}·x_j + n_{i,y}·y_j + n_{i,z}·z_j).
9. The calibration method for the rapid multi-line structured light vision measurement system according to any one of claims 1 to 8, wherein the calibration method is applied to tire modeling for non-contact automobile four-wheel alignment.
CN201910202385.2A 2019-03-11 2019-03-11 Calibration method for rapid multi-line structured optical vision measurement system Active CN109827521B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910202385.2A CN109827521B (en) 2019-03-11 2019-03-11 Calibration method for rapid multi-line structured optical vision measurement system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910202385.2A CN109827521B (en) 2019-03-11 2019-03-11 Calibration method for rapid multi-line structured optical vision measurement system

Publications (2)

Publication Number Publication Date
CN109827521A CN109827521A (en) 2019-05-31
CN109827521B true CN109827521B (en) 2020-08-07

Family

ID=66870175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910202385.2A Active CN109827521B (en) 2019-03-11 2019-03-11 Calibration method for rapid multi-line structured optical vision measurement system

Country Status (1)

Country Link
CN (1) CN109827521B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110470320B (en) * 2019-09-11 2021-03-05 河北科技大学 Calibration method of swinging scanning type line structured light measurement system and terminal equipment
CN110793458B (en) * 2019-10-30 2022-10-21 成都安科泰丰科技有限公司 Coplane adjusting method for two-dimensional laser displacement sensor
CN111220100B (en) * 2020-04-10 2020-08-18 广东博智林机器人有限公司 Laser beam-based measurement method, device, system, control device, and medium
CN113536210A (en) * 2021-06-04 2021-10-22 黄淮学院 Vector traversal line structure-based light stripe center coordinate calculation method
CN113566706B (en) * 2021-08-01 2022-05-31 北京工业大学 Device and method for composite rapid high-precision visual positioning
CN114494403B (en) * 2022-01-27 2022-09-30 烟台大学 Shellfish target size rapid measurement method based on deep learning
CN114565681B (en) * 2022-03-01 2022-11-22 禾多科技(北京)有限公司 Camera calibration method, device, equipment, medium and product

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1990007096A1 (en) * 1988-12-21 1990-06-28 Gmf Robotics Corporation Method and system for automatically determining the position and orientation of an object in 3-d space
CN101476882A (en) * 2009-01-08 2009-07-08 上海交通大学 Structured light three-dimensional detection method based on homography matrix
CN103411553A (en) * 2013-08-13 2013-11-27 天津大学 Fast calibration method of multiple line structured light visual sensor
CN103884271A (en) * 2012-12-20 2014-06-25 中国科学院沈阳自动化研究所 Direct calibration method for line structured light vision sensor

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6415051B1 (en) * 1999-06-24 2002-07-02 Geometrix, Inc. Generating 3-D models using a manually operated structured light source
CN104240221B (en) * 2013-06-18 2017-02-08 烟台大学 Opposite-lens two-camera relative azimuth calibration device and method
CN105300311B (en) * 2015-11-10 2017-11-14 广东工业大学 Vision sensor in line-structured light scanning survey equipment
CN105783726B (en) * 2016-04-29 2018-06-19 无锡科技职业学院 A kind of curved welding seam three-dimensional rebuilding method based on line-structured light vision-based detection
CN105953747B (en) * 2016-06-07 2019-04-02 杭州电子科技大学 Structured light projection full view 3-D imaging system and method
CN106441099B (en) * 2016-10-13 2019-04-05 北京交通大学 The scaling method of multiple line structure optical sensor
CN108981608B (en) * 2018-05-29 2020-09-22 华南理工大学 Novel line structured light vision system and calibration method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1990007096A1 (en) * 1988-12-21 1990-06-28 Gmf Robotics Corporation Method and system for automatically determining the position and orientation of an object in 3-d space
CN101476882A (en) * 2009-01-08 2009-07-08 上海交通大学 Structured light three-dimensional detection method based on homography matrix
CN103884271A (en) * 2012-12-20 2014-06-25 中国科学院沈阳自动化研究所 Direct calibration method for line structured light vision sensor
CN103411553A (en) * 2013-08-13 2013-11-27 天津大学 Fast calibration method of multiple line structured light visual sensor

Also Published As

Publication number Publication date
CN109827521A (en) 2019-05-31

Similar Documents

Publication Publication Date Title
CN109827521B (en) Calibration method for rapid multi-line structured optical vision measurement system
US10690492B2 (en) Structural light parameter calibration device and method based on front-coating plane mirror
CN110276808B (en) Method for measuring unevenness of glass plate by combining single camera with two-dimensional code
KR102085228B1 (en) Imaging processing method and apparatus for calibrating depth of depth sensor
CN111243002A (en) Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement
CN107610183B (en) Calibration method of fringe projection phase height conversion mapping model
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
CN110940295B (en) High-reflection object measurement method and system based on laser speckle limit constraint projection
CN110378969A (en) A kind of convergence type binocular camera scaling method based on 3D geometrical constraint
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
JP2011123051A (en) Three-dimensional measurement method
CN115861445B (en) Hand-eye calibration method based on three-dimensional point cloud of calibration plate
CN108180888A (en) A kind of distance detection method based on rotating pick-up head
WO2018006246A1 (en) Method for matching feature points of planar array of four-phase unit and measurement method on basis thereof
CN106403838A (en) Field calibration method for hand-held line-structured light optical 3D scanner
US11295478B2 (en) Stereo camera calibration method and image processing device for stereo camera
WO2023201578A1 (en) Extrinsic parameter calibration method and device for monocular laser speckle projection system
KR101597163B1 (en) Method and camera apparatus for calibration of stereo camera
JP2012198031A (en) Image correction method and image correction device
Belhedi et al. Non-parametric depth calibration of a tof camera
CN109506629B (en) Method for calibrating rotation center of underwater nuclear fuel assembly detection device
CN111968182B (en) Calibration method for nonlinear model parameters of binocular camera
CN114993207B (en) Three-dimensional reconstruction method based on binocular measurement system
CN115880369A (en) Device, system and method for jointly calibrating line structured light 3D camera and line array camera
CN105809685A (en) Single-concentric circle image-based camera calibration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211122

Address after: 519000 Room 501, building 3, No. 388, Yongnan Road, Xiangzhou District, Zhuhai City, Guangdong Province

Patentee after: Zhuhai Huaxing Zhizao Technology Co.,Ltd.

Address before: 264005 School of computer and control engineering, Yantai University, Yantai City, Shandong Province, No. 30, Qingquan Road, Laishan District, Yantai City

Patentee before: Yantai University