CN116051659B - Linear array camera and 2D laser scanner combined calibration method - Google Patents
- Publication number: CN116051659B
- Application number: CN202310323918.9A
- Authority: CN (China)
- Legal status: Active
Classifications
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G01B11/002 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
- G06T2207/10028 — Range image; depth image; 3D point clouds
- G06T2207/30244 — Camera pose
Abstract
The invention discloses a joint calibration method for a linear array camera and a 2D laser scanner, belonging to the technical field of measuring distance, level or azimuth, and used for the joint calibration of a linear array camera and a 2D laser scanner.
Description
Technical Field
The invention discloses a combined calibration method of a linear array camera and a 2D laser scanner, and belongs to the technical field of measuring distance, level or azimuth.
Background
Detection technologies based on photogrammetry and on mobile laser scanning have greatly improved the efficiency of tunnel inspection. Typically, an industrial camera and a 2D laser scanner are integrated on a tunnel inspection vehicle to rapidly acquire images and point cloud data of the tunnel lining surface, so that defects can be identified and repaired. To fuse the image and point cloud data into a finer and more accurate real-scene 3D model of the tunnel, the two sensors must be jointly calibrated to obtain the external parameters between the camera and the 2D laser scanner.
The linear array camera, with its high resolution, high scanning frequency and one-dimensional imaging, is well suited to long, narrow and continuous scenes such as tunnels. However, because it captures only a single line in a static state and an image is built from repeated scan lines, it is difficult to determine which feature points correspond to which image points. Conventional linear array camera calibration methods are therefore generally based on the principle of cross-ratio invariance: dedicated patterns are designed and the world coordinates of the feature points are solved, achieving static calibration of the linear array camera.
Regarding joint calibration of the external parameters of a camera and a laser scanner, current research focuses mainly on calibrating an area-array camera with a 3D laser scanner, or with a 2D laser scanner. Joint calibration of an area-array camera and a 3D laser scanner usually uses a calibration-object-based method: a target such as a planar checkerboard, a cube or a sphere is placed in the common field of view of the two sensors, corresponding features (points, lines, surfaces) are extracted from the image and the point cloud, geometric constraints are constructed, and the external parameters are solved. By contrast, a single scan of a 2D laser scanner yields only one contour line of the surrounding environment, and it is difficult to find features or common points on that linear point cloud. Existing area-array camera / 2D laser scanner calibration methods observe a checkerboard calibration object or a V-shaped plate with both sensors, find the normal vector of the plane, and establish geometric constraints between the sensors for subsequent solution.
Because of the special data acquisition modes of the linear array camera and the 2D laser scanner, the two sensors' data share no common part, and it is even harder to find directly corresponding features and geometric constraints between the linear array camera image and the 2D laser scanner's linear point cloud. Joint calibration therefore remains difficult for a system integrating a linear array camera and a 2D laser scanner.
Disclosure of Invention
The invention provides a joint calibration method for a linear array camera and a 2D laser scanner, solving the difficulty that, in static calibration, the linear array camera and the 2D laser scanner share no common field of view and the image and point cloud data have no directly corresponding features or geometric constraint relations.
A linear array camera and 2D laser scanner joint calibration method comprises the following steps:
s1, manufacturing a mixed calibration target, and establishing a target coordinate system;
s2, building an experiment platform;
s3, acquiring calibration data of the linear array camera, and obtaining world coordinates of feature points corresponding to image points according to the principle of cross ratio invariance;
s4, establishing a linear array camera imaging model;
s5, calculating the calibration parameters of the linear array camera by a two-step calibration method;
s6, determining the basic equation of point coordinate conversion between the 2D laser scanner and the linear array camera;
s7, scanner data acquisition is performed, and a joint calibration geometric constraint model is constructed;
s8, nonlinear optimization accurately solves the joint calibration parameters.
S1 comprises the following steps:
the hybrid calibration target comprises a fixing piece and a planar target body; the target body carries a plurality of diffuse-reflection mark points, a 1.5-inch laser tracker target-ball hole, a 0.5-inch laser tracker target-ball hole and 7 cross-ratio triangle patterns, the 7 cross-ratio triangle patterns comprising 8 straight lines and 7 oblique lines;
the fixing piece supports the target main body, and the target main body is fixed through the fixing piece after the target position is moved in the process of calibration data acquisition;
the diffuse-reflection mark points are suitable for non-contact measurement by a theodolite industrial measuring system, which yields their three-dimensional coordinates; the mark points are evenly distributed on the target body, three of them corresponding to the O, X and Y points of the target coordinate system: the line OX is the x-axis, the line OY is the y-axis, O is the origin of the target coordinate system, and the z-axis is defined upward through O perpendicular to the O-XY plane;
the laser tracker target ball hole is used for carrying out contact measurement on the laser tracker and a target corresponding to the laser tracker, and a target coordinate system is established through the hole center point coordinate;
according to the design size of the cross-ratio triangle patterns, the line equations of all straight lines in the 7 cross-ratio triangle patterns are calculated in the target coordinate system;
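The axis construction in S1 (O as origin, OX defining the x-axis, z-axis upward perpendicular to the O-XY plane, y-axis completing the frame) can be sketched as below. The function name and the numpy-based construction are illustrative, not part of the patent:

```python
import numpy as np

def target_frame(O, X, Y):
    """Build an orthonormal target coordinate frame from the three
    marker points O, X, Y measured in the overall reference frame.

    Returns (R, O): the columns of R are the target x, y, z axes
    expressed in the measuring frame, O is the origin.
    """
    O, X, Y = (np.asarray(p, float) for p in (O, X, Y))
    ex = X - O
    ex /= np.linalg.norm(ex)           # x-axis along OX
    ez = np.cross(ex, Y - O)           # normal to the O-XY plane (upward)
    ez /= np.linalg.norm(ez)
    ey = np.cross(ez, ex)              # completes the right-handed frame
    return np.column_stack([ex, ey, ez]), O
```

Note that OY need not be exactly perpendicular to OX; the cross products absorb any small deviation, which is why the y-axis is recomputed rather than taken directly from the OY direction.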
s2 comprises the following steps: the experimental platform comprises a three-dimensional coordinate measuring system, a linear array camera, a 2D laser scanner, a linear array camera illumination light source and a mixed calibration target, wherein the three-dimensional coordinate measuring system comprises a laser tracker or a theodolite industrial measuring system, and the mixed calibration target is arranged on a fixed frame through a fixing piece, so that the three-dimensional coordinate measuring system, the linear array camera and the 2D laser scanner can observe the mixed calibration target.
S3 comprises the following steps:
the hybrid calibration target is placed in the field of view of the linear array camera and lit with the linear array camera illumination source so that the camera acquires clear calibration pictures; after the pictures are acquired, the hybrid calibration target is measured with the three-dimensional coordinate measuring system, whose coordinate system is taken as the overall coordinate reference; the target coordinate system is computed under this overall reference and the target plane is fitted at the same time; the target coordinate system serves as the world coordinate system when solving the internal calibration parameters of the linear array camera;
when the linear array camera actually shoots, the camera view plane intersects the cross-ratio triangles on the hybrid calibration target in 15 feature points P_i; on the calibration image these appear as black line segments, and the pixel coordinates of the image points p_i corresponding to the feature points are extracted with an edge detection algorithm;
the world coordinates of the feature points corresponding to the image points are then calculated from the cross-ratio invariance principle and the line equations of the segments on the target plane, where feature point P_i is the intersection point on the i-th straight line of the target plane.
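The cross-ratio step can be illustrated with a minimal 1-D sketch: the cross ratio of four collinear points is preserved under central projection, so the cross ratio measured among image points equals that of the corresponding target points, and the unknown world position of a fourth point can be solved from three known ones. Function names and the 1-D parameterisation are illustrative:

```python
def cross_ratio(a, b, c, d):
    """Cross ratio CR(a, b; c, d) of four collinear points given as
    1-D coordinates along their common line."""
    return ((c - a) / (c - b)) / ((d - a) / (d - b))

def locate_from_cross_ratio(a, b, c, cr):
    """Solve for the coordinate d of the fourth point such that
    CR(a, b; c, d) = cr. Since the cross ratio is invariant under
    projection, cr can be measured from the image points while
    a, b, c are known world coordinates on the target line.
    """
    k = ((c - a) / (c - b)) / cr       # k = (d - a) / (d - b)
    return (a - k * b) / (1.0 - k)     # linear equation solved for d
```

In the method of S3 the roles would be reversed per line: known pattern intersections on the target supply the world-side points, and the pixel positions extracted by edge detection supply the measured cross ratio.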
S4 comprises the following steps:
in the linear array camera, the correspondence between the known world coordinates (Xw, Yw, Zw) of a feature point and the pixel coordinate v of its image point is as follows:

r1·Xw + r2·Yw + r3·Zw + tx = 0 (1)

v = v0 + (f/dy) · (r4·Xw + r5·Yw + r6·Zw + ty) / (r7·Xw + r8·Yw + r9·Zw + tz) (2)

the above is the camera imaging model without lens distortion, where r1, …, r9 are the 9 elements of the rotation matrix R and T = (tx, ty, tz) is the translation vector; formula (1) is the view-plane equation of the linear array camera and formula (2) the central-projection relation; [R T] represents the rotation and translation from the world coordinate system to the camera coordinate system; v is the pixel coordinate of the image point and v0 the principal-point offset, an internal parameter to be calibrated; f is the focal length and dy the physical size of a pixel in the y-axis direction;
considering the distortion of the linear array camera in the y direction, a distortion model taking only first-order radial distortion into account is established as:

δv = k1 · (v − v0)^3 (3)

where δv is the distortion along the y axis, k1 is the first-order radial distortion coefficient, and δv represents the component of the lens distortion on the v axis of the pixel coordinate system;
synthesizing formulas (1), (2) and (3), the complete linear array camera imaging model is obtained:

v = v0 + (f/dy) · (r4·Xw + r5·Yw + r6·Zw + ty) / (r7·Xw + r8·Yw + r9·Zw + tz) + δv (4)
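A hedged numeric sketch of this imaging model, assuming a standard line-scan view-plane / central-projection form with first-order radial distortion in y (function and parameter names are illustrative, not from the patent):

```python
import numpy as np

def line_camera_v(Pw, R, t, f_over_dy, v0, k1=0.0):
    """Project a world point onto the 1-D sensor of a line-scan camera.

    Pc = R @ Pw + t is the point in camera coordinates; for a point
    imaged by the camera, Pc must lie (approximately) on the view
    plane. The pinhole relation along the sensor gives the undistorted
    coordinate; first-order radial distortion in y is applied on top.
    """
    Pc = R @ np.asarray(Pw, float) + np.asarray(t, float)
    y = Pc[1] / Pc[2]                  # central projection along the sensor
    y = y * (1.0 + k1 * y * y)         # first-order radial distortion
    return v0 + f_over_dy * y
```

With k1 = 0 this reduces to the distortion-free model of formulas (1)-(2); the k1 term plays the role of δv in formula (3).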
s5 comprises the following steps:
based on the corresponding relation between the feature points and the image points and the imaging model of the linear array camera, calculating the geometric imaging model parameters of the linear array camera by adopting a two-step method;
ignoring the influence of lens distortion, approximate values of the internal parameters f/dy and v0 and of the external parameters R and T are solved directly by the direct linear transformation (DLT) method, with the rotation angles obtained from the camera rotation matrix;
then taking the acquired internal parameters and external parameters of the linear array camera as initial values, and acquiring a final linear array camera calibration result containing first-order radial distortion by using an LM nonlinear optimization method under a strict imaging model;
a variable is extracted from formula (1) and substituted into formula (2), giving formula (5):
the camera internal and external parameters in formula (5) are replaced by intermediate parameters q_i, giving formula (6):
from the properties of the rotation matrix, r1² + r2² + r3² = 1; considering also that the origin of the target coordinate system lies below the view plane and that the view plane intersects all straight lines on the target, formula (7) is obtained:
where the unknowns are the parameters of the view-plane equation, which are solved at this step;
let q̄_i = λ·q_i, where λ is a scaling factor and q̄_i the scaled q_i; the scaled result is formula (8), and combining it with the properties of the rotation matrix gives formula (9):
from the rotation matrix properties, formula (10) can be obtained:
substituting the world coordinates of the feature points into formula (7) to obtain formula (11):
equation (11) is solved with the SVD method; since the solution is determined only up to an arbitrary scaling, the parameters are obtained at this step up to scale;
Bringing the calculated external orientation element value into the formula (11), and solving the simultaneous equations:
the external parameter tz represents the z coordinate of the target-coordinate-system origin in the linear array camera coordinate system; since the target is always in front of the camera while images are taken, tz > 0; this constraint eliminates the scaling-factor ambiguity in formula (12), so that all initial values of the internal and external orientation elements of the linear array camera are uniquely determined;
the initial parameter values are refined by nonlinear optimization to obtain the first-order radial distortion parameter k1 and the optimized internal and external parameters of the linear array camera: the coordinates of the 3D points re-projected onto the 2D image plane are differenced with the actually extracted image-point pixels, i.e. the projected position of each 3D point is compared with its observed position, and an objective function minimizing the re-projection error is established as formula (13):
where the objective is the first nonlinear-optimization solution of the linear array camera calibration-parameter equation over the feature points of all target placements; P_ij is the j-th feature point at the i-th target placement, v_ij is the v-axis coordinate of the pixel point extracted for the j-th feature point of the i-th target placement, and the pixel coordinate of the re-projection point computed from the current camera parameters is given by formula (14):
the final calibration result of the internal, external and distortion parameters of the linear array camera under the strict imaging model is obtained with the LM nonlinear optimization method, and the external parameters between the linear array camera coordinate system and the hybrid calibration target are integrated to reduce the camera coordinate system to the overall coordinate reference.
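The second step of S5 — refining the DLT initial values by minimizing the re-projection error with Levenberg-Marquardt — can be sketched with a minimal generic LM loop (numeric Jacobian; this is an illustrative sketch, not the patent's implementation):

```python
import numpy as np

def levenberg_marquardt(residual, p0, iters=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop: minimise sum(residual(p)**2).

    residual: callable returning a 1-D residual vector (e.g. the
    re-projection errors v_hat_ij - v_ij of formula (13)).
    """
    p = np.asarray(p0, float)
    for _ in range(iters):
        r = residual(p)
        eps = 1e-7
        J = np.empty((r.size, p.size))
        for j in range(p.size):        # forward-difference Jacobian
            dp = np.zeros_like(p)
            dp[j] = eps
            J[:, j] = (residual(p + dp) - r) / eps
        A = J.T @ J + lam * np.eye(p.size)
        step = np.linalg.solve(A, -J.T @ r)
        if np.linalg.norm(residual(p + step)) < np.linalg.norm(r):
            p, lam = p + step, lam * 0.5   # accept: trust the model more
        else:
            lam *= 10.0                    # reject: damp harder
    return p
```

In practice the parameter vector p would collect f/dy, v0, k1 and the exterior orientation, with the DLT result of the first step as p0.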
S6 comprises the following steps:
the hybrid calibration target is placed multiple times; at each placement the three-dimensional coordinate measuring system observes the target, the target coordinate system is established and the target plane equation obtained, and the 2D laser scanner scans the target to obtain scan-line point cloud data on the target plane; the scan plane of the 2D laser scanner is its O-YZ plane, i.e. the x coordinate of every measured point is 0, so a point measured by the 2D laser scanner is expressed as p_l = (0, y_l, z_l); let p_c denote the same point in the linear array camera coordinate system; the transformation from linear array camera point coordinates to 2D laser scanner point coordinates is expressed by formula (16):
the transformation from 2D laser scanner point coordinates to linear array camera point coordinates is formula (17):
where R and T are respectively the rotation matrix and the translation vector from the linear array camera coordinate system to the 2D laser scanner coordinate system; formula (17) is the basic equation of the 2D laser scanner-linear array camera point coordinate transformation.
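A sketch of this basic point-coordinate conversion, assuming (as stated in S6) that R, T map camera coordinates to scanner coordinates and that a scanner reading (y, z) is the 3-D point (0, y, z) in the scanner frame; names are illustrative:

```python
import numpy as np

def scanner_to_camera(p_scan, R_cl, T_cl):
    """Map a 2D scanner reading into the line-camera frame.

    The scanner measures in its own O-YZ plane, so a reading (y, z)
    is the 3-D point (0, y, z) in scanner coordinates. R_cl, T_cl
    take camera coordinates to scanner coordinates (p_l = R p_c + T),
    hence the inverse transform here (the sense of formula (17)).
    """
    y, z = p_scan
    p_l = np.array([0.0, y, z])
    return R_cl.T @ (p_l - np.asarray(T_cl, float))
```

The forward direction of formula (16) would simply be `R_cl @ p_c + T_cl`.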
S7 comprises the following steps:
the target plane is set so that the three-dimensional coordinate measuring system and the 2D laser scanner can observe it simultaneously; the 2D laser scanner observes the target and obtains a scan line composed of several laser points on the target plane; the target plane equation in the overall reference coordinate system is fitted from uniformly distributed plane-point coordinates measured with the three-dimensional coordinate measuring system, and is then transformed into the plane equation in the linear array camera coordinate system, formula (18):
where n is the unit normal vector of the target plane, satisfying |n| = 1, and p_c is the point coordinate in the camera coordinate system;
From equation (18), and from the principle that the product of a unit normal vector on a plane and a point on a plane is equal to the modulo length of the vector, a geometric constraint equation in the form of a vector is derived as equation (19):
in the formula ,in order to be parallel to the plane vector of the target,,/>the method is the expression of the point on the target plane under a camera coordinate system;
the specific point-plane constraint conditions of the obtained linear array camera and the 2D laser scanner are as follows: the laser point is on the target plane, the product of the vector of the laser point to the origin of the camera and the normal vector of the target plane is the distance from the origin of the camera to the target plane, and the laser point coordinates are substituted into a formula (19) by combining a basic equation of point coordinate conversion of a formula (17), so that a formula (20) is obtained:
completing construction of a geometric constraint model between the linear array camera and the 2D laser scanner;
every point measured by the 2D laser scanner has x = 0; rewriting the point coordinates using this feature as p_l = (0, y_l, z_l), formula (20) is rewritten as formula (21):
the target is placed j times and observed j times by the 2D laser scanner; substituting the coordinates of the i laser points of the scanner on each target into formula (21) gives the linear system of formula (22):
where b is the modulus of the target-plane vector corresponding to each scanner laser point; the least-squares solution of formula (22) is formula (23):
the rotation matrix is approximated via SVD decomposition, the singular-value matrix is replaced by the identity matrix, and the rotation matrix R is recomputed, giving the initial joint calibration result R and T.
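The SVD projection step above — replacing the singular values of the least-squares estimate by ones to recover a proper rotation — can be sketched as:

```python
import numpy as np

def nearest_rotation(M):
    """Project a 3x3 matrix onto the rotation group via SVD:
    M = U S V^T  ->  R = U V^T  (singular values replaced by ones)."""
    U, _, Vt = np.linalg.svd(np.asarray(M, float))
    R = U @ Vt
    if np.linalg.det(R) < 0:           # keep a proper rotation, det = +1
        U[:, -1] *= -1.0
        R = U @ Vt
    return R
```

This is the standard orthogonal-Procrustes projection; the det check guards against a reflection when the estimate is noisy.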
S8 comprises the following steps:
the scanner points on the target plane do not fall exactly on the plane; summing the residuals from each actual point to the plane and minimizing the sum by nonlinear optimization reduces the influence of noise errors, further optimizing R and T; the objective function minimizing the residual sum is formula (25):
formula (25) represents solving the joint-calibration-parameter objective function by nonlinear optimization with the LM algorithm, where n_j is the normal vector of the j-th target and p_ij is the i-th point at the j-th target placement;
after nonlinear optimization the final joint calibration result R and T is obtained; thus the linear array camera and the 2D laser scanner achieve joint calibration with the aid of the three-dimensional coordinate measuring system.
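The point-to-plane residual that S7/S8 accumulate can be sketched as follows; here R, T denote the scanner-to-camera transform and each plane is given as a unit normal n and origin distance d in camera coordinates (illustrative names and conventions):

```python
import numpy as np

def point_plane_residuals(R, T, scan_pts, planes):
    """Residuals of the constraint n . p_c = d (formulas (19)-(21)).

    scan_pts: one list of (y, z) scanner readings per target placement.
    planes:   one (n, d) pair per target placement, n a unit normal and
              d the origin-to-plane distance, both in camera coordinates.
    Each residual is the signed distance of a transformed laser point
    from its target plane; S8 minimises the sum of these over R, T.
    """
    res = []
    for (n, d), pts in zip(planes, scan_pts):
        n = np.asarray(n, float)
        for y, z in pts:
            p_cam = R @ np.array([0.0, y, z]) + T   # scanner -> camera
            res.append(n @ p_cam - d)
    return np.array(res)
```

Feeding this residual vector to a Levenberg-Marquardt solver over a parameterisation of (R, T) gives the refinement described in S8.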
Compared with the prior art, the invention has the following beneficial effects: a hybrid calibration target is designed and, combined with three-dimensional coordinate measurement, unifies the coordinate references of all sensors in the tunnel comprehensive inspection equipment; the relative spatial pose relations among the sensors are obtained by static calibration; the problem that the linear array camera and the 2D laser scanner share no common field of view and that no directly corresponding features or geometric constraints exist between the images and the point cloud data is solved; and a foundation is laid for later tunnel data organization and management work, such as obtaining a real-scene three-dimensional model through data fusion.
Drawings
FIG. 1 is a technical flow chart of the present invention;
FIG. 2 is a diagram of a target structure;
FIG. 3 is a schematic layout of a line camera and a 2D laser scanner in a detection apparatus;
fig. 4 is a schematic diagram of the measurement of a target according to the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the present invention will be clearly and completely described below, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
A linear array camera and 2D laser scanner joint calibration method comprises the following steps:
s1, manufacturing a mixed calibration target, and establishing a target coordinate system;
s2, building an experiment platform;
s3, acquiring calibration data of the linear array camera, and obtaining world coordinates of feature points corresponding to image points according to the principle of cross ratio invariance;
s4, establishing a linear array camera imaging model;
s5, calculating calibration parameters of the two-step calibration normal array camera;
s6, determining a basic equation of point coordinate conversion of the laser section scanner and the linear array camera;
s7, scanner data acquisition is performed, and a joint calibration geometric constraint model is constructed;
s8, nonlinear optimization accurately solves the joint calibration parameters.
S1 comprises the following steps:
the mixed calibration target comprises a fixing piece and a plane target body, wherein the target body comprises a plurality of diffuse reflection mark points, a 1.5-inch square laser tracker target ball hole, a 0.5-inch square laser tracker target ball hole and 7 cross-ratio triangle patterns, and the 7 cross-ratio triangle patterns comprise 8 straight lines and 7 oblique lines;
the fixing piece supports the target main body, and the target main body is fixed through the fixing piece after the target position is moved in the process of calibration data acquisition;
the diffuse reflection mark points are suitable for non-contact measurement of a theodolite industrial measurement system, so as to obtain three-dimensional coordinates of the diffuse reflection mark points, the mark points are uniformly distributed on a target main body, wherein three mark points respectively correspond to an O point, an X point and a Y point of a target coordinate system, the connection line of the two points of OX is an X axis, the connection line of the two points of OY is a Y axis, the O point is used as a coordinate origin of the target coordinate system, and the O point is upwards defined as a Z axis through being perpendicular to an O-XY plane;
the laser tracker target ball hole is used for carrying out contact measurement on the laser tracker and a target corresponding to the laser tracker, and a target coordinate system is established through the hole center point coordinate;
calculating to obtain straight line equations of all straight lines in 7 cross triangle patterns in a target coordinate system according to the design size of the cross triangle patterns;
s2 comprises the following steps: the experimental platform comprises a three-dimensional coordinate measuring system, a linear array camera, a 2D laser scanner, a linear array camera illumination light source and a mixed calibration target, wherein the three-dimensional coordinate measuring system comprises a laser tracker or a theodolite industrial measuring system, and the mixed calibration target is arranged on a fixed frame through a fixing piece, so that the three-dimensional coordinate measuring system, the linear array camera and the 2D laser scanner can observe the mixed calibration target.
S3 comprises the following steps:
placing the mixed calibration target in the field of view of the linear array camera, polishing by matching with a linear array camera illumination light source, enabling the linear array camera to acquire clear calibration pictures, measuring the mixed calibration target by using a three-dimensional coordinate measuring system after the linear array camera acquires the calibration pictures, taking the coordinate system of the three-dimensional coordinate measuring system as a reference standard of an overall coordinate system, calculating the target coordinate system under the reference standard of the overall coordinate, fitting out a target plane at the same time, and taking the target coordinate system as a world coordinate system when solving the internal reference calibration parameters of the linear array camera;
when the linear array camera is actually used for shooting, the camera view plane is intersected with the cross ratio triangle on the mixed calibration target, and 15 characteristic points areThe image points are represented as black straight lines on an actual calibration image, and the image points corresponding to the characteristic points are extracted through an edge detection algorithm>Is defined by the pixel coordinates of (a);
calculating to obtain the world coordinates of the feature points corresponding to the image points by the principle of cross-ratio invariance and the linear equation of each line segment on the target planeWherein, feature point->Is the intersection point on the ith straight line on the target plane.
S4 comprises the following steps:
in linear array cameras, world coordinates of feature points are knownThe correspondence between the pixel coordinates (u, v) of its corresponding image point is as follows:
the above is a camera imaging model that does not take into account lens distortion, wherein,for rotating matrix->9 elements of->Comprising a translation vector of three elements +.>The formula (1) is a view plane equation of the linear array camera, and the formula (2) is an equation conforming to the central projection relation>Representing the rotational translation from the world coordinate system to the camera coordinate system, v is the image principal point coordinate value, v 0 Calibrating an internal parameter to be solved for the linear array camera, wherein the internal parameter represents the main point offset>Representing the corresponding physical size of the pixel point in the y-axis direction;
taking the distortion of the linear array camera in the y direction into consideration, establishing a linear array camera distortion model only taking the first-order radial distortion into consideration as follows:
wherein Is distortion in the y-axis; />Is a first order radial distortion coefficient; />Representing the component of the lens distortion value on the v axis under the pixel coordinate system;
and (3) synthesizing (1), 2 and 3) to obtain a complete linear array camera imaging model as follows:
s5 comprises the following steps:
based on the corresponding relation between the feature points and the image points and the imaging model of the linear array camera, calculating the geometric imaging model parameters of the linear array camera by adopting a two-step method;
irrespective of the influence of lens distortion byDirect linear transformation method and direct solution、v 0 And external parametersApproximation, wherein->The rotation angles are obtained according to the camera rotation matrix;
then taking the acquired internal parameters and external parameters of the linear array camera as initial values, and acquiring a final linear array camera calibration result containing first-order radial distortion by using an LM nonlinear optimization method under a strict imaging model;
a variable is extracted from formula (1) and substituted into formula (2) to obtain formula (5):
substituting combined parameters for the camera internal and external parameters in formula (5) gives formula (6):
from the properties of the rotation matrix, and considering that when the origin of the target coordinate system is below the view plane the view plane intersects all the straight lines on the target, the following formula is obtained:
where the unknowns are the parameters of the plane equation; at this point they are obtained by solving;
introduce a scaling factor λ and let the scaled q_i denote q_i after rescaling; combining formula (8) with the properties of the rotation matrix gives formula (9):
from the rotation matrix properties, formula (10) can be obtained:
substituting the world coordinates of the feature points into formula (7) to obtain formula (11):
formula (11) is solved by the SVD method; the solved parameters satisfy the system up to an arbitrary scaling, and the parameters are thus obtained;
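The SVD solve described here is a standard null-space extraction: the solution of the homogeneous system is the right singular vector for the smallest singular value, defined only up to scale. A minimal sketch (the stacked coefficient matrix `A` of formula (11) is assumed as input):

```python
import numpy as np

def solve_homogeneous(A):
    """Solve A q = 0 in the least-squares sense via SVD, as for formula (11).

    The solution is the right singular vector belonging to the smallest
    singular value; any rescaling of it also satisfies the system.
    """
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]  # unit-norm representative of the solution ray
```

The scaling ambiguity is exactly what the tz > 0 constraint later removes.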
substituting the calculated exterior-orientation element values into formula (11) and solving the simultaneous equations:
the external parameter tz represents the z coordinate of the target coordinate system origin in the linear array camera coordinate system; since the target is always in front of the camera while images are captured, tz > 0; this constraint eliminates the ambiguity of the scaling factor in formula (12), so that all initial values of the internal and external orientation elements of the linear array camera are uniquely determined;
the initial parameter values are then refined by nonlinear optimization to obtain the first-order radial distortion parameter k1 and the optimized internal and external parameters of the linear array camera; the coordinates of the 3D points re-projected onto the 2D image plane are differenced against the pixels of the actually extracted image points, i.e. the projected position and the observed position of each 3D point are compared, and nonlinear optimization is performed by establishing an objective function that minimizes the re-projection error, as shown in formula (13):
where the minimized function represents the linear array camera calibration parameter equation of the first nonlinear optimization solution,
the sum runs over the feature points of all target placement positions: P_ij is the jth feature point of the ith target placement position, v_ij is the v-axis coordinate of the pixel point extracted for that feature point, and the pixel coordinate of the re-projection point computed from the calibrated camera parameters is given by formula (14):
The final calibration results of the internal and external parameters and the distortion parameters of the linear array camera under the strict imaging model are obtained with the LM nonlinear optimization method, and the linear array camera coordinate system is unified to the overall coordinate reference through its external parameters relative to the mixed calibration target.
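As a sketch of the LM refinement of formulas (13)-(14): the fragment below refines only (fy, v0, k1) from feature points already expressed in camera coordinates; the patent additionally refines the full pose, so the reduced parameter set, the function name, and the cubic distortion form are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def refine_intrinsics(pts_cam, v_obs, x0):
    """Minimize the re-projection residual of formula (13) with LM.

    pts_cam : (n, 3) feature points in camera coordinates
    v_obs   : (n,) extracted v-axis pixel coordinates
    x0      : initial guess (fy, v0, k1) from the first (DLT) step
    """
    def residuals(p):
        fy, v0, k1 = p
        v_ideal = fy * pts_cam[:, 1] / pts_cam[:, 2] + v0
        # projected minus observed pixel position, per feature point
        return (v_ideal + k1 * (v_ideal - v0) ** 3) - v_obs
    return least_squares(residuals, x0, method="lm").x
```

The DLT result of the first step serves as `x0`, matching the two-step scheme of S5.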
S6 comprises the following steps:
placing the mixed calibration target multiple times, the three-dimensional coordinate measuring system observes the target, establishes the target coordinate system and obtains the target plane equation, while the 2D laser scanner scans the mixed calibration target at each placement to obtain scan-line point cloud data on the target plane; the scanning plane of the 2D laser scanner is the plane YOZ, i.e. the x-axis coordinate of every measured point is 0, so a point measured by the 2D laser scanner can be written as a point p_l with zero x-component; let p_c denote the corresponding point in the linear array camera coordinate system; the transformation from linear array camera point coordinates to 2D laser scanner point coordinates is expressed by formula (16):
the transformation from 2D laser scanner point coordinates to linear array camera point coordinates is given by formula (17):
where R and T are respectively the rotation matrix and the translation vector from the linear array camera coordinate system to the 2D laser scanner coordinate system, and formula (17) is the basic equation of point coordinate transformation between the 2D laser scanner and the linear array camera.
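Formulas (16)-(17) are a rigid transform and its inverse. A minimal sketch, with the names `R_cl`, `T_cl` assumed for the scanner-to-camera rotation and translation:

```python
import numpy as np

def scanner_to_camera(p_l, R_cl, T_cl):
    """Formula (17): map a scanner point p_l = (0, y_l, z_l) into the
    linear array camera frame, p_c = R_cl @ p_l + T_cl."""
    return R_cl @ p_l + T_cl

def camera_to_scanner(p_c, R_cl, T_cl):
    """Formula (16), the inverse mapping: p_l = R_cl^T @ (p_c - T_cl)."""
    return R_cl.T @ (p_c - T_cl)
```

Composing the two mappings returns the original point, which is a quick sanity check on any calibrated (R, T) pair.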
S7 comprises the following steps:
a target plane is set up that the three-dimensional coordinate measuring system and the 2D laser scanner can observe simultaneously; the 2D laser scanner observes the target and obtains a scan line formed by a number of laser points on the target plane; on the observed target plane, uniformly distributed plane points are measured with the three-dimensional coordinate measuring system, the target plane equation in the overall reference coordinate system is computed by fitting, and it is then transferred to the plane equation in the linear array camera coordinate system, as shown in formula (18):
where n is the unit normal vector of the target plane, satisfying ||n|| = 1, and the point in the equation is expressed in camera point coordinates;
From formula (18), and from the fact that the dot product of the unit normal vector of a plane with any point on the plane equals the distance from the origin to the plane, a geometric constraint equation in vector form is derived as formula (19):
where the first term is the plane vector of the target and the second is a point on the target plane expressed in the camera coordinate system;
the specific point-plane constraint between the linear array camera and the 2D laser scanner is obtained: the laser point lies on the target plane, and the dot product of the vector from the camera origin to the laser point with the normal vector of the target plane equals the distance from the camera origin to the target plane; substituting the laser point coordinates into formula (19) via the basic point coordinate transformation equation of formula (17) gives formula (20):
this completes the construction of the geometric constraint model between the linear array camera and the 2D laser scanner;
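The point-plane constraint of formula (20) can be written as a single residual; a sketch with assumed names (n the unit plane normal, d the camera-origin-to-plane distance, R and T the scanner-to-camera transform):

```python
import numpy as np

def point_plane_residual(p_l, n, d, R, T):
    """Formula (20): a scanner point mapped into the camera frame lies on
    the target plane, so n . (R @ p_l + T) - d must vanish."""
    return float(n @ (R @ p_l + T) - d)
```

A zero residual means the transformed laser point satisfies the target plane equation exactly; nonzero values are what the later least-squares and LM steps minimize.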
all points measured by the 2D laser scanner have a zero x-coordinate; using this property, the point coordinates are rewritten in reduced two-component form, and formula (20) is rewritten as shown in formula (21):
the target is placed j times and observed j times by the 2D laser scanner; substituting the coordinates of the i laser points of the scanner on each target into formula (21) yields the linear system shown in formula (22):
in the formula:
b is the modulus of the target plane vector corresponding to each scanner laser point; the least-squares solution of formula (22) is formula (23):
SVD is used to decompose the estimate and approximate a rotation matrix: the singular value matrix is replaced by the identity matrix and the product is recomputed to obtain the rotation matrix, giving the closed-form joint calibration result R and T.
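The SVD rounding step described here, replacing the singular values by the identity to obtain the nearest rotation, can be sketched as follows (the reflection guard is an assumption the text does not spell out):

```python
import numpy as np

def nearest_rotation(M):
    """Approximate a rotation from an estimated 3x3 matrix: decompose
    M = U S V^T by SVD, replace S with the identity, and recompute
    R = U V^T, as in the step following formula (23)."""
    U, _, Vt = np.linalg.svd(M)
    if np.linalg.det(U @ Vt) < 0:  # guard against a reflection
        U[:, -1] = -U[:, -1]
    return U @ Vt
```

The result is orthonormal with determinant +1, i.e. a proper rotation, even when the least-squares estimate is noisy.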
S8 comprises the following steps:
because the point coordinates measured by the scanner on the target plane do not fall exactly on the plane, nonlinear optimization of the sum of the point-to-plane residuals of the actual points is applied to reduce the influence of noise errors, further optimizing R and T; the objective function established to minimize the sum of residuals is formula (25):
the objective function of the joint calibration parameters is solved by nonlinear optimization using the LM algorithm, where n_j is the normal vector of the jth target and p_ij is the ith point on the jth target placement position;
The final joint calibration results R and T are obtained after nonlinear optimization; at this point, the linear array camera and the 2D laser scanner have achieved joint calibration with the aid of the three-dimensional coordinate measuring system.
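A sketch of the formula-(25) refinement, minimizing the point-to-plane residuals over (R, T) with LM; the rotation-vector parameterization and all names are assumptions, and the planes are assumed already expressed in the camera frame:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_joint_calibration(points, normals, dists, rvec0, T0):
    """Refine (R, T) by minimizing n_j . (R @ p_ij + T) - d_j over all
    scanner points with LM, as in formula (25).

    points  : list of (k_j, 3) scanner points per target placement j
    normals : list of unit target-plane normals n_j (camera frame)
    dists   : list of camera-origin-to-plane distances d_j
    rvec0, T0 : initial guess from the closed-form solution
    """
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        T = x[3:]
        res = []
        for P, n, d in zip(points, normals, dists):
            res.extend(n @ (R @ P.T + T[:, None]) - d)  # one residual per point
        return np.asarray(res)
    x = least_squares(residuals, np.concatenate([rvec0, T0]), method="lm").x
    return Rotation.from_rotvec(x[:3]).as_matrix(), x[3:]
```

At least three non-parallel target placements are needed for the six extrinsic parameters to be fully constrained, which matches the multiple-placement procedure of S6-S8.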
The technical flow of the invention is shown in Fig. 1 and the target structure in Fig. 2; in Fig. 3, the plurality of linear array cameras of the tunnel comprehensive detection equipment surround the upper part of the 2D laser scanner in a semicircular array, and an actual linear array camera calibration image is shown in Fig. 4. The measurement of the target, as shown in Fig. 4, is carried out jointly by the 2D laser scanner, the three-dimensional coordinate measuring system and the linear array camera, where n is the plane normal vector.
In practical experiments, the invention uses a Dalsa LA-CM-08k08A linear array camera, a Long-walk LS9005A lens, and a 2D laser scanner; the parameters of the linear array camera and the lens are shown in Table 1.
TABLE 1
The calibration results of the linear array camera are shown in table 2.
TABLE 2
The combined calibration results of the linear array camera and the laser section scanner are shown in table 3.
TABLE 3
The above embodiments are only for illustrating the technical aspects of the present invention, not for limiting the same, and although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may be modified or some or all of the technical features may be replaced with other technical solutions, which do not depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (6)
1. A linear array camera and 2D laser scanner joint calibration method is characterized by comprising the following steps:
s1, manufacturing a mixed calibration target, and establishing a target coordinate system;
s2, building an experiment platform;
s3, acquiring calibration data of the linear array camera, and obtaining world coordinates of feature points corresponding to image points according to the principle of cross ratio invariance;
s4, establishing a linear array camera imaging model;
S5, calculating the calibration parameters of the linear array camera by the two-step calibration method;
s6, determining a basic equation of point coordinate conversion of the laser section scanner and the linear array camera;
placing the mixed calibration target multiple times, the three-dimensional coordinate measuring system observes the target, establishes the target coordinate system and obtains the target plane equation, while the 2D laser scanner scans the mixed calibration target at each placement to obtain scan-line point cloud data on the target plane; the scanning plane of the 2D laser scanner is the plane YOZ, i.e. the x-axis coordinate of every measured point is 0, so a point measured by the 2D laser scanner can be written as a point p_l with zero x-component; let p_c denote the corresponding point in the linear array camera coordinate system; the transformation from linear array camera point coordinates to 2D laser scanner point coordinates is expressed by formula (16):
the transformation from 2D laser scanner point coordinates to linear array camera point coordinates is given by formula (17):
where R and T are respectively the rotation matrix and the translation vector from the linear array camera coordinate system to the 2D laser scanner coordinate system, and formula (17) is the basic equation of point coordinate transformation between the 2D laser scanner and the linear array camera;
s7, scanner data acquisition is performed, and a joint calibration geometric constraint model is constructed;
a target plane is set up that the three-dimensional coordinate measuring system and the 2D laser scanner can observe simultaneously; the 2D laser scanner observes the target and obtains a scan line formed by a number of laser points on the target plane; on the observed target plane, uniformly distributed plane points are measured with the three-dimensional coordinate measuring system, the target plane equation in the overall reference coordinate system is computed by fitting, and it is then transferred to the plane equation in the linear array camera coordinate system, as shown in formula (18):
where n is the unit normal vector of the target plane, satisfying ||n|| = 1, and the point in the equation is expressed in camera point coordinates;
From formula (18), and from the fact that the dot product of the unit normal vector of a plane with any point on the plane equals the distance from the origin to the plane, a geometric constraint equation in vector form is derived as formula (19):
where the first term is the plane vector of the target and the second is a point on the target plane expressed in the camera coordinate system;
the specific point-plane constraint between the linear array camera and the 2D laser scanner is obtained: the laser point lies on the target plane, and the dot product of the vector from the camera origin to the laser point with the normal vector of the target plane equals the distance from the camera origin to the target plane; substituting the laser point coordinates into formula (19) via the basic point coordinate transformation equation of formula (17) gives formula (20):
this completes the construction of the geometric constraint model between the linear array camera and the 2D laser scanner;
all points measured by the 2D laser scanner have a zero x-coordinate; using this property, the point coordinates are rewritten in reduced two-component form, and formula (20) is rewritten as shown in formula (21):
the target is placed j times and observed j times by the 2D laser scanner; substituting the coordinates of the i laser points of the scanner on each target into formula (21) yields the linear system shown in formula (22):
in the formula:
b is the modulus of the target plane vector corresponding to each scanner laser point; the least-squares solution of formula (22) is formula (23):
SVD is used to decompose the estimate and approximate a rotation matrix: the singular value matrix is replaced by the identity matrix and the product is recomputed to obtain the rotation matrix, giving the closed-form joint calibration result R and T;
S8, nonlinear optimization accurately solves the joint calibration parameters.
2. The linear camera and 2D laser scanner joint calibration method according to claim 1, wherein S1 comprises:
the mixed calibration target comprises a fixing piece and a plane target body, wherein the target body comprises a plurality of diffuse reflection mark points, a 1.5-inch square laser tracker target ball hole, a 0.5-inch square laser tracker target ball hole and 7 cross-ratio triangle patterns, and the 7 cross-ratio triangle patterns comprise 8 straight lines and 7 oblique lines;
the fixing piece supports the target main body, and the target main body is fixed through the fixing piece after the target position is moved in the process of calibration data acquisition;
the diffuse reflection mark points are suitable for non-contact measurement by a theodolite industrial measurement system, which obtains their three-dimensional coordinates; the mark points are uniformly distributed on the target main body, three of which correspond respectively to the O point, X point and Y point of the target coordinate system: the line through O and X is the X axis, the line through O and Y is the Y axis, the O point is the origin of the target coordinate system, and the Z axis is defined upward through the O point, perpendicular to the O-XY plane;
the laser tracker target ball holes are used for contact measurement with the laser tracker and its corresponding target ball, and the target coordinate system is established from the hole center point coordinates;
calculating to obtain straight line equations of all straight lines in 7 cross triangle patterns in a target coordinate system according to the design size of the cross triangle patterns;
s2 comprises the following steps: the experimental platform comprises a three-dimensional coordinate measuring system, a linear array camera, a 2D laser scanner, a linear array camera illumination light source and a mixed calibration target, wherein the three-dimensional coordinate measuring system comprises a laser tracker or a theodolite industrial measuring system, and the mixed calibration target is arranged on a fixed frame through a fixing piece, so that the three-dimensional coordinate measuring system, the linear array camera and the 2D laser scanner can observe the mixed calibration target.
3. The linear camera and 2D laser scanner joint calibration method according to claim 2, wherein S3 comprises:
placing the mixed calibration target in the field of view of the linear array camera and illuminating it with the linear array camera illumination light source so that the linear array camera acquires clear calibration pictures; after the linear array camera acquires the calibration pictures, the mixed calibration target is measured with the three-dimensional coordinate measuring system, whose coordinate system is taken as the overall coordinate reference; the target coordinate system under the overall coordinate reference is calculated, the target plane is fitted at the same time, and the target coordinate system is taken as the world coordinate system when solving the internal calibration parameters of the linear array camera;
when the linear array camera actually captures images, the camera view plane intersects the cross-ratio triangles on the mixed calibration target at 15 feature points; their image points appear as black line segments on the actual calibration image, and the pixel coordinates of the image points corresponding to the feature points are extracted by an edge detection algorithm;
the world coordinates of the feature points corresponding to the image points are calculated from the principle of cross-ratio invariance and the line equation of each line segment on the target plane, where feature point P_i is the intersection point on the ith straight line of the target plane.
4. The method for calibrating a line camera and a 2D laser scanner in combination according to claim 3, wherein S4 comprises:
for a linear array camera, the correspondence between the known world coordinates of a feature point and the pixel coordinates (u, v) of its corresponding image point is as follows:
the above is the camera imaging model without lens distortion, where r1 to r9 are the nine elements of the rotation matrix R and tx, ty, tz are the three elements of the translation vector T; formula (1) is the view-plane equation of the linear array camera and formula (2) is the central-projection equation; (R, T) represents the rotation and translation from the world coordinate system to the camera coordinate system; v is the pixel coordinate of the image point and v0 is the principal-point offset, an internal parameter to be solved in the linear array camera calibration; dy denotes the physical size of a pixel in the y-axis direction;
taking into account the distortion of the linear array camera in the y direction, a linear array camera distortion model considering only first-order radial distortion is established as follows:
where δy is the distortion in the y-axis direction, k1 is the first-order radial distortion coefficient, and δv is the component of the lens distortion on the v axis of the pixel coordinate system;
combining formulas (1), (2) and (3) yields the complete linear array camera imaging model:
5. the method for calibrating a line camera and a 2D laser scanner in combination according to claim 4, wherein S5 comprises:
based on the correspondence between the feature points and the image points and on the linear array camera imaging model, the geometric imaging model parameters of the linear array camera are calculated by a two-step method;
first, ignoring the influence of lens distortion, approximate values of the internal parameters (including v0) and of the external parameters (rotation angles and translation) are solved directly by the direct linear transformation method, where the rotation angles are obtained from the camera rotation matrix;
Then, taking the obtained internal and external parameters of the linear array camera as initial values, the final linear array camera calibration result including first-order radial distortion is obtained under the strict imaging model with the LM nonlinear optimization method;
a variable is extracted from formula (1) and substituted into formula (2) to obtain formula (5):
substituting combined parameters for the camera internal and external parameters in formula (5) gives formula (6):
from the properties of the rotation matrix, and considering that when the origin of the target coordinate system is below the view plane the view plane intersects all the straight lines on the target, the following formula is obtained:
where the unknowns are the parameters of the plane equation; at this point they are obtained by solving;
introduce a scaling factor λ and let the scaled q_i denote q_i after rescaling; combining formula (8) with the properties of the rotation matrix gives formula (9):
from the rotation matrix properties, formula (10) can be obtained:
substituting the world coordinates of the feature points into formula (7) to obtain formula (11):
formula (11) is solved by the SVD method; the solved parameters satisfy the system up to an arbitrary scaling, and the parameters are thus obtained;
substituting the calculated exterior-orientation element values into formula (11) and solving the simultaneous equations:
the external parameter tz represents the z coordinate of the target coordinate system origin in the linear array camera coordinate system; since the target is always in front of the camera while images are captured, tz > 0; this constraint eliminates the ambiguity of the scaling factor in formula (12), so that all initial values of the internal and external orientation elements of the linear array camera are uniquely determined;
the initial parameter values are then refined by nonlinear optimization to obtain the first-order radial distortion parameter k1 and the optimized internal and external parameters of the linear array camera; the coordinates of the 3D points re-projected onto the 2D image plane are differenced against the pixels of the actually extracted image points, i.e. the projected position and the observed position of each 3D point are compared, and nonlinear optimization is performed by establishing an objective function that minimizes the re-projection error, as shown in formula (13):
where the minimized function represents the linear array camera calibration parameter equation of the first nonlinear optimization solution,
the sum runs over the feature points of all target placement positions: P_ij is the jth feature point of the ith target placement position, v_ij is the v-axis coordinate of the pixel point extracted for that feature point, and the pixel coordinate of the re-projection point computed from the calibrated camera parameters is given by formula (14):
The final calibration results of the internal and external parameters and the distortion parameters of the linear array camera under the strict imaging model are obtained with the LM nonlinear optimization method, and the linear array camera coordinate system is unified to the overall coordinate reference through its external parameters relative to the mixed calibration target.
6. The method for calibrating a linear camera and a 2D laser scanner in combination according to claim 1, wherein S8 comprises:
because the point coordinates measured by the scanner on the target plane do not fall exactly on the plane, nonlinear optimization of the sum of the point-to-plane residuals of the actual points is applied to reduce the influence of noise errors, further optimizing R and T; the objective function established to minimize the sum of residuals is formula (25):
the objective function of the joint calibration parameters is solved by nonlinear optimization using the LM algorithm, where n_j is the normal vector of the jth target and p_ij is the ith point on the jth target placement position.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310323918.9A CN116051659B (en) | 2023-03-30 | 2023-03-30 | Linear array camera and 2D laser scanner combined calibration method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116051659A CN116051659A (en) | 2023-05-02 |
CN116051659B true CN116051659B (en) | 2023-06-13 |