CN116051659B - Linear array camera and 2D laser scanner combined calibration method

Info

Publication number: CN116051659B (application CN202310323918.9A); authority: CN (China); legal status: Active (granted)
Other versions: CN116051659A (application publication)
Prior art keywords: target, point, linear array camera, formula
Inventors: 石波, 王聪, 赵凯, 杨密, 林康
Assignee (original and current): Shandong University of Science and Technology
Application CN202310323918.9A filed by Shandong University of Science and Technology; published as CN116051659A and, on grant, as CN116051659B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Abstract

The invention discloses a joint calibration method for a linear array camera and a 2D laser scanner, which belongs to the technical field of measuring distance, level or azimuth and is used for the joint calibration of the two sensors.

Description

Linear array camera and 2D laser scanner combined calibration method
Technical Field
The invention discloses a combined calibration method of a linear array camera and a 2D laser scanner, and belongs to the technical field of measuring distance, level or azimuth.
Background
Detection technologies based on photogrammetry and on mobile laser scanning have greatly improved the efficiency of tunnel inspection. Typically, an industrial camera and a 2D laser scanner are integrated on a tunnel inspection vehicle to rapidly acquire images and point cloud data of the tunnel lining surface, from which defects are identified and maintained. To fuse the image and point cloud data into a finer and more accurate real-scene three-dimensional model of the tunnel, the two sensors must be jointly calibrated to obtain the external parameters between the camera and the 2D laser scanner.
With its high resolution, high scanning frequency and one-dimensional imaging, the linear array camera is well suited to long, narrow and continuous inspection scenes such as tunnels. However, because a static linear array camera captures only a single line, and an image is built up from repeated single scan lines, it is difficult to determine which feature points correspond to which image points. Conventional linear array camera calibration methods therefore rely on the principle of cross-ratio invariance: a suitable target pattern is designed, the world coordinates of the feature points are solved, and static calibration of the linear array camera is thereby achieved.
For joint calibration of the external parameters between a camera and a laser scanner, current research focuses mainly on area-array camera with 3D laser scanner, and area-array camera with 2D laser scanner. Joint calibration of an area-array camera and a 3D laser scanner generally uses a calibration object: a target such as a planar checkerboard, cube or sphere is placed in the common field of view of the two sensors, corresponding point, line and surface features are extracted from the image and the point cloud, geometric constraints are constructed, and the external parameters are solved. A 2D laser scanner, by contrast, obtains only one contour line of the surrounding environment per scan, and it is difficult to find features or common points on such a linear point cloud. Existing area-array camera and 2D laser scanner calibration methods therefore observe a checkerboard calibration object or a V-shaped plate with both sensors, find the normal vector of the plane, and establish geometric constraints among the sensors for further solving.
Because of the special data acquisition modes of the linear array camera and the 2D laser scanner, the two sensors share no common data, and it is even harder to find directly corresponding features or geometric constraints between the linear array camera image and the 2D laser scanner's linear point cloud. Joint calibration therefore remains difficult for a system that integrates a linear array camera and a 2D laser scanner.
Disclosure of Invention
The invention provides a linear array camera and 2D laser scanner calibration method, which solves the problem that joint calibration is difficult because there is no common field of view between a statically calibrated linear array camera and a 2D laser scanner, and the image and point cloud data have no directly corresponding features or geometric constraint relations.
A linear array camera and 2D laser scanner joint calibration method comprises the following steps:
S1, manufacturing a mixed calibration target and establishing a target coordinate system;
S2, building an experiment platform;
S3, acquiring linear array camera calibration data and obtaining the world coordinates of the feature points corresponding to the image points according to the principle of cross-ratio invariance;
S4, establishing the linear array camera imaging model;
S5, calculating the calibration parameters of the linear array camera by a two-step method;
S6, determining the basic equation of point coordinate conversion between the 2D laser scanner and the linear array camera;
S7, acquiring scanner data and constructing the joint calibration geometric constraint model;
S8, accurately solving the joint calibration parameters by nonlinear optimization.
S1 comprises the following steps:
The mixed calibration target comprises a fixing piece and a planar target body. The target body carries a number of diffuse-reflection mark points, a 1.5-inch laser tracker target-ball hole, a 0.5-inch laser tracker target-ball hole and 7 cross-ratio triangle patterns, the 7 patterns comprising 8 straight lines and 7 oblique lines;
The fixing piece supports the target body; during calibration data acquisition, whenever the target is moved to a new position it is fixed in place again by the fixing piece;
The diffuse-reflection mark points are suited to non-contact measurement by a theodolite industrial measuring system, which yields their three-dimensional coordinates. The mark points are evenly distributed on the target body; three of them correspond to the O, X and Y points of the target coordinate system, the line OX being the X axis, the line OY the Y axis, O the coordinate origin, and the Z axis passing through O perpendicular to the O-XY plane and pointing upward;
The laser tracker target-ball holes are used for contact measurement by the laser tracker with its matching target ball, and a target coordinate system is established from the hole-centre point coordinates;
According to the designed dimensions of the cross-ratio triangle patterns, the equations, in the target coordinate system, of all straight lines in the 7 patterns are calculated.
S2 comprises the following steps: the experimental platform comprises a three-dimensional coordinate measuring system, a linear array camera, a 2D laser scanner, a linear array camera illumination light source and the mixed calibration target. The three-dimensional coordinate measuring system comprises a laser tracker or a theodolite industrial measuring system; the mixed calibration target is mounted on a fixed frame through the fixing piece so that the three-dimensional coordinate measuring system, the linear array camera and the 2D laser scanner can all observe it.
S3 comprises the following steps:
Place the mixed calibration target in the field of view of the linear array camera and light it with the linear array camera illumination light source so that the camera acquires clear calibration pictures. After the pictures are acquired, measure the mixed calibration target with the three-dimensional coordinate measuring system, take its coordinate system as the overall coordinate reference, compute the target coordinate system under this reference, and fit the target plane at the same time; the target coordinate system serves as the world coordinate system when solving the internal calibration parameters of the linear array camera;
When the linear array camera actually shoots, the camera view plane intersects the cross-ratio triangles on the mixed calibration target in 15 feature points, which appear as black line segments on the actual calibration image; the pixel coordinates of the image points corresponding to the feature points are extracted by an edge detection algorithm.
The world coordinates of the feature points corresponding to the image points are then calculated from the principle of cross-ratio invariance and the line equation of each segment on the target plane, the feature point on the i-th straight line of the target plane being its intersection with the view plane.
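As a concrete illustration of the cross-ratio step, the sketch below recovers the world coordinate of an unknown collinear point from three known points and the cross ratio measured in the image. The 1D projective map u(t) used in the usage example is a made-up stand-in for the camera's central projection; it is not from the patent.

```python
def cross_ratio(a, b, c, d):
    """Cross ratio (A,B;C,D) = ((c-a)/(c-b)) / ((d-a)/(d-b)) of collinear scalar coords."""
    return ((c - a) / (c - b)) / ((d - a) / (d - b))

def recover_point(a, b, d, cr):
    """Invert the cross ratio: given world coords a, b, d and the cross ratio cr
    measured in the image (invariant under projection), solve for the unknown c."""
    k = cr * (d - a) / (d - b)       # cr = ((c-a)(d-b)) / ((c-b)(d-a))
    return (a - k * b) / (1.0 - k)   # linear in c after rearranging
```

Because the cross ratio survives central projection, the value computed from pixel coordinates equals the value computed from target-plane coordinates; this is what makes the world coordinates of the feature points solvable from a single scan line.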
S4 comprises the following steps:
For the linear array camera, the correspondence between the known world coordinates (x_w, y_w, z_w) of a feature point and the pixel coordinate v of its corresponding image point is as follows (writing r_1…r_9 for the nine elements of the rotation matrix R and (t_x, t_y, t_z) for the three elements of the translation vector T):
r_1·x_w + r_2·y_w + r_3·z_w + t_x = 0 (1);
v = v_0 + (f/d_y)·(r_4·x_w + r_5·y_w + r_6·z_w + t_y)/(r_7·x_w + r_8·y_w + r_9·z_w + t_z) (2);
The above is a camera imaging model that does not take lens distortion into account. Formula (1) is the view-plane equation of the linear array camera; formula (2) expresses the central projection relation. [R|T] represents the rotation and translation from the world coordinate system to the camera coordinate system; v is the pixel coordinate of the image point; the principal point offset v_0 is an internal parameter to be solved in calibration; f is the focal length and d_y the corresponding physical size of a pixel in the y-axis direction.
Considering the distortion of the linear array camera in the y direction, a linear array camera distortion model taking only first-order radial distortion into account is established:
δ_v = k_1·(v − v_0)^3 (3);
where v is the ideal projected coordinate of formula (2), δ_v is the distortion along the y axis, i.e. the component of the lens distortion value on the v axis in the pixel coordinate system, and k_1 is the first-order radial distortion coefficient.
Combining (1), (2) and (3) gives the complete linear array camera imaging model:
v_d = v + δ_v (4);
where v_d is the actually observed pixel coordinate.
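A minimal sketch of the imaging model of S4, assuming the standard line-scan form for formulas (1) and (2) and a cubic term for the first-order radial distortion in v; the symbol names (fy for f/d_y, k1, v0) and the exact distortion form are assumptions, not taken verbatim from the patent.

```python
import numpy as np

def project_linescan(p_w, R, t, fy, v0, k1=0.0, tol=1e-6):
    """Sketch of the line-scan model: eq (1) demands the point lie on the
    x_c = 0 view plane; eq (2) is the 1D central projection; the cubic term
    stands in for first-order radial distortion in v."""
    p_c = R @ np.asarray(p_w, float) + np.asarray(t, float)  # world -> camera frame
    if abs(p_c[0]) > tol:
        return None                       # off the view plane: not imaged
    v = v0 + fy * p_c[1] / p_c[2]         # fy stands for f/d_y
    return v + k1 * (v - v0) ** 3         # add first-order radial distortion
```

Returning None for points off the view plane mirrors the fact that a static linear array camera only images the single plane of formula (1).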
S5 comprises the following steps:
Based on the correspondence between feature points and image points and on the linear array camera imaging model, the geometric imaging model parameters of the linear array camera are calculated by a two-step method.
First, ignoring the influence of lens distortion, the internal parameters f/d_y and v_0 and approximate values of the external parameters [R|T] are solved directly by the direct linear transformation (DLT) method, the rotation angles being obtained from the camera rotation matrix.
Then, taking the internal and external parameters so obtained as initial values, the final linear array camera calibration result including first-order radial distortion is obtained under the strict imaging model by the LM nonlinear optimization method.
From formula (1) a term is extracted and substituted into formula (2) to obtain formula (5):
(formula (5): image not reproduced in source);
The camera internal and external parameters in (5) are replaced by the combined parameters q_i to obtain formula (6), the q_i being defined as shown in formula (7):
(formulas (6) and (7): images not reproduced in source);
From the nature of the rotation matrix, and considering at the same time that, when the origin of the target coordinate system lies below the view plane, the view plane intersects all straight lines on the target, the following formula is obtained:
(formula (8): image not reproduced in source);
where the coefficients in (8) are the parameters of the plane equation; solving thus far yields the combined parameters. Let λ be a scaling factor and q̄_i the scaled result of q_i; formula (8), combined with the properties of the rotation matrix, then gives formula (9), and the rotation matrix properties further give formula (10):
(formulas (9) and (10): images not reproduced in source);
Substituting the world coordinates of the feature points into formula (7) gives formula (11):
(formula (11): image not reproduced in source);
Formula (11) is solved by the SVD method; the solved parameters satisfy it under arbitrary scaling. Bringing the calculated exterior orientation element values into formula (11) and solving the simultaneous equations gives formula (12):
(formula (12): image not reproduced in source);
The external parameter t_z represents the z coordinate of the target coordinate system origin in the linear array camera coordinate system. Since the target is always in front of the camera while images are shot, t_z > 0; this constraint eliminates the scaling-factor ambiguity in formula (12), so that all initial values of the internal and external orientation elements of the linear array camera are uniquely determined;
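The SVD solve and sign fixing just described can be sketched as follows; the toy matrix in the usage example and the index of t_z within the parameter vector are assumptions for illustration only.

```python
import numpy as np

def solve_homogeneous(A, tz_index):
    """Solve A q = 0 in the least squares sense via SVD: take the right singular
    vector of the smallest singular value, then fix the overall sign with the
    physical constraint t_z > 0 (target in front of the camera)."""
    _, _, Vt = np.linalg.svd(A)
    q = Vt[-1]
    return -q if q[tz_index] < 0 else q
```

The returned vector is still only defined up to positive scale, which is exactly the ambiguity the t_z > 0 constraint (together with the rotation-matrix norm conditions) removes in the text.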
The initial parameter values are then refined by nonlinear optimization to obtain the first-order radial distortion parameter k_1 and the optimized internal and external parameters of the linear array camera: the coordinates of the 3D points re-projected onto the 2D image plane are differenced with the actually extracted image point pixels, i.e. the projected position and the observed position of each 3D point are compared, and nonlinear optimization is performed by establishing an objective function that minimizes the re-projection error, formula (13):
min Σ_i Σ_j (v̂_ij − v_ij)² (13);
where the minimization runs over the vector of all parameters to be optimized in this first nonlinear optimization of the linear array camera calibration parameters; over all target placement positions, v_ij is the v-axis coordinate of the pixel point extracted for the j-th feature point of the i-th target placement position, and v̂_ij is the pixel coordinate of the corresponding re-projection point obtained from the calculated camera parameters, given by formula (14):
(formula (14): image not reproduced in source; v̂_ij is computed from the complete imaging model of formula (4));
The final calibration results for the internal, external and distortion parameters of the linear array camera under the strict imaging model are obtained with the LM nonlinear optimization method, and the external parameters between the linear array camera and the mixed calibration target are used to bring the linear array camera coordinate system into the overall coordinate reference.
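The full LM refinement adjusts all intrinsics, extrinsics and k_1 at once; as a reduced sketch under stated assumptions, the Gauss-Newton loop below refines only (f/d_y, v_0) against reprojection residuals, with the known y_c/z_c ratios of the feature points as a made-up input.

```python
import numpy as np

def refine_fy_v0(ratios, v_obs, fy, v0, iters=10):
    """Gauss-Newton on reprojection residuals r = v0 + fy*ratio - v_obs,
    a stand-in for the LM minimisation of formula (13); `ratios` are the
    (assumed known) y_c/z_c values of the feature points."""
    ratios = np.asarray(ratios, float)
    v_obs = np.asarray(v_obs, float)
    for _ in range(iters):
        r = v0 + fy * ratios - v_obs                          # residuals
        J = np.column_stack([ratios, np.ones_like(ratios)])   # dr/d(fy, v0)
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)         # normal-equation step
        fy, v0 = fy + step[0], v0 + step[1]
    return fy, v0
```

Because this reduced problem is linear in (fy, v0), the loop converges in one iteration; the real problem is nonlinear in the rotation angles and k_1, which is why LM is used in the text.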
S6 comprises the following steps:
The mixed calibration target is placed several times; the three-dimensional coordinate measuring system observes the target, establishes the target coordinate system and obtains the target plane equation, while the 2D laser scanner scans the mixed calibration target at each position to obtain scan-line point cloud data on the target plane. The scanning plane of the 2D laser scanner is its O-YZ plane, i.e. the point coordinate in the x-axis direction is 0, so a point measured by the 2D laser scanner is written P_l = (0, y_l, z_l)^T. Let P_c denote the same point expressed in the linear array camera coordinate system, and let R and T be the rotation matrix and translation vector from the linear array camera coordinate system to the 2D laser scanner coordinate system. The transformation from linear array camera point coordinates to 2D laser scanner point coordinates is expressed by formula (16):
P_l = R·P_c + T (16);
and the transformation from 2D laser scanner point coordinates to linear array camera point coordinates by formula (17):
P_c = R^(-1)·(P_l − T) = R^T·(P_l − T) (17);
Formula (17) is the basic equation of coordinate transformation between 2D laser scanner and linear array camera points.
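The two basic equations are mutually inverse rigid transforms; a minimal sketch, taking R and T as the camera-to-scanner rotation and translation as the text defines them:

```python
import numpy as np

def camera_to_scanner(p_c, R, t):
    """Eq (16) sketch: linear array camera frame -> 2D laser scanner frame."""
    return R @ p_c + t

def scanner_to_camera(p_l, R, t):
    """Eq (17) sketch: the inverse transform, using R^{-1} = R^T for rotations."""
    return R.T @ (p_l - t)
```

The orthogonality of R is what lets the inverse use the transpose instead of a matrix inversion, and it is the property exploited again in S7 when the rotation is re-projected by SVD.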
S7 comprises the following steps:
Set up a target plane that the three-dimensional coordinate measuring system and the 2D laser scanner can observe simultaneously. The 2D laser scanner observes the target and obtains a scan line consisting of a number of laser points on the target plane; from uniformly distributed plane point coordinates measured by the three-dimensional coordinate measuring system on the observed target plane, the target plane equation in the overall reference coordinate system is calculated and fitted, and then transferred to the plane equation in the linear array camera coordinate system, formula (18):
n^T·P_c = d (18);
where n is the unit normal vector of the target plane, satisfying ‖n‖ = 1, P_c is a point coordinate in the camera frame, and d is the distance from the camera origin to the target plane.
From formula (18), and from the principle that the dot product of the unit normal vector of a plane with a point on the plane equals the modulus of the plane vector, the geometric constraint equation in vector form is derived as formula (19):
n^T·P_c = ‖m‖ (19);
where m = d·n is the plane vector of the target (the perpendicular from the camera origin to the target plane) and P_c is the expression of a point on the target plane in the camera coordinate system.
The specific point-plane constraint between the linear array camera and the 2D laser scanner is therefore: the laser point lies on the target plane, and the dot product of the vector from the camera origin to the laser point with the normal vector of the target plane equals the distance from the camera origin to the target plane. Substituting the laser point coordinates into formula (19) via the basic point coordinate conversion equation (17) gives formula (20):
n^T·R^T·(P_l − T) = d (20);
where P_l is the expression of a point on the target plane in the scanner coordinate system.
This completes the construction of the geometric constraint model between the linear array camera and the 2D laser scanner;
All point coordinates measured by the 2D laser scanner have the form (0, y_l, z_l); rewriting each point in the homogeneous form P̃_l = (y_l, z_l, 1)^T, formula (20) is rewritten as formula (21):
n^T·H·P̃_l = d (21);
where H is the transformation matrix after the coordinate rewriting (a 3×3 matrix collecting the entries of R^T that act on (y_l, z_l) together with the translation term).
Placing the target j times and observing j times with the 2D laser scanner, then substituting the coordinates of the i laser points of the scanner on each target into formula (21), yields the linear system of formula (22):
A·h = b (22);
where A is the coefficient matrix after substituting the point coordinates, h is the vector of the 9 parameters of H, and b collects the modulus of the target plane vector corresponding to each scanner laser point. The least squares solution of formula (22) is formula (23):
h = (A^T·A)^(-1)·A^T·b (23);
Reshaping h back into H and combining with the rotation matrix properties gives formula (24), from which the rotation and translation are recovered:
(formula (24): image not reproduced in source);
The rotation matrix is then approximated by SVD decomposition: the singular value matrix is replaced by the identity matrix and the rotation matrix is recomputed, giving the final joint calibration result R and T.
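The "replace the singular value matrix by the identity matrix" step is the standard orthogonal projection onto the rotation group; a minimal sketch, with the determinant-sign fix added so the result is a proper rotation:

```python
import numpy as np

def nearest_rotation(M):
    """Project an approximate 3x3 matrix onto SO(3): SVD-decompose M, replace
    the singular values by ones, recompose, and fix det = +1."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt
```

This restores the orthogonality that the unconstrained least squares solution of formula (22) cannot guarantee, at the cost of a small change to the fitted entries.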
S8 comprises the following steps:
Because of noise, the scanner points on the target plane do not fall exactly on it. The point-to-plane residuals of the actual points are therefore summed and nonlinearly optimized to reduce the influence of noise error, further refining R and T. The objective function established to minimize the sum of the residuals is formula (25):
min over (R, T) of Σ_j Σ_i (n_j^T·R^T·(P_ij − T) − d_j)² (25);
which is solved by nonlinear optimization of the joint calibration parameters with the LM algorithm, where n_j is the normal vector of the j-th target (with d_j the corresponding plane distance) and P_ij is the i-th point on the j-th target placement position.
After nonlinear optimization the final joint calibration result R and T is obtained; thus the linear array camera and the 2D laser scanner achieve joint calibration with the aid of the three-dimensional coordinate measuring system.
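The S8 objective sums point-to-plane residuals over all scans and placements; as a reduced sketch under stated assumptions, the function below minimizes that sum over the translation alone with the rotation held fixed, which makes the problem linear and solvable in closed form (the full method optimizes R and T jointly with LM).

```python
import numpy as np

def refine_translation(points, normals, dists, R):
    """Least squares t minimising sum over (n_j . (R @ p + t) - d_j)^2,
    a reduced stand-in for the S8 point-to-plane refinement."""
    A = np.vstack(normals)                                   # rows n_j^T
    b = np.array([d - n @ (R @ p)
                  for p, n, d in zip(points, normals, dists)])
    t, *_ = np.linalg.lstsq(A, b, rcond=None)                # solve A t = b
    return t
```

At least three target placements with linearly independent normals are needed for the translation to be determined, which matches the need to place the target several times in S6/S7.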
Compared with the prior art, the invention has the following beneficial effects: a mixed calibration target is designed and, together with three-dimensional coordinate measurement, unifies the coordinate system references of all sensors in the tunnel comprehensive detection equipment; the relative spatial pose relations among the sensors are obtained by static calibration; the problem that the linear array camera and the 2D laser scanner have no common field of view, and that the images and point cloud data share no directly corresponding features or geometric constraints, is solved; and a foundation is laid for later tunnel data organization and management work, such as obtaining a real-scene three-dimensional model through data fusion.
Drawings
FIG. 1 is a technical flow chart of the present invention;
FIG. 2 is a diagram of a target structure;
FIG. 3 is a schematic layout of a line camera and a 2D laser scanner in a detection apparatus;
fig. 4 is a schematic diagram of the measurement of a target according to the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the present invention will be clearly and completely described below, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
A linear array camera and 2D laser scanner joint calibration method comprises the following steps:
s1, manufacturing a mixed calibration target, and establishing a target coordinate system;
s2, building an experiment platform;
s3, acquiring calibration data of the linear array camera, and obtaining world coordinates of feature points corresponding to image points according to the principle of cross ratio invariance;
s4, establishing a linear array camera imaging model;
s5, calculating calibration parameters of the two-step calibration normal array camera;
s6, determining a basic equation of point coordinate conversion of the laser section scanner and the linear array camera;
s7, scanner data acquisition is performed, and a joint calibration geometric constraint model is constructed;
s8, nonlinear optimization accurately solves the joint calibration parameters.
S1 comprises the following steps:
the mixed calibration target comprises a fixing piece and a plane target body, wherein the target body comprises a plurality of diffuse reflection mark points, a 1.5-inch square laser tracker target ball hole, a 0.5-inch square laser tracker target ball hole and 7 cross-ratio triangle patterns, and the 7 cross-ratio triangle patterns comprise 8 straight lines and 7 oblique lines;
the fixing piece supports the target main body, and the target main body is fixed through the fixing piece after the target position is moved in the process of calibration data acquisition;
the diffuse reflection mark points are suitable for non-contact measurement of a theodolite industrial measurement system, so as to obtain three-dimensional coordinates of the diffuse reflection mark points, the mark points are uniformly distributed on a target main body, wherein three mark points respectively correspond to an O point, an X point and a Y point of a target coordinate system, the connection line of the two points of OX is an X axis, the connection line of the two points of OY is a Y axis, the O point is used as a coordinate origin of the target coordinate system, and the O point is upwards defined as a Z axis through being perpendicular to an O-XY plane;
the laser tracker target ball hole is used for carrying out contact measurement on the laser tracker and a target corresponding to the laser tracker, and a target coordinate system is established through the hole center point coordinate;
calculating to obtain straight line equations of all straight lines in 7 cross triangle patterns in a target coordinate system according to the design size of the cross triangle patterns;
s2 comprises the following steps: the experimental platform comprises a three-dimensional coordinate measuring system, a linear array camera, a 2D laser scanner, a linear array camera illumination light source and a mixed calibration target, wherein the three-dimensional coordinate measuring system comprises a laser tracker or a theodolite industrial measuring system, and the mixed calibration target is arranged on a fixed frame through a fixing piece, so that the three-dimensional coordinate measuring system, the linear array camera and the 2D laser scanner can observe the mixed calibration target.
S3 comprises the following steps:
placing the mixed calibration target in the field of view of the linear array camera, polishing by matching with a linear array camera illumination light source, enabling the linear array camera to acquire clear calibration pictures, measuring the mixed calibration target by using a three-dimensional coordinate measuring system after the linear array camera acquires the calibration pictures, taking the coordinate system of the three-dimensional coordinate measuring system as a reference standard of an overall coordinate system, calculating the target coordinate system under the reference standard of the overall coordinate, fitting out a target plane at the same time, and taking the target coordinate system as a world coordinate system when solving the internal reference calibration parameters of the linear array camera;
when the linear array camera is actually used for shooting, the camera view plane is intersected with the cross ratio triangle on the mixed calibration target, and 15 characteristic points are
Figure SMS_91
The image points are represented as black straight lines on an actual calibration image, and the image points corresponding to the characteristic points are extracted through an edge detection algorithm>
Figure SMS_92
Is defined by the pixel coordinates of (a);
calculating to obtain the world coordinates of the feature points corresponding to the image points by the principle of cross-ratio invariance and the linear equation of each line segment on the target plane
Figure SMS_93
Wherein, feature point->
Figure SMS_94
Is the intersection point on the ith straight line on the target plane.
S4 comprises the following steps:
in linear array cameras, world coordinates of feature points are known
Figure SMS_95
The correspondence between the pixel coordinates (u, v) of its corresponding image point is as follows:
Figure SMS_96
the above is a camera imaging model that does not take into account lens distortion, wherein,
Figure SMS_97
for rotating matrix->
Figure SMS_98
9 elements of->
Figure SMS_99
Comprising a translation vector of three elements +.>
Figure SMS_100
The formula (1) is a view plane equation of the linear array camera, and the formula (2) is an equation conforming to the central projection relation>
Figure SMS_101
Representing the rotational translation from the world coordinate system to the camera coordinate system, v is the image principal point coordinate value, v 0 Calibrating an internal parameter to be solved for the linear array camera, wherein the internal parameter represents the main point offset>
Figure SMS_102
Representing the corresponding physical size of the pixel point in the y-axis direction;
taking the distortion of the linear array camera in the y direction into consideration, a linear array camera distortion model considering only first-order radial distortion is established as formula (3) [equation image in the original];
wherein δy is the distortion in the y axis, k1 is the first-order radial distortion coefficient, and δv represents the component of the lens distortion value on the v axis in the pixel coordinate system;
combining formulas (1), (2) and (3) gives the complete linear array camera imaging model, formula (4) [equation image in the original].
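As a concrete rendering of a model of this form — view-plane constraint, central projection in y, and a first-order radial term — the following sketch is not the patent's exact formula (4), whose image is not reproduced above, but a standard line-scan model assembled from the same ingredients; all numeric values are invented for illustration:

```python
import numpy as np

def linescan_project(P_w, R, T, f_dy, v0, k1=0.0, tol=1e-9):
    """Project a world point with a line-scan model of the form above:
    (1) the point must lie on the view plane x_c = 0, and
    (2) v follows a central projection in y with (3) an optional
    first-order radial distortion term."""
    P_c = R @ np.asarray(P_w, float) + T   # world -> camera frame
    if abs(P_c[0]) > tol:
        raise ValueError("point does not lie on the camera view plane")
    y = P_c[1] / P_c[2]                    # ideal (undistorted) coordinate
    return v0 + f_dy * (y + k1 * y**3)     # pixel coordinate on the line

v = linescan_project([0.0, 0.1, 1.0], np.eye(3), np.zeros(3),
                     f_dy=2000.0, v0=512.0)
```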
S5 comprises the following steps:
based on the correspondence between the feature points and the image points and the linear array camera imaging model, the geometric imaging-model parameters of the linear array camera are calculated by a two-step method;
first, ignoring the influence of lens distortion, approximate values of the internal parameters f/dy and v0 and of the external parameters R and T are solved directly by the direct linear transformation (DLT) method, the rotation angles being obtained from the camera rotation matrix;
then, taking the obtained internal and external parameters of the linear array camera as initial values, the final linear array camera calibration result including first-order radial distortion is obtained under the strict imaging model using the LM nonlinear optimization method;
a term of formula (1) is extracted and substituted into formula (2) to obtain formula (5) [equation image in the original];
intermediate parameters are substituted for the camera internal and external parameters in formula (5) to obtain formula (6), whose coefficients are given respectively in formula (7) [equation images in the original];
from the properties of the rotation matrix, and considering that when the origin of the target coordinate system is below the view plane the view plane intersects all straight lines on the target, formula (8) is obtained [equation image in the original];
wherein the q_i are the parameters of the view-plane equation; the q_i are thus obtained by solving formula (8);
let q̄_i = λ·q_i, where λ is a scaling factor and q̄_i is the scaled result of q_i; from formula (8), combined with the properties of the rotation matrix, formula (9) is obtained [equation image in the original];
from the rotation matrix properties, formula (10) can be obtained [equation image in the original];
substituting the world coordinates of the feature points into formula (7) gives formula (11) [equation image in the original];
formula (11) is solved by the SVD method; the solved parameters are determined only up to an arbitrary scale; the exterior orientation element values so calculated are brought into formula (11), and the simultaneous equations are solved for the remaining parameters, formula (12) [equation image in the original];
the external parameter tz represents the z coordinate of the origin of the target coordinate system in the linear array camera coordinate system; since the target is always in front of the camera while images are captured, tz > 0; this constraint eliminates the scale ambiguity of the scaling factor in formula (12), so that all initial values of the internal and external orientation elements of the linear array camera are uniquely determined;
the initial parameter values are refined by a nonlinear optimization method to obtain the first-order radial distortion parameter k1 and the optimized internal and external parameters of the linear array camera; the coordinates of the 3D points re-projected onto the 2D image plane are differenced with the pixels of the actually extracted image points, i.e. the projected position and the observed position of each 3D point are compared, and nonlinear optimization is performed by establishing an objective function that minimizes the re-projection error, as shown in formula (13):
x* = argmin_x Σ_i Σ_j (v_ij - v̂_ij)^2 (13);
wherein formula (13) is the linear array camera calibration parameter equation of the first nonlinear optimization solution and x is the vector formed by all parameters to be optimized;
the feature points over all target placement positions are P_ij, P_ij being the j-th feature point of the i-th target placement position; v_ij is the v-axis coordinate of the pixel point extracted for the j-th feature point of the i-th target placement position, and v̂_ij is the pixel coordinate of the corresponding re-projection point obtained from the calculated camera parameters, as shown in formula (14) [equation image in the original];
the final calibration results of the internal and external parameters and the distortion parameter of the linear array camera under the strict imaging model are obtained with the LM nonlinear optimization method, and the external parameters between the linear array camera coordinate system and the hybrid calibration target unify the linear array camera coordinate system to the overall coordinate reference.
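The refinement step can be illustrated on synthetic data. The sketch below refines only f/dy (written fy), v0 and k1 by Gauss-Newton on the v-axis reprojection residuals of formula (13), holding R and T fixed; this is a deliberate simplification of the full LM optimization over all parameters, and every numeric value is invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.uniform(-0.2, 0.2, 50)             # y_c / z_c of synthetic feature points
true = np.array([2000.0, 512.0, 0.05])      # ground-truth fy, v0, k1
v_obs = true[1] + true[0] * (y + true[2] * y**3)   # noise-free observations

p = np.array([1800.0, 500.0, 0.0])          # DLT-style rough initial guess
for _ in range(20):
    yd = y + p[2] * y**3                    # distorted image coordinate
    r = v_obs - (p[1] + p[0] * yd)          # v-axis reprojection residuals
    # Jacobian of the model w.r.t. (fy, v0, k1)
    J = np.column_stack([yd, np.ones_like(y), p[0] * y**3])
    p += np.linalg.lstsq(J, r, rcond=None)[0]   # Gauss-Newton update

assert np.allclose(p, true, atol=1e-4)      # parameters recovered
```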
S6 comprises the following steps:
the hybrid calibration target is placed multiple times; the three-dimensional coordinate measuring system observes the target, establishes the target coordinate system and obtains the target plane equation; the 2D laser scanner scans the hybrid calibration target at each placement position to obtain scan-line point cloud data on the target plane; the scanning plane of the 2D laser scanner is the O-YZ plane, i.e. the x coordinate of every measured point is 0, so a point measured by the 2D laser scanner is expressed as P_l;
let P_c be the expression of the point measured by the 2D laser scanner in the corresponding linear array camera coordinate system; the transformation from linear array camera point coordinates to 2D laser scanner point coordinates is expressed by formula (16):
P_l = R_cl · P_c + T_cl (16);
and the transformation from 2D laser scanner point coordinates to linear array camera point coordinates is formula (17):
P_c = R_cl^-1 · (P_l - T_cl) (17);
wherein R_cl and T_cl are respectively the rotation matrix and translation vector from the linear array camera coordinate system to the 2D laser scanner coordinate system, and formula (17) is the basic equation of point coordinate conversion between the 2D laser scanner and the linear array camera.
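A numeric sketch of equations (16) and (17) — a rigid transform between the two coordinate systems and its closed-form inverse (all values illustrative, not calibrated results):

```python
import numpy as np

theta = np.deg2rad(30.0)
R_cl = np.array([[1.0, 0.0, 0.0],
                 [0.0, np.cos(theta), -np.sin(theta)],
                 [0.0, np.sin(theta),  np.cos(theta)]])   # rotation matrix
T_cl = np.array([0.10, -0.05, 0.30])                       # translation vector

P_c = np.array([0.0, 0.40, 1.20])     # a point in camera coordinates
P_l = R_cl @ P_c + T_cl               # equation (16): camera -> scanner
P_back = R_cl.T @ (P_l - T_cl)        # equation (17): scanner -> camera
                                      # (R^-1 = R^T for a rotation matrix)
assert np.allclose(P_back, P_c)       # exact round trip
```

Using the transpose for the inverse rotation avoids an explicit matrix inversion and is numerically exact for an orthonormal R_cl.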
S7 comprises the following steps:
a target plane that can be observed simultaneously by the three-dimensional coordinate measuring system and the 2D laser scanner is set up; the 2D laser scanner observes the target to obtain a scan line consisting of a number of laser points on the target plane; on the observed target plane, the uniformly distributed plane point coordinates measured by the three-dimensional coordinate measuring system are fitted to obtain the target plane equation in the overall reference coordinate system, which is then transferred to the plane equation in the linear array camera coordinate system, as shown in formula (18):
a·x_c + b·y_c + c·z_c = d (18);
wherein n = (a, b, c)^T is the unit normal vector of the target plane, satisfying ||n|| = 1, and P_c = (x_c, y_c, z_c)^T denotes camera point coordinates;
from formula (18), and from the principle that the product of the plane's unit normal vector and a point on the plane is equal to the modulus of the plane vector, a geometric constraint equation in vector form is derived as formula (19):
n^T · P_c = d (19);
wherein N = d·n is the plane vector of the target, and P_c is the expression of a point on the target plane in the camera coordinate system;
the specific point-plane constraint between the linear array camera and the 2D laser scanner is thus: each laser point lies on the target plane, and the product of the target plane's normal vector and the vector from the camera origin to the laser point equals the distance from the camera origin to the target plane; substituting the laser point coordinates into formula (19) through the basic point coordinate conversion equation (17) gives formula (20):
n^T · R_cl^-1 · (P_l - T_cl) = d (20);
wherein P_l is the expression of a point on the target plane in the scanner coordinate system;
this completes the construction of the geometric constraint model between the linear array camera and the 2D laser scanner;
all points measured by the 2D laser scanner have x = 0; using this feature, the point coordinates are rewritten in the homogeneous form P̃_l = (y_l, z_l, 1)^T, and formula (20) is rewritten as shown in formula (21):
n^T · H · P̃_l = d (21);
wherein H is the 3×3 transformation matrix after the coordinate rewriting in formula (21).
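The point-plane constraint of formula (20) can be checked numerically: a scanner point mapped into the camera frame by the extrinsics must satisfy the target-plane equation. The extrinsics, normal and distance below are invented for illustration:

```python
import numpy as np

n = np.array([0.0, 0.0, 1.0])      # unit normal of the target plane
d = 1.5                            # camera-origin-to-plane distance
R_cl = np.eye(3)                   # illustrative extrinsics (not calibrated)
T_cl = np.array([0.0, 0.1, -0.2])

P_l = np.array([0.0, 0.6, 1.3])    # scanner point (x = 0 in its own plane)
P_c = R_cl.T @ (P_l - T_cl)        # scanner -> camera, as in equation (17)
residual = n @ P_c - d             # zero exactly when the point is on-plane

assert abs(residual) < 1e-12
```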
the target is placed j times and observed j times by the 2D laser scanner; bringing the coordinates of the i laser points of the scanner on each target into formula (21) yields the linear system shown in formula (22):
A · h = b (22);
in the formula, A is the coefficient matrix after substituting the point coordinates, h is the vector of the 9 parameters of the matrix H, and b collects the modulus of the target plane vector corresponding to each scanner laser point; the least-squares solution of formula (22) is formula (23):
h = (A^T · A)^-1 · A^T · b (23);
writing H = (h1, h2, h3) and combining the rotation matrix properties, formula (24) can be obtained:
||h1|| = ||h2|| = 1, h1^T · h2 = 0 (24);
SVD is used to decompose and approximate the rotation matrix; the singular value matrix is then replaced by the identity matrix and the rotation matrix is recalculated, giving the final joint calibration result R_cl and T_cl.
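The SVD projection step can be sketched directly: replacing the singular values of a noisy 3×3 estimate with ones yields the nearest rotation matrix in the Frobenius sense (the matrix below is an invented noisy estimate, not a calibration result):

```python
import numpy as np

M = np.array([[0.98, -0.11, 0.02],
              [0.12,  0.97, 0.01],
              [-0.01, 0.02, 1.03]])   # noisy least-squares estimate of R

U, _, Vt = np.linalg.svd(M)
R = U @ Vt                            # singular values replaced by ones
if np.linalg.det(R) < 0:              # guard against an improper reflection
    R = U @ np.diag([1.0, 1.0, -1.0]) @ Vt

assert np.allclose(R @ R.T, np.eye(3))        # orthonormal
assert np.isclose(np.linalg.det(R), 1.0)      # proper rotation
```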
S8 comprises the following steps:
the scanner point coordinates on the target plane do not fall exactly on the target plane; by nonlinear optimization of the sum of the residuals from the actual points P_ij to the plane, the influence of noise errors is reduced, so R_cl and T_cl are further optimized; the objective function established to minimize the sum of residuals is formula (25):
(R_cl*, T_cl*) = argmin Σ_j Σ_i (n_j^T · R_cl^-1 · (P_ij - T_cl) - d_j)^2 (25);
the objective function of the joint calibration parameters is solved by nonlinear optimization with the LM algorithm, wherein n_j is the normal vector of the j-th target, d_j the distance from the camera origin to the j-th target plane, and P_ij the i-th point at the j-th target placement position;
the final joint calibration result R_cl and T_cl is obtained after the nonlinear optimization; at this point, the linear array camera and the 2D laser scanner achieve joint calibration with the aid of the three-dimensional coordinate measuring system.
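The refinement idea of formula (25) can be illustrated in a reduced form: with the rotation held fixed, the point-to-plane residual is linear in the translation, so the residual-sum minimiser has a closed form. Everything below — planes, points and the "true" translation — is synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
R = np.eye(3)                           # rotation held fixed in this sketch
T_true = np.array([0.1, -0.2, 0.3])     # translation to be recovered

rows, rhs = [], []
for _ in range(3):                      # three target placements
    n = rng.normal(size=3); n /= np.linalg.norm(n)   # unit plane normal
    d = rng.uniform(1.0, 2.0)                        # plane distance
    for _ in range(10):                 # scanner points on each plane
        q = rng.normal(size=3)
        P_c = q + (d - n @ q) * n       # project q onto the plane n.P = d
        P_l = R @ P_c + T_true          # express the point in scanner frame
        # residual n.R^T(P_l - T) - d = 0  =>  (n^T R^T) T = n^T R^T P_l - d
        rows.append(n @ R.T)
        rhs.append(n @ R.T @ P_l - d)

T_est = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
assert np.allclose(T_est, T_true, atol=1e-8)   # translation recovered
```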
The technical flow of the invention is shown in fig. 1 and the target structure in fig. 2; in fig. 3, the plurality of linear array cameras of the tunnel comprehensive detection equipment surround the upper part of the 2D laser scanner in a semicircular array, and an actual linear array camera calibration image is shown in fig. 4. Measurement of the target is also illustrated in fig. 4: the target is measured jointly by the 2D laser scanner, the three-dimensional coordinate measuring system and the linear array camera, where n is the plane normal vector.
In practical experiments, the invention adopts a Dalsa-LA-CM-08k08A linear array camera, a long-walk LS9005A type lens and a 2D laser scanner; the parameters of the linear array camera and the lens are shown in table 1.
TABLE 1
The calibration results of the linear array camera are shown in table 2.
TABLE 2
The combined calibration results of the linear array camera and the laser section scanner are shown in table 3.
TABLE 3 Table 3
The above embodiments are only for illustrating the technical aspects of the present invention, not for limiting the same, and although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may be modified or some or all of the technical features may be replaced with other technical solutions, which do not depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (6)

1. A linear array camera and 2D laser scanner joint calibration method is characterized by comprising the following steps:
s1, manufacturing a mixed calibration target, and establishing a target coordinate system;
s2, building an experiment platform;
s3, acquiring calibration data of the linear array camera, and obtaining world coordinates of feature points corresponding to image points according to the principle of cross ratio invariance;
s4, establishing a linear array camera imaging model;
S5, calculating the linear array camera calibration parameters by a two-step method;
S6, determining the basic equation of point coordinate conversion between the 2D laser scanner and the linear array camera;
placing the hybrid calibration target multiple times; the three-dimensional coordinate measuring system observes the target, establishes the target coordinate system and obtains the target plane equation; the 2D laser scanner scans the hybrid calibration target at each placement position to obtain scan-line point cloud data on the target plane; the scanning plane of the 2D laser scanner is the O-YZ plane, i.e. the x coordinate of every measured point is 0, so a point measured by the 2D laser scanner is expressed as P_l;
letting P_c be the expression of the point measured by the 2D laser scanner in the corresponding linear array camera coordinate system, the transformation from linear array camera point coordinates to 2D laser scanner point coordinates is expressed by formula (16):
P_l = R_cl · P_c + T_cl (16);
and the transformation from 2D laser scanner point coordinates to linear array camera point coordinates is formula (17):
P_c = R_cl^-1 · (P_l - T_cl) (17);
wherein R_cl and T_cl are respectively the rotation matrix and translation vector from the linear array camera coordinate system to the 2D laser scanner coordinate system, and formula (17) is the basic equation of point coordinate conversion between the 2D laser scanner and the linear array camera;
s7, scanner data acquisition is performed, and a joint calibration geometric constraint model is constructed;
setting up a target plane that can be observed simultaneously by the three-dimensional coordinate measuring system and the 2D laser scanner; the 2D laser scanner observes the target to obtain a scan line consisting of a number of laser points on the target plane; on the observed target plane, the uniformly distributed plane point coordinates measured by the three-dimensional coordinate measuring system are fitted to obtain the target plane equation in the overall reference coordinate system, which is then transferred to the plane equation in the linear array camera coordinate system, as shown in formula (18):
a·x_c + b·y_c + c·z_c = d (18);
wherein n = (a, b, c)^T is the unit normal vector of the target plane, satisfying ||n|| = 1, and P_c = (x_c, y_c, z_c)^T denotes camera point coordinates;
from formula (18), and from the principle that the product of the plane's unit normal vector and a point on the plane is equal to the modulus of the plane vector, a geometric constraint equation in vector form is derived as formula (19):
n^T · P_c = d (19);
wherein N = d·n is the plane vector of the target, and P_c is the expression of a point on the target plane in the camera coordinate system;
the specific point-plane constraint between the linear array camera and the 2D laser scanner is thus: each laser point lies on the target plane, and the product of the target plane's normal vector and the vector from the camera origin to the laser point equals the distance from the camera origin to the target plane; substituting the laser point coordinates into formula (19) through the basic point coordinate conversion equation (17) gives formula (20):
n^T · R_cl^-1 · (P_l - T_cl) = d (20);
wherein P_l is the expression of a point on the target plane in the scanner coordinate system;
this completes the construction of the geometric constraint model between the linear array camera and the 2D laser scanner;
all points measured by the 2D laser scanner have x = 0; using this feature, the point coordinates are rewritten in the homogeneous form P̃_l = (y_l, z_l, 1)^T, and formula (20) is rewritten as shown in formula (21):
n^T · H · P̃_l = d (21);
wherein H is the 3×3 transformation matrix after the coordinate rewriting in formula (21);
the target is placed j times and observed j times by the 2D laser scanner; bringing the coordinates of the i laser points of the scanner on each target into formula (21) yields the linear system shown in formula (22):
A · h = b (22);
in the formula, A is the coefficient matrix after substituting the point coordinates, h is the vector of the 9 parameters of the matrix H, and b collects the modulus of the target plane vector corresponding to each scanner laser point; the least-squares solution of formula (22) is formula (23):
h = (A^T · A)^-1 · A^T · b (23);
writing H = (h1, h2, h3) and combining the rotation matrix properties, formula (24) can be obtained:
||h1|| = ||h2|| = 1, h1^T · h2 = 0 (24);
SVD is used to decompose and approximate the rotation matrix; the singular value matrix is then replaced by the identity matrix and the rotation matrix is recalculated, giving the final joint calibration result R_cl and T_cl;
S8, nonlinear optimization accurately solves the joint calibration parameters.
2. The linear camera and 2D laser scanner joint calibration method according to claim 1, wherein S1 comprises:
the mixed calibration target comprises a fixing piece and a plane target body, wherein the target body comprises a plurality of diffuse reflection mark points, a 1.5-inch square laser tracker target ball hole, a 0.5-inch square laser tracker target ball hole and 7 cross-ratio triangle patterns, and the 7 cross-ratio triangle patterns comprise 8 straight lines and 7 oblique lines;
the fixing piece supports the target main body, and the target main body is fixed through the fixing piece after the target position is moved in the process of calibration data acquisition;
the diffuse reflection mark points are suitable for non-contact measurement by a theodolite industrial measuring system, which obtains their three-dimensional coordinates; the mark points are uniformly distributed on the target body, three of them corresponding respectively to the O, X and Y points of the target coordinate system; the line through O and X is the X axis, the line through O and Y is the Y axis, the O point is the origin of the target coordinate system, and the Z axis is defined upward through O perpendicular to the O-XY plane;
the laser tracker target ball hole is used for carrying out contact measurement on the laser tracker and a target corresponding to the laser tracker, and a target coordinate system is established through the hole center point coordinate;
calculating to obtain straight line equations of all straight lines in 7 cross triangle patterns in a target coordinate system according to the design size of the cross triangle patterns;
s2 comprises the following steps: the experimental platform comprises a three-dimensional coordinate measuring system, a linear array camera, a 2D laser scanner, a linear array camera illumination light source and a mixed calibration target, wherein the three-dimensional coordinate measuring system comprises a laser tracker or a theodolite industrial measuring system, and the mixed calibration target is arranged on a fixed frame through a fixing piece, so that the three-dimensional coordinate measuring system, the linear array camera and the 2D laser scanner can observe the mixed calibration target.
3. The linear camera and 2D laser scanner joint calibration method according to claim 2, wherein S3 comprises:
placing the mixed calibration target in the field of view of the linear array camera, polishing by matching with a linear array camera illumination light source, enabling the linear array camera to acquire clear calibration pictures, measuring the mixed calibration target by using a three-dimensional coordinate measuring system after the linear array camera acquires the calibration pictures, taking the coordinate system of the three-dimensional coordinate measuring system as a reference standard of an overall coordinate system, calculating the target coordinate system under the reference standard of the overall coordinate, fitting out a target plane at the same time, and taking the target coordinate system as a world coordinate system when solving the internal reference calibration parameters of the linear array camera;
when the linear array camera actually shoots, the camera view plane intersects the cross-ratio triangles on the hybrid calibration target at 15 feature points P_i; the intersections appear as dark edges on the actual calibration image, and the pixel coordinates of the image points p_i corresponding to the feature points are extracted by an edge detection algorithm;
the world coordinates of the feature points corresponding to the image points are calculated from the principle of cross-ratio invariance and the line equation of each line segment on the target plane, wherein feature point P_i is the intersection point on the i-th straight line on the target plane.
4. The method for calibrating a line camera and a 2D laser scanner in combination according to claim 3, wherein S4 comprises:
in the linear array camera, the correspondence between the known world coordinates (Xw, Yw, Zw) of a feature point and the pixel coordinates (u, v) of its corresponding image point is as follows:
r11·Xw + r12·Yw + r13·Zw + tx = 0 (1);
v = v0 + (f/dy)·(r21·Xw + r22·Yw + r23·Zw + ty)/(r31·Xw + r32·Yw + r33·Zw + tz) (2);
the above is the camera imaging model without lens distortion, wherein r11, ..., r33 are the 9 elements of the rotation matrix R and T = (tx, ty, tz)^T is the translation vector of three elements; formula (1) is the view-plane equation of the linear array camera and formula (2) is the equation conforming to the central projection relation; R and T represent the rotation and translation from the world coordinate system to the camera coordinate system; v is the image point coordinate value, and v0 is an internal parameter to be solved in calibration, representing the principal point offset; f is the focal length and dy the physical size of a pixel in the y-axis direction;
taking the distortion of the linear array camera in the y direction into consideration, a linear array camera distortion model considering only first-order radial distortion is established as formula (3) [equation image in the original];
wherein δy is the distortion in the y axis, k1 is the first-order radial distortion coefficient, and δv represents the component of the lens distortion value on the v axis in the pixel coordinate system;
combining formulas (1), (2) and (3) gives the complete linear array camera imaging model, formula (4) [equation image in the original].
5. the method for calibrating a line camera and a 2D laser scanner in combination according to claim 4, wherein S5 comprises:
based on the correspondence between the feature points and the image points and the linear array camera imaging model, the geometric imaging-model parameters of the linear array camera are calculated by a two-step method;
first, ignoring the influence of lens distortion, approximate values of the internal parameters f/dy and v0 and of the external parameters R and T are solved directly by the direct linear transformation (DLT) method, the rotation angles being obtained from the camera rotation matrix;
then, taking the obtained internal and external parameters of the linear array camera as initial values, the final linear array camera calibration result including first-order radial distortion is obtained under the strict imaging model using the LM nonlinear optimization method;
a term of formula (1) is extracted and substituted into formula (2) to obtain formula (5) [equation image in the original];
intermediate parameters are substituted for the camera internal and external parameters in formula (5) to obtain formula (6), whose coefficients are given respectively in formula (7) [equation images in the original];
from the properties of the rotation matrix, and considering that when the origin of the target coordinate system is below the view plane the view plane intersects all straight lines on the target, formula (8) is obtained [equation image in the original];
wherein the q_i are the parameters of the view-plane equation; the q_i are thus obtained by solving formula (8);
let q̄_i = λ·q_i, where λ is a scaling factor and q̄_i is the scaled result of q_i; from formula (8), combined with the properties of the rotation matrix, formula (9) is obtained [equation image in the original];
from the rotation matrix properties, formula (10) can be obtained [equation image in the original];
substituting the world coordinates of the feature points into formula (7) gives formula (11) [equation image in the original];
formula (11) is solved by the SVD method; the solved parameters are determined only up to an arbitrary scale; the exterior orientation element values so calculated are brought into formula (11), and the simultaneous equations are solved for the remaining parameters, formula (12) [equation image in the original];
the external parameter tz represents the z coordinate of the origin of the target coordinate system in the linear array camera coordinate system; since the target is always in front of the camera while images are captured, tz > 0; this constraint eliminates the scale ambiguity of the scaling factor in formula (12), so that all initial values of the internal and external orientation elements of the linear array camera are uniquely determined;
the initial parameter values are refined by a nonlinear optimization method to obtain the first-order radial distortion parameter k1 and the optimized internal and external parameters of the linear array camera; the coordinates of the 3D points re-projected onto the 2D image plane are differenced with the pixels of the actually extracted image points, i.e. the projected position and the observed position of each 3D point are compared, and nonlinear optimization is performed by establishing an objective function that minimizes the re-projection error, as shown in formula (13):
x* = argmin_x Σ_i Σ_j (v_ij - v̂_ij)^2 (13);
wherein formula (13) is the linear array camera calibration parameter equation of the first nonlinear optimization solution and x is the vector formed by all parameters to be optimized;
the feature points over all target placement positions are P_ij, P_ij being the j-th feature point of the i-th target placement position; v_ij is the v-axis coordinate of the pixel point extracted for the j-th feature point of the i-th target placement position, and v̂_ij is the pixel coordinate of the corresponding re-projection point obtained from the calculated camera parameters, as shown in formula (14) [equation image in the original];
the final calibration results of the internal and external parameters and the distortion parameter of the linear array camera under the strict imaging model are obtained with the LM nonlinear optimization method, and the external parameters between the linear array camera coordinate system and the hybrid calibration target unify the linear array camera coordinate system to the overall coordinate reference.
6. The method for calibrating a linear camera and a 2D laser scanner in combination according to claim 1, wherein S8 comprises:
the scanner point coordinates on the target plane do not fall exactly on the target plane; by nonlinear optimization of the sum of the residuals from the actual points P_ij to the plane, the influence of noise errors is reduced, so R_cl and T_cl are further optimized; the objective function established to minimize the sum of residuals is formula (25):
(R_cl*, T_cl*) = argmin Σ_j Σ_i (n_j^T · R_cl^-1 · (P_ij - T_cl) - d_j)^2 (25);
the objective function of the joint calibration parameters is solved by nonlinear optimization with the LM algorithm, wherein n_j is the normal vector of the j-th target, d_j the distance from the camera origin to the j-th target plane, and P_ij the i-th point at the j-th target placement position;
the final joint calibration result R_cl and T_cl is obtained after the nonlinear optimization; at this point, the linear array camera and the 2D laser scanner achieve joint calibration with the aid of the three-dimensional coordinate measuring system.
CN202310323918.9A 2023-03-30 2023-03-30 Linear array camera and 2D laser scanner combined calibration method Active CN116051659B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310323918.9A CN116051659B (en) 2023-03-30 2023-03-30 Linear array camera and 2D laser scanner combined calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310323918.9A CN116051659B (en) 2023-03-30 2023-03-30 Linear array camera and 2D laser scanner combined calibration method

Publications (2)

Publication Number Publication Date
CN116051659A CN116051659A (en) 2023-05-02
CN116051659B true CN116051659B (en) 2023-06-13

Family

ID=86129883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310323918.9A Active CN116051659B (en) 2023-03-30 2023-03-30 Linear array camera and 2D laser scanner combined calibration method

Country Status (1)

Country Link
CN (1) CN116051659B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116485918B (en) * 2023-06-25 2023-09-08 天府兴隆湖实验室 Calibration method, calibration system and computer readable storage medium
CN116625240B (en) * 2023-07-20 2023-09-19 中交第一航务工程局有限公司 Calibration method of combined underwater positioning equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109029284A (en) * 2018-06-14 2018-12-18 大连理工大学 A kind of three-dimensional laser scanner based on geometrical constraint and camera calibration method
CN111325801A (en) * 2020-01-23 2020-06-23 天津大学 Combined calibration method for laser radar and camera

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102339463B (en) * 2010-07-22 2015-10-21 首都师范大学 Based on the system and method for laser scanner calibration line array camera
CN111476844B (en) * 2020-02-26 2022-08-16 武汉大学 Calibration method for multiple linear array camera array systems
US20220128671A1 (en) * 2020-10-26 2022-04-28 Faro Technologies, Inc. Dynamic self-calibrating of auxiliary camera of laser scanner

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109029284A (en) * 2018-06-14 2018-12-18 Dalian University of Technology Three-dimensional laser scanner and camera calibration method based on geometric constraints
CN111325801A (en) * 2020-01-23 2020-06-23 Tianjin University Combined calibration method for laser radar and camera

Also Published As

Publication number Publication date
CN116051659A (en) 2023-05-02

Similar Documents

Publication Publication Date Title
CN116051659B (en) Linear array camera and 2D laser scanner combined calibration method
Luhmann Close range photogrammetry for industrial applications
CN108844459B (en) Calibration method and device of blade digital sample plate detection system
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN110378969B (en) Convergent binocular camera calibration method based on 3D geometric constraint
Zhang et al. A universal and flexible theodolite-camera system for making accurate measurements over large volumes
EA031929B1 (en) Apparatus and method for three dimensional surface measurement
CN111091599B (en) Multi-camera-projector system calibration method based on sphere calibration object
WO2018201677A1 (en) Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN115830103A (en) Monocular color-based transparent object positioning method and device and storage medium
Boehm et al. Accuracy of exterior orientation for a range camera
CN111754584A (en) Remote large-field-of-view camera parameter calibration system and method
CN115046498B (en) Calibration method for monocular rotating structure light three-dimensional measurement system
CN110827359A (en) Checkerboard trihedron-based camera and laser external reference checking and correcting method and device
Wu Photogrammetry: 3-D from imagery
Oniga et al. Metric and Non-Metric Cameras Calibration for the Improvement of Real-Time Monitoring Process Results.
CN114663520A (en) Double-camera combined calibration method and system for ultra-large range vision measurement
CN114170321A (en) Camera self-calibration method and system based on distance measurement
CN114663486A (en) Building height measurement method and system based on binocular vision
CN116797669B (en) Multi-camera array calibration method based on multi-face tool
CN114046779B (en) Visual measurement adjustment method based on additional large-scale constraint between measuring station and control point
CN117173256B (en) Calibration method and device of line dynamic laser system with double vibrating mirrors
Hemmati et al. A study on the refractive effect of glass in vision systems
Khurana et al. Localization and mapping using a non-central catadioptric camera system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant