CN111612844A - Three-dimensional laser scanner and camera calibration method based on sector features

Three-dimensional laser scanner and camera calibration method based on sector features

Info

Publication number
CN111612844A
Authority
CN
China
Prior art keywords
point
line
calibration plate
point cloud
folding fan
Prior art date
Legal status
Granted
Application number
CN202010267549.2A
Other languages
Chinese (zh)
Other versions
CN111612844B
Inventor
An Yi (安毅)
Li Bo (李博)
Hu Xing (胡兴)
Current Assignee
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN202010267549.2A
Publication of CN111612844A
Application granted
Publication of CN111612844B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Abstract

The invention belongs to the technical field of three-dimensional point cloud data processing and three-dimensional scene reconstruction, and discloses a three-dimensional laser scanner and camera calibration method based on sector features, which comprises the following steps: (1) manufacturing a black and white folding fan calibration plate, (2) collecting the three-dimensional point cloud and two-dimensional image of the calibration plate, (3) extracting the characteristic fold points of the three-dimensional line point clouds, (4) determining coplanar line point cloud segments, (5) estimating the center point and end points of the folding fan calibration plate, (6) optimizing the center point and end points of the folding fan calibration plate, (7) extracting the center point and end points of the folding fan calibration plate in the image coordinate system, (8) obtaining three-dimensional point clouds and two-dimensional images of the calibration plate at different poses, and (9) calculating the geometric mapping relation between the point cloud and the image. The invention obtains the characteristic fold points of the three-dimensional line point clouds by a discrete curvature extremum method, segments the three-dimensional point cloud according to these fold points, and uses the geometric characteristics of the calibration target as the basis of nonlinear optimization, so that the calibration of the three-dimensional laser scanner and the camera is more accurate and reliable.

Description

Three-dimensional laser scanner and camera calibration method based on sector features
Technical Field
The invention relates to a three-dimensional laser scanner and camera calibration method based on sector characteristics, and belongs to the technical field of three-dimensional point cloud data processing and three-dimensional scene reconstruction.
Background
In the process of digitalizing the real world, three-dimensional point cloud data records the geometric attributes and position information of an object surface, while the two-dimensional image records its color and texture information. Deeply fusing the two forms a new digital medium, three-dimensional color point cloud data, which is a further development of three-dimensional point cloud data and can represent the real world more accurately. In the process of fusing the three-dimensional point cloud and the two-dimensional image, the calibration of the laser scanner and the camera is the key technology that determines the fusion precision; it has strong theoretical significance and application value and is increasingly applied in fields such as industrial inspection, environmental perception and autonomous navigation.
A three-dimensional laser scanner has two working modes: one realizes three-dimensional scanning through the transverse and longitudinal rotation of a single laser, the other through the transverse rotation of multiple lasers; the three-dimensional point clouds obtained in both working modes are regular gridded three-dimensional point cloud data. Three-dimensional laser scanner and camera calibration mainly refers to the following: a calibration scene is scanned and shot with the three-dimensional laser scanner and the camera to obtain, respectively, a three-dimensional point cloud and a two-dimensional image of the calibration scene, and the camera intrinsic matrix as well as the rotation matrix and translation vector between the three-dimensional laser scanner and the camera are solved by means of the camera imaging principle. This determines the geometric mapping relation between the three-dimensional point cloud in the laser coordinate system and the two-dimensional image in the image coordinate system, giving the pixel point corresponding to each laser scanning point.
Extensive research shows that existing three-dimensional laser scanner and camera calibration methods similar to the one disclosed by the invention proceed as follows: a black and white grid calibration plate is manufactured, on which circular holes of the same size are evenly distributed. The calibration plate is scanned with a three-dimensional laser scanner to obtain its three-dimensional point cloud, and at the same time it is shot with a camera to obtain its two-dimensional image. The spatial coordinates of the circle centers are obtained in the laser coordinate system and their pixel coordinates in the camera coordinate system, point geometric constraints are built, and the intrinsic and extrinsic parameters are solved, thereby calibrating the three-dimensional laser scanner and the camera. This method has the following disadvantages: 1) when the three-dimensional point cloud is sparse, the spatial coordinates of the circle centers are difficult to calculate accurately in the laser coordinate system, which affects the calibration accuracy; 2) the structure of the three-dimensional point cloud is not analyzed in depth, the geometric constraints rely excessively on the circular holes of the calibration plate, and the robustness and accuracy of the calibration need to be improved.
Disclosure of Invention
In order to further improve the precision of three-dimensional laser scanner and camera calibration, the invention provides a three-dimensional laser scanner and camera calibration method based on sector features. When a three-dimensional laser scanner and a camera are used to scan and shoot a three-dimensional scene, the method solves the geometric mapping relation between the three-dimensional point cloud in the laser coordinate system and the two-dimensional image in the image coordinate system, thereby realizing accurate fusion between the point cloud of the laser scanner and the image of the camera and acquiring the three-dimensional color point cloud of the scene in real time.
In order to realize the purpose of the invention and solve the problems in the prior art, the invention adopts the technical scheme that: the three-dimensional laser scanner and camera calibration method based on the fan-shaped features comprises the following steps:
step 1, manufacturing a black and white folding fan calibration plate, wherein the unfolding angle of the folding fan calibration plate is 180 degrees and its radius r_a is 50 cm, with the following characteristics: between 0 and 180 degrees there is a peak fold every 30 degrees, 7 peak folds in total, each peak fold is 50 cm long, and the 7 peak folds are coplanar; between 15 and 165 degrees there is a valley fold every 30 degrees, 6 valley folds in total, each valley fold is 50 cm long, and the 6 valley folds are coplanar; starting from 0 degrees, the end points of the peak folds and valley folds are connected in sequence to form the folding fan surface, thereby forming the folding fan calibration plate; the included angle between the plane formed by the 7 peak folds and the plane formed by the 6 valley folds is 30 degrees;
step 2, collecting the three-dimensional point cloud and two-dimensional image of the calibration plate: the three-dimensional laser scanner and the camera are fixed, the peak folds of the calibration plate face the three-dimensional laser scanner and the camera, the calibration plate is scanned with the three-dimensional laser scanner to obtain the three-dimensional point cloud P of the calibration plate, and at the same time the calibration plate is shot with the camera to obtain the two-dimensional image of the calibration plate I = {q_i = (u_i, v_i) | 1 ≤ i ≤ n_i}, wherein q_i = (u_i, v_i) is the i-th pixel point in the two-dimensional image of the calibration plate and n_i is the number of pixel points in the two-dimensional image of the calibration plate; the laser coordinate system [O_l; x, y, z] has origin O_l, and its xy plane is parallel to the base of the three-dimensional laser scanner; the camera coordinate system [O_c; x_c, y_c, z_c] has its origin O_c at the optical center of the camera lens, and its x_c y_c plane is parallel to the image sensor plane; the image coordinate system [O_a; u, v] has its origin O_a at the top-left vertex of the image sensor plane, and the uv plane lies in the image sensor plane;
step 3, extracting the characteristic fold points of the three-dimensional line point clouds: according to the scanning mode of the three-dimensional laser scanner, the three-dimensional point cloud of the calibration plate is decomposed into several line point clouds, and each line point cloud is projected onto its corresponding projection reference coordinate system to form a projection line point cloud; each projection line point cloud is denoised by Gaussian regression to form a smooth projection line point cloud; the discrete curvature of each smooth projection line point cloud is calculated, and the maximum points of the discrete curvature correspond to the characteristic fold points of the line point cloud, specifically comprising the following substeps:
(a) the three-dimensional laser scanner has two working modes: one realizes three-dimensional scanning through the transverse and longitudinal rotation of a single-line laser, whose rotation forms the laser transverse scanning surface and the laser longitudinal scanning surface; the other realizes three-dimensional scanning through the transverse rotation of a multi-line laser, whose rotation forms the laser transverse scanning surface. The three-dimensional point clouds obtained by three-dimensional laser scanners of both working modes are regular gridded three-dimensional point cloud data, so the three-dimensional point cloud P of the calibration plate has the following expression: P = {p_i = (x_i, y_i, z_i) | 1 ≤ i ≤ m_i}, wherein p_i = (x_i, y_i, z_i) is the i-th laser scanning point in the three-dimensional point cloud and m_i is the number of laser scanning points in the three-dimensional point cloud. According to the number of laser scanning lines, the three-dimensional point cloud P is decomposed into several line point clouds, namely P = {P_j | 1 ≤ j ≤ m_j} with P_j = {p_i^j | 1 ≤ i ≤ m_i^j}, wherein P_j is the j-th transverse line point cloud in the three-dimensional point cloud, composed of a series of ordered discrete points distributed on a laser scanning line, m_j is the number of line point clouds in the three-dimensional point cloud, p_i^j is the i-th laser scanning point in the j-th transverse line point cloud, and m_i^j is the number of laser scanning points in the j-th transverse line point cloud. According to the sector distribution of the three-dimensional point cloud of the calibration plate, the three-dimensional point cloud P is also decomposed into several sector point clouds, namely P = {P_k | 1 ≤ k ≤ m_k}, wherein P_k is the k-th sector point cloud in the three-dimensional point cloud, composed of a series of ordered discrete points distributed on a sector of the calibration plate, and m_k is the total number of sector point clouds in the three-dimensional point cloud;
(b) for the j-th transverse line point cloud P_j in the three-dimensional point cloud, a cutting plane of the laser transverse scanning surface is made through its central laser scanning point and the laser optical center, and a projection reference coordinate system is established on this cutting plane; the origin of the projection reference coordinate system is located at the laser optical center, and its first axis coincides with the x-axis of the laser coordinate system; each point of P_j is subjected to the coordinate transformation of formula (1) and projected onto the projection reference coordinate system to form the projection line point cloud P'_j = {p'_i^j | 1 ≤ i ≤ m_i^j}, wherein P'_j is the j-th projection line point cloud and p'_i^j is the i-th projection laser scanning point in the j-th projection line point cloud;
(c) Gaussian regression is used to denoise the j-th projection line point cloud P'_j, obtaining the smooth projection line point cloud P''_j = {p''_i^j | 1 ≤ i ≤ m_i^j}, wherein P''_j is the j-th smooth projection line point cloud and p''_i^j is the i-th smooth projection laser scanning point in the j-th smooth projection line point cloud;
(d) the accumulated chord-length parameterization method is adopted to determine the parameter value t_i^j of each smooth projection laser scanning point p''_i^j; its expression form is described by formula (2). The parameters t_i^j and the smooth projection laser scanning points p''_i^j form a one-to-one mapping relation: for each parameter t_i^j there is a smooth projection laser scanning point p''_i^j corresponding to it, denoted p''^j(t_i^j) = (x''^j(t_i^j), y''^j(t_i^j)), wherein {t_i^j} is the parameter set and x''^j(t_i^j) and y''^j(t_i^j) are the coordinates of the point corresponding to the parameter t_i^j; thus the points p''_i^j are expressed as a discrete vector function p''^j(t) of the parameter t;
(e) the derivatives of the discrete functions x''^j(t) and y''^j(t) at t_i^j are estimated according to formula (3) and formula (4), wherein dx''^j(t_i^j)/dt is the derivative of the discrete function x''^j(t) at t_i^j, dy''^j(t_i^j)/dt is the derivative of the discrete function y''^j(t) at t_i^j, and m is the neighborhood radius; the derivative of the discrete vector function p''^j(t) at t_i^j, also referred to simply as the discrete derivative, is then given by formula (5);
(f) the unit tangent vector and the discrete curvature of p''^j(t) at t_i^j are estimated by formula (6) and formula (7), wherein the required derivatives are computed in the same way as in formula (3) and formula (4);
(g) using the discrete curvature results calculated in substep (f) of step 3, the discrete curvature maximum points of the j-th smooth projection line point cloud P''_j are obtained; these maximum points correspond to points of the j-th transverse line point cloud P_j in the laser coordinate system, thereby yielding the characteristic fold points of the j-th transverse line point cloud P_j;
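As an illustrative sketch of substeps (b) to (g), not taken from the patent text, the characteristic fold points of one projected scan line could be extracted as follows; NumPy and SciPy are assumed, and the smoothing width, neighborhood radius m and curvature threshold are assumed values:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def fold_point_indices(line_xy, sigma=2.0, m=3, kappa_min=0.05):
    """Return indices of curvature-maximum (fold) points of one projected scan line.

    line_xy   : (N, 2) array holding a projection line point cloud (assumed layout).
    sigma     : Gaussian smoothing width, a simple stand-in for Gaussian regression.
    m         : neighborhood radius used for the discrete derivatives.
    kappa_min : curvature threshold separating folds from flat segments (assumed).
    """
    # Smooth each coordinate (stand-in for the Gaussian-regression denoising of substep (c)).
    x = gaussian_filter1d(line_xy[:, 0], sigma)
    y = gaussian_filter1d(line_xy[:, 1], sigma)

    # Accumulated chord-length parameterization t_i (substep (d)).
    seg = np.hypot(np.diff(x), np.diff(y))
    t = np.concatenate(([0.0], np.cumsum(seg)))

    # Discrete derivatives with respect to t over a +/- m neighborhood
    # (the patent's exact formulas (3)-(5) are not reproduced here).
    def d_dt(f):
        df = np.empty_like(f)
        for i in range(len(f)):
            lo, hi = max(i - m, 0), min(i + m, len(f) - 1)
            df[i] = (f[hi] - f[lo]) / max(t[hi] - t[lo], 1e-12)
        return df

    dx, dy = d_dt(x), d_dt(y)
    ddx, ddy = d_dt(dx), d_dt(dy)

    # Discrete curvature of a planar parametric curve.
    kappa = np.abs(dx * ddy - dy * ddx) / np.maximum((dx**2 + dy**2) ** 1.5, 1e-12)

    # Local maxima of curvature above the threshold are taken as the fold points.
    idx = [i for i in range(1, len(kappa) - 1)
           if kappa[i] >= kappa_min and kappa[i] >= kappa[i - 1] and kappa[i] >= kappa[i + 1]]
    return np.array(idx, dtype=int)
```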
step 4, determining coplanar line point cloud segments: in the laser coordinate system, each line point cloud is segmented at its characteristic fold points, and each segmented part is called a line point cloud segment; straight-line fitting is performed on each line point cloud segment, and the fitted straight line is called a laser scanning line segment; the coplanar laser scanning line segments on adjacent line point clouds are determined to obtain the coplanar line point cloud segments, specifically comprising the following substeps:
(a) using the characteristic fold points of P_j obtained in substep (g) of step 3, P_j is divided into several line point cloud segments and can be expressed as P_j = {P_k^j | 1 ≤ k ≤ m_k^j}, wherein P_k^j is the k-th line point cloud segment in the j-th line point cloud and m_k^j is the number of line point cloud segments in the j-th line point cloud;
(b) the least square method is used to fit a straight line to the k-th line point cloud segment P_k^j in the j-th line point cloud; the fitted straight line is called the laser scanning line segment L_k^j, and L_j = {L_k^j | 1 ≤ k ≤ m_k^j}, wherein L_k^j is the k-th laser scanning line segment in the j-th line point cloud and L_j is the set of laser scanning line segments in the j-th line point cloud;
(c) for the sets of laser scanning line segments L_j and L_{j+1}, formula (8), which tests whether two straight lines are coplanar, is used to determine the coplanar laser scanning line segments in L_j and L_{j+1}; if the laser scanning line segments are coplanar, the corresponding line point cloud segments are coplanar and consist of laser scanning points on the same sector, called a sector point cloud P_k, and P = {P_k | 1 ≤ k ≤ m_k};
(o_1 - o_2)·(l_1 × l_2) = 0      (8)
wherein o_1 and o_2 are points on the two straight lines, and l_1 and l_2 are the direction vectors of the two straight lines;
step 5, estimating the center point and end points of the folding fan calibration plate: in the laser coordinate system, plane fitting is performed on each sector point cloud to obtain a fitted plane, called a sector; the intersection lines of adjacent sectors are calculated, and these intersection lines are the fold lines of the folding fan calibration plate; the center point of the folding fan calibration plate is estimated from the intersection of the fold lines, and the end points of the folding fan calibration plate are estimated from the direction vectors of the fold lines and the fold line length of the folding fan calibration plate, specifically comprising the following substeps:
(a) the least square method is used to fit a plane to the sector point cloud P_k; the fitted plane is called the sector F_k, and F = {F_k | 1 ≤ k ≤ m_k}, wherein F_k is the k-th sector and F is the set of all sectors;
(b) for adjacent sectors F_k and F_{k+1}, solving the two plane equations simultaneously yields the intersection line S_k with direction vector d_k, and S = {S_k | 1 ≤ k ≤ m_k - 1} and D = {d_k | 1 ≤ k ≤ m_k - 1}; the intersection lines are the fold lines of the folding fan calibration plate, wherein S is the set of fold lines of the folding fan calibration plate, S_k is the k-th fold line, D is the set of direction vectors of the fold lines, and d_k is the direction vector of the k-th fold line;
(c) for the fold line set S = {S_k | 1 ≤ k ≤ m_k - 1}, the intersection point of the m_k - 1 fold lines of the folding fan calibration plate is calculated; this intersection point is the center point p_c of the folding fan calibration plate;
(d) the end points of the folding fan calibration plate are estimated by formula (9) from the center point p_c, the fold line direction vectors d_k and the fold line length, giving P_e = {p_k^e | 1 ≤ k ≤ m_k - 1}, wherein r_a = 50 cm is the fold line length of the folding fan calibration plate, P_e is the set of end points of the folding fan calibration plate, and p_k^e is the k-th end point of the folding fan calibration plate;
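The plane fitting, fold-line intersection and center/end-point estimation of step 5 could be sketched as follows (NumPy assumed; the orientation rule for the fold-line directions and the use of sector centroids are assumptions, since formula (9) is not reproduced above):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (point on plane c, unit normal n)."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[2]                      # direction of smallest spread = plane normal

def plane_intersection(c1, n1, c2, n2):
    """Intersection line of two planes: returns (point s on the line, unit direction d)."""
    d = np.cross(n1, n2)
    d = d / np.linalg.norm(d)
    # One point satisfying both plane equations, with a third constraint along d.
    A = np.vstack([n1, n2, d])
    b = np.array([np.dot(n1, c1), np.dot(n2, c2), 0.0])
    return np.linalg.solve(A, b), d

def center_point(lines):
    """Least-squares intersection point of several fold lines given as (s_k, d_k) pairs."""
    A = np.zeros((3, 3)); b = np.zeros(3)
    for s, d in lines:
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the line direction
        A += P; b += P @ s
    return np.linalg.solve(A, b)

def end_points(p_c, lines, sector_centroids, r_a=0.5):
    """End points at fold-length distance r_a (0.5 m) from the center along each fold line.
    Each direction is oriented from the center toward the centroid of the adjacent sector
    point clouds; this orientation rule is an assumption."""
    ends = []
    for (s, d), g in zip(lines, sector_centroids):
        if np.dot(g - p_c, d) < 0:
            d = -d
        ends.append(p_c + r_a * d)
    return np.array(ends)
```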
step 6, optimizing the center point and end points of the folding fan calibration plate: for the set of end points P_e of the folding fan calibration plate in the laser coordinate system, according to the alternating arrangement of peak folds and valley folds, P_e is divided into the peak fold end points P_p = {p_k^p | 1 ≤ k ≤ m_p} and the valley fold end points P_v = {p_k^v | 1 ≤ k ≤ m_v}, with m_p + m_v = m_k - 1, wherein P_p is the set of peak fold end points of the folding fan calibration plate, p_k^p is the k-th peak fold end point, m_p is the number of peak fold end points, P_v is the set of valley fold end points, p_k^v is the k-th valley fold end point, and m_v is the number of valley fold end points. Using the constraints that the included angle between adjacent peak folds is 30 degrees and the included angle between adjacent valley folds is 30 degrees, an optimization problem is constructed; the center point p_c, peak fold end points p_k^p and valley fold end points p_k^v estimated in step 5 are taken as the initial values of the optimization problem, and solving it yields the optimized center point p_c*, peak fold end points P_p* = {p_k^{p*} | 1 ≤ k ≤ m_p} and valley fold end points P_v* = {p_k^{v*} | 1 ≤ k ≤ m_v} of the folding fan calibration plate, wherein p_c* is the optimized center point of the folding fan calibration plate in the laser coordinate system, P_p* is the optimized set of peak fold end points, p_k^{p*} is the k-th end point in the optimized peak fold end point set, P_v* is the optimized set of valley fold end points, and p_k^{v*} is the k-th end point in the optimized valley fold end point set;
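For illustration only, step 6 can be sketched as a nonlinear least-squares refinement with SciPy; the residuals below (fold-length terms and 30-degree adjacent-fold-angle terms) are one plausible reading of the optimization problem and not the patent's exact formulation:

```python
import numpy as np
from scipy.optimize import least_squares

def refine(p_c0, peaks0, valleys0, r_a=0.5, angle=np.deg2rad(30.0)):
    """Refine center and end points using fold-length and adjacent-fold-angle residuals.
    p_c0, peaks0, valleys0 are the step-5 estimates (initial values); the objective
    is an assumed reconstruction of the optimization described in step 6."""
    n_p, n_v = len(peaks0), len(valleys0)
    x0 = np.concatenate([p_c0, peaks0.ravel(), valleys0.ravel()])

    def residuals(x):
        p_c = x[:3]
        peaks = x[3:3 + 3 * n_p].reshape(n_p, 3)
        valleys = x[3 + 3 * n_p:].reshape(n_v, 3)
        res = []
        for pts in (peaks, valleys):
            dirs = pts - p_c
            lens = np.linalg.norm(dirs, axis=1)
            res.extend(lens - r_a)                      # fold-length residuals
            u = dirs / lens[:, None]
            for a, b in zip(u[:-1], u[1:]):             # adjacent folds of the same type
                res.append(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)) - angle)
        return np.asarray(res)

    sol = least_squares(residuals, x0)
    p_c = sol.x[:3]
    peaks = sol.x[3:3 + 3 * n_p].reshape(n_p, 3)
    valleys = sol.x[3 + 3 * n_p:].reshape(n_v, 3)
    return p_c, peaks, valleys
```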
step 7, extracting the center point and end points of the folding fan calibration plate in the image coordinate system: in the image coordinate system [O_a; u, v], a corner extraction algorithm is used to obtain the center point q_c, the peak fold end points I_p = {q_k^p | 1 ≤ k ≤ m_p} and the valley fold end points I_v = {q_k^v | 1 ≤ k ≤ m_v} of the folding fan calibration plate, wherein q_c is the center point of the folding fan calibration plate in the image coordinate system, I_p is the set of peak fold end points of the folding fan calibration plate in the image coordinate system, q_k^p is the k-th peak fold end point, I_v is the set of valley fold end points in the image coordinate system, and q_k^v is the k-th valley fold end point of the folding fan calibration plate in the image coordinate system;
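For illustration, sub-pixel corner refinement with OpenCV is one way to realize the corner extraction of step 7; the patent does not name a specific algorithm, and the window size and termination criteria below are assumed values:

```python
import cv2
import numpy as np

def refine_corners(gray, approx_pixels, win=11):
    """Sub-pixel refinement of approximately located calibration-plate corners
    (center point and fold end points) in a grayscale image."""
    pts = np.asarray(approx_pixels, dtype=np.float32).reshape(-1, 1, 2)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 1e-3)
    refined = cv2.cornerSubPix(gray, pts, (win, win), (-1, -1), criteria)
    return refined.reshape(-1, 2)
```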
step 8, obtaining three-dimensional point clouds and two-dimensional images of the calibration plate at different poses: the pose of the folding fan calibration plate is changed, the calibration plate is scanned and shot again, and steps 3 to 7 are repeated for each pose, obtaining the center point and end points of the folding fan calibration plate in the laser coordinate system at the different poses and the corresponding center point and end points of the folding fan calibration plate in the image coordinate system;
step 9, calculating a geometric mapping relation between the point cloud and the image, constructing an over-determined equation set by using a camera pinhole model, calculating the geometric mapping relation between the point cloud and the image, and completing calibration of the three-dimensional laser scanner and the camera, wherein the method specifically comprises the following substeps:
(a) according to the camera pinhole model, the spatial rotation matrix and the spatial translation vector, a geometric mapping model between the center point and end points of the folding fan calibration plate in the laser coordinate system and the center point and end points of the folding fan calibration plate in the image coordinate system is constructed, described by formula (10),
s·[u, v, 1]^T = A·[R t]·[x, y, z, 1]^T      (10)
wherein s is the camera magnification coefficient, e = (u, v) is a center point or end point of the folding fan calibration plate in the image coordinate system, A is the camera intrinsic matrix, [R t] is the extrinsic matrix, R is a 3×3 rotation matrix, t is a 3×1 translation vector, and c = (x, y, z) is the corresponding center point or end point of the folding fan calibration plate in the laser coordinate system;
(b) writing e and c in homogeneous form and letting H = A·[R t] denote the combined 3×4 geometric mapping matrix, wherein T is the vector transposition symbol, formula (11) can be derived from formula (10);
(c) according to the principle of matrix equality, formula (12) can be derived from formula (11);
(d) a system of equations is constructed from formula (12); its expression form is described by formula (13);
(e) step 8 yields the center points, peak fold end points and valley fold end points of the folding fan calibration plate in the laser coordinate system at the different poses; their total number is n = Σ_j (1 + m_p^j + m_v^j), wherein m_p^j is the number of peak fold end points of the folding fan calibration plate at the j-th pose, m_v^j is the number of its valley fold end points at the j-th pose, and each pose contributes one center point. At the same time, the center points, peak fold end points and valley fold end points of the folding fan calibration plate in the image coordinate system at the different poses are obtained, and their number is also n. Using formula (13), an over-determined system of equations is constructed, whose expression form is described by formula (14). Letting F denote its 2n×12 coefficient matrix, the over-determined system is solved by the least square method to obtain the geometric mapping relation H, completing the calibration of the three-dimensional laser scanner and the camera.
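As an illustrative sketch (NumPy assumed; the function name is hypothetical), the 2n×12 coefficient matrix F and a least-squares estimate of H can be formed as follows; fixing the scale of H by the unit-norm SVD solution is an assumption, since the patent only states that the over-determined system is solved by least squares:

```python
import numpy as np

def estimate_H(points_3d, points_2d):
    """Direct linear estimate of the 3x4 geometric mapping H from n >= 6 correspondences
    between laser-frame points (x, y, z) and image points (u, v). Each correspondence
    contributes two rows of the 2n x 12 coefficient matrix F."""
    rows = []
    for (x, y, z), (u, v) in zip(points_3d, points_2d):
        X = [x, y, z, 1.0]
        rows.append(X + [0.0] * 4 + [-u * c for c in X])
        rows.append([0.0] * 4 + X + [-v * c for c in X])
    F = np.asarray(rows)                      # shape (2n, 12)
    _, _, vt = np.linalg.svd(F)
    h = vt[-1]                                # right singular vector of the smallest singular value
    return h.reshape(3, 4)
```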
The invention has the beneficial effects that: the three-dimensional laser scanner and camera calibration method based on sector features comprises the following steps: (1) manufacturing a black and white folding fan calibration plate, (2) collecting the three-dimensional point cloud and two-dimensional image of the calibration plate, (3) extracting the characteristic fold points of the three-dimensional line point clouds, (4) determining coplanar line point cloud segments, (5) estimating the center point and end points of the folding fan calibration plate, (6) optimizing the center point and end points of the folding fan calibration plate, (7) extracting the center point and end points of the folding fan calibration plate in the image coordinate system, (8) obtaining three-dimensional point clouds and two-dimensional images of the calibration plate at different poses, and (9) calculating the geometric mapping relation between the point cloud and the image. Compared with the prior art, the invention has the following advantages: a brand-new calibration target is designed; the structure of the three-dimensional point cloud is analyzed in depth; the characteristic fold points of the three-dimensional line point clouds are obtained by a discrete curvature extremum method; the three-dimensional point cloud is segmented according to these fold points; and the geometric characteristics of the calibration target are used as the basis of nonlinear optimization, so that the calibration of the three-dimensional laser scanner and the camera is more accurate and reliable.
Drawings
FIG. 1 is a flow chart of the method steps of the present invention.
Fig. 2 is a schematic view of the black-and-white folding fan calibration plate, in which fig. (a) is a front view of the black-and-white folding fan calibration plate, and fig. (b) is a left view of the black-and-white folding fan calibration plate.
Fig. 3 is a schematic diagram of point cloud and image acquisition, wherein (a) is the black and white folding fan calibration plate and (b) is the three-dimensional laser scanner and camera.
FIG. 4 is a schematic diagram of extracting the characteristic fold points of a three-dimensional line point cloud.
FIG. 5 is a schematic diagram of determining a coplanar line point cloud segment.
Fig. 6 is a schematic view of estimating the center point and end point of the folding fan calibration plate.
Fig. 7 is a point cloud and image fusion result diagram.
Detailed Description
The invention will be further explained with reference to the drawings.
As shown in fig. 1, the three-dimensional laser scanner and camera calibration method based on sector features includes the following steps:
step 1, manufacturing a black and white folding fan calibration plate, wherein the unfolding angle of the folding fan calibration plate is 180 degrees and its radius r_a is 50 cm, with the following characteristics: between 0 and 180 degrees there is a peak fold every 30 degrees, 7 peak folds in total, each peak fold is 50 cm long, and the 7 peak folds are coplanar; between 15 and 165 degrees there is a valley fold every 30 degrees, 6 valley folds in total, each valley fold is 50 cm long, and the 6 valley folds are coplanar; starting from 0 degrees, the end points of the peak folds and valley folds are connected in sequence to form the folding fan surface, thereby forming the folding fan calibration plate; the included angle between the plane formed by the 7 peak folds and the plane formed by the 6 valley folds is 30 degrees, as shown in fig. 2, wherein (a) is the front view of the black and white folding fan calibration plate and (b) is the left view of the black and white folding fan calibration plate;
step 2, collecting the three-dimensional point cloud and two-dimensional image of the calibration plate: the three-dimensional laser scanner and the camera are fixed, the calibration plate faces the three-dimensional laser scanner and the camera, the calibration plate is scanned with the three-dimensional laser scanner to obtain the three-dimensional point cloud P of the calibration plate, and the calibration plate is shot with the camera to obtain the two-dimensional image of the calibration plate I = {q_i = (u_i, v_i) | 1 ≤ i ≤ n_i}, wherein q_i = (u_i, v_i) is the i-th pixel point in the two-dimensional image of the calibration plate and n_i is the number of pixel points in the two-dimensional image of the calibration plate; the laser coordinate system [O_l; x, y, z] has origin O_l, and its xy plane is parallel to the base of the three-dimensional laser scanner; the camera coordinate system [O_c; x_c, y_c, z_c] has its origin O_c at the optical center of the camera lens, and its x_c y_c plane is parallel to the image sensor plane; the image coordinate system [O_a; u, v] has its origin O_a at the top-left vertex of the image sensor plane, and the uv plane lies in the image sensor plane, as shown in fig. 3, wherein (a) is the black and white folding fan calibration plate and (b) is the three-dimensional laser scanner and camera;
step 3, extracting the characteristic fold points of the three-dimensional line point clouds: according to the scanning mode of the three-dimensional laser scanner, the three-dimensional point cloud of the calibration plate is decomposed into several line point clouds, and each line point cloud is projected onto its corresponding projection reference coordinate system to form a projection line point cloud; each projection line point cloud is denoised by Gaussian regression to form a smooth projection line point cloud; the discrete curvature of each smooth projection line point cloud is calculated, and the maximum points of the discrete curvature correspond to the characteristic fold points of the line point cloud, specifically comprising the following substeps:
(a) the three-dimensional laser scanner has two working modes: one realizes three-dimensional scanning through the transverse and longitudinal rotation of a single-line laser, whose rotation forms the laser transverse scanning surface and the laser longitudinal scanning surface; the other realizes three-dimensional scanning through the transverse rotation of a multi-line laser, whose rotation forms the laser transverse scanning surface. The three-dimensional point clouds obtained by three-dimensional laser scanners of both working modes are regular gridded three-dimensional point cloud data, so the three-dimensional point cloud P of the calibration plate has the following expression: P = {p_i = (x_i, y_i, z_i) | 1 ≤ i ≤ m_i}, wherein p_i = (x_i, y_i, z_i) is the i-th laser scanning point in the three-dimensional point cloud and m_i is the number of laser scanning points in the three-dimensional point cloud. According to the number of laser scanning lines, the three-dimensional point cloud P is decomposed into several line point clouds, namely P = {P_j | 1 ≤ j ≤ m_j} with P_j = {p_i^j | 1 ≤ i ≤ m_i^j}, wherein P_j is the j-th transverse line point cloud in the three-dimensional point cloud, composed of a series of ordered discrete points distributed on a laser scanning line, m_j is the number of line point clouds in the three-dimensional point cloud, p_i^j is the i-th laser scanning point in the j-th transverse line point cloud, and m_i^j is the number of laser scanning points in the j-th transverse line point cloud. According to the sector distribution of the three-dimensional point cloud of the calibration plate, the three-dimensional point cloud P is also decomposed into several sector point clouds, namely P = {P_k | 1 ≤ k ≤ m_k}, wherein P_k is the k-th sector point cloud in the three-dimensional point cloud, composed of a series of ordered discrete points distributed on a sector of the calibration plate, and m_k is the total number of sector point clouds in the three-dimensional point cloud;
(b) for the j-th transverse line point cloud P_j in the three-dimensional point cloud, a cutting plane of the laser transverse scanning surface is made through its central laser scanning point and the laser optical center, and a projection reference coordinate system is established on this cutting plane; the origin of the projection reference coordinate system is located at the laser optical center, and its first axis coincides with the x-axis of the laser coordinate system; each point of P_j is subjected to the coordinate transformation of formula (1) and projected onto the projection reference coordinate system to form the projection line point cloud P'_j = {p'_i^j | 1 ≤ i ≤ m_i^j}, wherein P'_j is the j-th projection line point cloud and p'_i^j is the i-th projection laser scanning point in the j-th projection line point cloud;
(c) Gaussian regression is used to denoise the j-th projection line point cloud P'_j, obtaining the smooth projection line point cloud P''_j = {p''_i^j | 1 ≤ i ≤ m_i^j}, wherein P''_j is the j-th smooth projection line point cloud and p''_i^j is the i-th smooth projection laser scanning point in the j-th smooth projection line point cloud;
(d) the accumulated chord-length parameterization method is adopted to determine the parameter value t_i^j of each smooth projection laser scanning point p''_i^j; its expression form is described by formula (2). The parameters t_i^j and the smooth projection laser scanning points p''_i^j form a one-to-one mapping relation: for each parameter t_i^j there is a smooth projection laser scanning point p''_i^j corresponding to it, denoted p''^j(t_i^j) = (x''^j(t_i^j), y''^j(t_i^j)), wherein {t_i^j} is the parameter set and x''^j(t_i^j) and y''^j(t_i^j) are the coordinates of the point corresponding to the parameter t_i^j; thus the points p''_i^j are expressed as a discrete vector function p''^j(t) of the parameter t;
(e) the derivatives of the discrete functions x''^j(t) and y''^j(t) at t_i^j are estimated according to formula (3) and formula (4), wherein dx''^j(t_i^j)/dt is the derivative of the discrete function x''^j(t) at t_i^j, dy''^j(t_i^j)/dt is the derivative of the discrete function y''^j(t) at t_i^j, and m is the neighborhood radius; the derivative of the discrete vector function p''^j(t) at t_i^j, also referred to simply as the discrete derivative, is then given by formula (5);
(f) the unit tangent vector and the discrete curvature of p''^j(t) at t_i^j are estimated by formula (6) and formula (7), wherein the required derivatives are computed in the same way as in formula (3) and formula (4);
(g) using the discrete curvature results calculated in substep (f) of step 3, the discrete curvature maximum points of the j-th smooth projection line point cloud P''_j are obtained; these maximum points correspond to points of the j-th transverse line point cloud P_j in the laser coordinate system, thereby yielding the characteristic fold points of the j-th transverse line point cloud P_j, as shown in fig. 4;
step 4, determining coplanar line point cloud segments: in the laser coordinate system, each line point cloud is segmented at its characteristic fold points, and each segmented part is called a line point cloud segment; straight-line fitting is performed on each line point cloud segment, and the fitted straight line is called a laser scanning line segment; the coplanar laser scanning line segments on adjacent line point clouds are determined to obtain the coplanar line point cloud segments, specifically comprising the following substeps:
(a) using the characteristic fold points of P_j obtained in substep (g) of step 3, P_j is divided into several line point cloud segments and can be expressed as P_j = {P_k^j | 1 ≤ k ≤ m_k^j}, wherein P_k^j is the k-th line point cloud segment in the j-th line point cloud and m_k^j is the number of line point cloud segments in the j-th line point cloud;
(b) the least square method is used to fit a straight line to the k-th line point cloud segment P_k^j in the j-th line point cloud; the fitted straight line is called the laser scanning line segment L_k^j, and L_j = {L_k^j | 1 ≤ k ≤ m_k^j}, wherein L_k^j is the k-th laser scanning line segment in the j-th line point cloud and L_j is the set of laser scanning line segments in the j-th line point cloud;
(c) for the sets of laser scanning line segments L_j and L_{j+1}, formula (8), which tests whether two straight lines are coplanar, is used to determine the coplanar laser scanning line segments in L_j and L_{j+1}; if the laser scanning line segments are coplanar, the corresponding line point cloud segments are coplanar and consist of laser scanning points on the same sector, called a sector point cloud P_k, and P = {P_k | 1 ≤ k ≤ m_k};
(o_1 - o_2)·(l_1 × l_2) = 0      (8)
wherein o_1 and o_2 are points on the two straight lines, and l_1 and l_2 are the direction vectors of the two straight lines;
step 5, estimating the center point and end points of the folding fan calibration plate: in the laser coordinate system, plane fitting is performed on each sector point cloud to obtain a fitted plane, called a sector; the intersection lines of adjacent sectors are calculated, and these intersection lines are the fold lines of the folding fan calibration plate; the center point of the folding fan calibration plate is estimated from the intersection of the fold lines, and the end points of the folding fan calibration plate are estimated from the direction vectors of the fold lines and the fold line length of the folding fan calibration plate, specifically comprising the following substeps:
(a) the least square method is used to fit a plane to the sector point cloud P_k; the fitted plane is called the sector F_k, and F = {F_k | 1 ≤ k ≤ m_k}, wherein F_k is the k-th sector and F is the set of all sectors;
(b) for adjacent sectors F_k and F_{k+1}, solving the two plane equations simultaneously yields the intersection line S_k with direction vector d_k, and S = {S_k | 1 ≤ k ≤ m_k - 1} and D = {d_k | 1 ≤ k ≤ m_k - 1}; the intersection lines are the fold lines of the folding fan calibration plate, wherein S is the set of fold lines of the folding fan calibration plate, S_k is the k-th fold line, D is the set of direction vectors of the fold lines, and d_k is the direction vector of the k-th fold line;
(c) for the fold line set S = {S_k | 1 ≤ k ≤ m_k - 1}, the intersection point of the m_k - 1 fold lines of the folding fan calibration plate is calculated; this intersection point is the center point p_c of the folding fan calibration plate;
(d) the end points of the folding fan calibration plate are estimated by formula (9) from the center point p_c, the fold line direction vectors d_k and the fold line length, giving P_e = {p_k^e | 1 ≤ k ≤ m_k - 1}, wherein r_a = 50 cm is the fold line length of the folding fan calibration plate, P_e is the set of end points of the folding fan calibration plate, and p_k^e is the k-th end point of the folding fan calibration plate;
step 6, optimizing the center point and end points of the folding fan calibration plate: for the set of end points P_e of the folding fan calibration plate in the laser coordinate system, according to the alternating arrangement of peak folds and valley folds, P_e is divided into the peak fold end points P_p = {p_k^p | 1 ≤ k ≤ m_p} and the valley fold end points P_v = {p_k^v | 1 ≤ k ≤ m_v}, with m_p + m_v = m_k - 1, wherein P_p is the set of peak fold end points of the folding fan calibration plate, p_k^p is the k-th peak fold end point, m_p is the number of peak fold end points, P_v is the set of valley fold end points, p_k^v is the k-th valley fold end point, and m_v is the number of valley fold end points. Using the constraints that the included angle between adjacent peak folds is 30 degrees and the included angle between adjacent valley folds is 30 degrees, an optimization problem is constructed; the center point p_c, peak fold end points p_k^p and valley fold end points p_k^v estimated in step 5 are taken as the initial values of the optimization problem, and solving it yields the optimized center point p_c*, peak fold end points P_p* = {p_k^{p*} | 1 ≤ k ≤ m_p} and valley fold end points P_v* = {p_k^{v*} | 1 ≤ k ≤ m_v} of the folding fan calibration plate, wherein p_c* is the optimized center point of the folding fan calibration plate in the laser coordinate system, P_p* is the optimized set of peak fold end points, p_k^{p*} is the k-th end point in the optimized peak fold end point set, P_v* is the optimized set of valley fold end points, and p_k^{v*} is the k-th end point in the optimized valley fold end point set;
step 7, extracting the center point and end points of the folding fan calibration plate in the image coordinate system: in the image coordinate system [O_a; u, v], a corner extraction algorithm is used to obtain the center point q_c, the peak fold end points I_p = {q_k^p | 1 ≤ k ≤ m_p} and the valley fold end points I_v = {q_k^v | 1 ≤ k ≤ m_v} of the folding fan calibration plate, wherein q_c is the center point of the folding fan calibration plate in the image coordinate system, I_p is the set of peak fold end points of the folding fan calibration plate in the image coordinate system, q_k^p is the k-th peak fold end point, I_v is the set of valley fold end points in the image coordinate system, and q_k^v is the k-th valley fold end point of the folding fan calibration plate in the image coordinate system;
step 8, obtaining three-dimensional point clouds and two-dimensional images of the calibration plate at different poses: the pose of the folding fan calibration plate is changed, the calibration plate is scanned and shot again, and steps 3 to 7 are repeated for each pose, obtaining the center point and end points of the folding fan calibration plate in the laser coordinate system at the different poses and the corresponding center point and end points of the folding fan calibration plate in the image coordinate system;
step 9, calculating a geometric mapping relation between the point cloud and the image, constructing an over-determined equation set by using a camera pinhole model, calculating the geometric mapping relation between the point cloud and the image, and completing calibration of the three-dimensional laser scanner and the camera, wherein the method specifically comprises the following substeps:
(a) according to the camera pinhole model, the spatial rotation matrix and the spatial translation vector, a geometric mapping model between the center point and end points of the folding fan calibration plate in the laser coordinate system and the center point and end points of the folding fan calibration plate in the image coordinate system is constructed, described by formula (10),
s·[u, v, 1]^T = A·[R t]·[x, y, z, 1]^T      (10)
wherein s is the camera magnification coefficient, e = (u, v) is a center point or end point of the folding fan calibration plate in the image coordinate system, A is the camera intrinsic matrix, [R t] is the extrinsic matrix, R is a 3×3 rotation matrix, t is a 3×1 translation vector, and c = (x, y, z) is the corresponding center point or end point of the folding fan calibration plate in the laser coordinate system;
(b) writing e and c in homogeneous form and letting H = A·[R t] denote the combined 3×4 geometric mapping matrix, wherein T is the vector transposition symbol, formula (11) can be derived from formula (10);
(c) according to the principle of matrix equality, formula (12) can be derived from formula (11);
(d) a system of equations is constructed from formula (12); its expression form is described by formula (13);
(e) step 8 yields the center points, peak fold end points and valley fold end points of the folding fan calibration plate in the laser coordinate system at the different poses; their total number is n = Σ_j (1 + m_p^j + m_v^j), wherein m_p^j is the number of peak fold end points of the folding fan calibration plate at the j-th pose, m_v^j is the number of its valley fold end points at the j-th pose, and each pose contributes one center point. At the same time, the center points, peak fold end points and valley fold end points of the folding fan calibration plate in the image coordinate system at the different poses are obtained, and their number is also n. Using formula (13), an over-determined system of equations is constructed, whose expression form is described by formula (14). Letting F denote its 2n×12 coefficient matrix, the over-determined system is solved by the least square method to obtain the geometric mapping relation H, completing the calibration of the three-dimensional laser scanner and the camera. The fusion result of the point cloud and the image is shown in fig. 7: the upper left corner of the figure is the two-dimensional image shot by the camera, the right side is the three-dimensional point cloud obtained by the three-dimensional laser scanner, and the lower left corner is the three-dimensional color point cloud obtained by fusing the three-dimensional point cloud and the two-dimensional image; it can be clearly seen that the colors of the brown outdoor wall, the white iron box and the brown indoor wall in the two-dimensional image match the colors of the outdoor wall, the iron box and the indoor wall in the three-dimensional color point cloud.
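For illustration, a minimal sketch of the fusion shown in fig. 7, assuming NumPy, an OpenCV-style BGR image array and the 3×4 mapping H estimated above; all names and the fallback color are assumptions:

```python
import numpy as np

def colorize(points_xyz, image_bgr, H):
    """Fuse a point cloud with an image via the geometric mapping H: each laser point is
    projected into the image and assigned the color of the nearest pixel. Points that
    project outside the image (or behind the camera) keep a default gray color."""
    h, w = image_bgr.shape[:2]
    P = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])   # homogeneous coordinates
    uvw = P @ H.T                                                # (n, 3) projective pixels
    colors = np.full((len(points_xyz), 3), 128, dtype=np.uint8)
    valid = uvw[:, 2] > 1e-9
    u = np.round(uvw[valid, 0] / uvw[valid, 2]).astype(int)
    v = np.round(uvw[valid, 1] / uvw[valid, 2]).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    idx = np.where(valid)[0][inside]
    colors[idx] = image_bgr[v[inside], u[inside]]
    return colors
```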
The invention has the advantages that: a brand-new calibration plate appliance is designed, the structure of three-dimensional point cloud is deeply analyzed, the characteristic fold points of three-dimensional line point cloud are obtained by adopting a discrete curvature extremum method, the three-dimensional point cloud is divided according to the characteristic fold points, and the geometric characteristics of the calibration plate appliance are used as the basis of nonlinear optimization, so that the calibration of the three-dimensional laser scanner and the camera is more accurate and reliable.
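As a rough, non-authoritative illustration of the discrete-curvature-extremum idea used to find the characteristic fold points, the sketch below parameterizes an ordered, already projected and smoothed 2D scan line by accumulated chord length and flags the strongest local curvature maxima; it uses the equivalent plane-curve curvature formula rather than formulas (5) and (6) verbatim, and the names and thresholding are assumptions, not the patent's exact procedure.

import numpy as np

def chord_length_params(points):
    # Accumulated chord-length parameter of an ordered (m, 2) point sequence, as in formula (2).
    steps = np.linalg.norm(np.diff(points, axis=0), axis=1)
    return np.concatenate([[0.0], np.cumsum(steps)])

def discrete_curvature(points):
    # Plane-curve curvature |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2) from finite differences
    # over the chord-length parameter (non-uniform spacing handled by np.gradient).
    t = chord_length_params(points)
    x, y = points[:, 0], points[:, 1]
    dx, dy = np.gradient(x, t), np.gradient(y, t)
    ddx, ddy = np.gradient(dx, t), np.gradient(dy, t)
    return np.abs(dx * ddy - dy * ddx) / (dx ** 2 + dy ** 2) ** 1.5

def fold_point_indices(points, num_folds):
    # Indices of the num_folds strongest local curvature maxima: candidate fold points
    # where the projected scan line crosses a crest or valley of the calibration plate.
    kappa = discrete_curvature(points)
    inner = (kappa[1:-1] > kappa[:-2]) & (kappa[1:-1] > kappa[2:])
    candidates = np.where(inner)[0] + 1
    order = np.argsort(kappa[candidates])[::-1]
    return candidates[order[:num_folds]]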

Claims (1)

1. The three-dimensional laser scanner and camera calibration method based on the fan-shaped features is characterized by comprising the following steps of:
step 1, manufacturing a black and white folding fan calibration plate, wherein the unfolding angle of the folding fan calibration plate is 180 degrees and the radius r_a is 50 cm, with the following characteristics: from 0 degrees to 180 degrees there is a peak pleat every 30 degrees, 7 peak pleats in total, each 50 cm long, and the 7 peak pleats are coplanar; from 15 degrees to 165 degrees there is a valley fold every 30 degrees, 6 valley folds in total, each 50 cm long, and the 6 valley folds are coplanar; starting at 0 degrees, the end points of the peak pleats and valley folds are connected in sequence to form the folded fan surface, forming the folding fan calibration plate; the included angle between the plane formed by the 7 peak pleats and the plane formed by the 6 valley folds is 30 degrees;
step 2, collecting the three-dimensional point cloud and two-dimensional image of the calibration plate: the three-dimensional laser scanner and the camera are fixed and the calibration plate is placed facing them; the three-dimensional laser scanner scans the calibration plate to obtain its three-dimensional point cloud P, and the camera photographs the calibration plate to obtain its two-dimensional image I = {q_i = (u_i, v_i) | 1 ≤ i ≤ n_i}, wherein q_i = (u_i, v_i) is the i-th pixel point in the two-dimensional image of the calibration plate and n_i is the number of pixel points in the two-dimensional image of the calibration plate; the laser coordinate system [O_l; x, y, z] has its origin O_l at the laser optical center, and its xy plane is parallel to the base of the three-dimensional laser scanner; the camera coordinate system [O_c; x_c, y_c, z_c] has its origin O_c at the optical center of the camera lens, and its x_c y_c plane is parallel to the image sensor plane; the image coordinate system [O_a; u, v] has its origin O_a at the top-left vertex of the image sensor plane, and its uv plane lies in the image sensor plane;
step 3, extracting the characteristic fold points of the three-dimensional line point clouds: according to the scanning mode of the three-dimensional laser scanner, the three-dimensional point cloud of the calibration plate is decomposed into a plurality of line point clouds, and each line point cloud is projected onto its corresponding projection reference coordinate system to form a projection line point cloud; each projection line point cloud is denoised by Gaussian regression to form a smoothed projection line point cloud; the discrete curvature of each smoothed projection line point cloud is calculated, and the maximum points of the discrete curvature correspond to the characteristic fold points of the line point cloud, which specifically comprises the following substeps:
(a) the three-dimensional laser scanner has two working modes: one realizes three-dimensional scanning through the horizontal and vertical rotation of a single-line laser, which forms laser horizontal scanning planes and laser vertical scanning planes; the other realizes three-dimensional scanning through the horizontal rotation of a multi-line laser, which forms laser horizontal scanning planes; the three-dimensional point clouds obtained by scanners of both working modes are regular-grid three-dimensional point cloud data, so the three-dimensional point cloud P of the calibration plate has the following expression: P = {p_i = (x_i, y_i, z_i) | 1 ≤ i ≤ m_i}, wherein p_i = (x_i, y_i, z_i) is the i-th laser scanning point in the three-dimensional point cloud and m_i is the number of laser scanning points in the three-dimensional point cloud; according to the number of laser scanning lines, the three-dimensional point cloud P is decomposed into several line point clouds, namely P = {P_j | 1 ≤ j ≤ m_j} with P_j = {p_i^j | 1 ≤ i ≤ m_i^j}, wherein P_j is the j-th transverse line point cloud in the three-dimensional point cloud, composed of a series of ordered discrete points distributed on one laser scanning line, m_j is the number of line point clouds in the three-dimensional point cloud, p_i^j is the i-th laser scanning point in the j-th transverse line point cloud, and m_i^j is the number of laser scanning points in the j-th transverse line point cloud; according to the sector distribution of the three-dimensional point cloud of the calibration plate, the three-dimensional point cloud P is also decomposed into several sector point clouds, namely P = {P_k | 1 ≤ k ≤ m_k}, wherein P_k is the k-th sector point cloud in the three-dimensional point cloud, composed of a series of ordered discrete points distributed on one sector of the calibration plate, and m_k is the total number of sector point clouds in the three-dimensional point cloud;
(b) for the j-th transverse line point cloud P_j, a cutting plane of the laser transverse scanning plane is made through the central laser scanning point and the laser optical center, and a projection reference coordinate system is established on the cutting plane; the origin of the projection reference coordinate system is located at the laser optical center, and its first axis coincides with the x-axis of the laser coordinate system; P_j is subjected to the coordinate transformation of formula (1) and projected onto the projection reference coordinate system to form the projection line point cloud P̃_j = {p̃_i^j | 1 ≤ i ≤ m_i^j}, wherein P̃_j is the j-th projection line point cloud and p̃_i^j is the i-th projected laser scanning point in the j-th projection line point cloud;
[formula (1): coordinate transformation projecting the laser scanning points of P_j from the laser coordinate system onto the projection reference coordinate system]
(c) Gaussian regression is applied to the j-th projection line point cloud P̃_j for noise reduction, obtaining the smoothed projection line point cloud P̄_j = {p̄_i^j | 1 ≤ i ≤ m_i^j}, wherein P̄_j is the j-th smoothed projection line point cloud and p̄_i^j is the i-th smoothed projection laser scanning point in the j-th smoothed projection line point cloud;
(d) the parameter value t_i^j of the smoothed projection laser scanning point p̄_i^j is determined by the accumulated chord length parameterization method, described by formula (2):
t_1^j = 0,  t_i^j = t_{i−1}^j + ‖p̄_i^j − p̄_{i−1}^j‖,  i = 2, …, m_i^j    (2)
the parameters t_i^j and the smoothed projection laser scanning points p̄_i^j form a one-to-one mapping: for each parameter t_i^j there is a smoothed projection laser scanning point p̄_i^j corresponding to it, written p̄_i^j = p̄^j(t_i^j) = (x̄^j(t_i^j), ȳ^j(t_i^j)), wherein T^j = {t_i^j | 1 ≤ i ≤ m_i^j} is the parameter set and x̄^j(t_i^j) and ȳ^j(t_i^j) are the two coordinate components at parameter t_i^j; thus the points p̄_i^j are expressed as a discrete vector function p̄^j(t) of the parameter t;
(e) the derivatives of the discrete functions x̄^j(t) and ȳ^j(t) at t_i^j are estimated according to formula (3) and formula (4):
[formula (3): estimate of the derivative x̄′^j(t_i^j) from the points within a neighborhood of radius m]
[formula (4): estimate of the derivative ȳ′^j(t_i^j) from the points within a neighborhood of radius m]
wherein x̄′^j(t_i^j) is the derivative of the discrete function x̄^j(t) at t_i^j, ȳ′^j(t_i^j) is the derivative of the discrete function ȳ^j(t) at t_i^j, and m is the radius of the neighborhood; the derivative of the discrete vector function p̄^j(t) at t_i^j is then
p̄′^j(t_i^j) = (x̄′^j(t_i^j), ȳ′^j(t_i^j))
wherein p̄′^j(t_i^j) is also referred to as the discrete derivative for short;
(f) the unit tangent vector ᾱ^j(t_i^j) and the discrete curvature κ̄^j(t_i^j) of p̄^j(t) at t_i^j are estimated as
ᾱ^j(t_i^j) = p̄′^j(t_i^j) / ‖p̄′^j(t_i^j)‖    (5)
κ̄^j(t_i^j) = ‖ᾱ′^j(t_i^j)‖ / ‖p̄′^j(t_i^j)‖    (6)
wherein the derivative ᾱ′^j(t_i^j) of the unit tangent vector is estimated by the same method as formula (3) and formula (4);
(g) using the discrete curvature calculated in sub-step (f) of step 3, the discrete curvature of the j-th smoothed projection line point cloud P̄_j is obtained; its maximum points correspond to the characteristic fold points of the j-th transverse line point cloud P_j in the laser coordinate system, so the characteristic fold points of the j-th transverse line point cloud P_j are obtained;
step 4, determining coplanar line point cloud segments: in the laser coordinate system, each line point cloud is segmented at its characteristic fold points, and each resulting part is called a line point cloud segment; a straight line is fitted to each line point cloud segment, and the fitted straight line is called a laser scanning line segment; coplanar laser scanning line segments on adjacent line point clouds are determined to obtain the coplanar line point cloud segments, which specifically comprises the following substeps:
(a) the characteristic fold points of P_j obtained in sub-step (g) of step 3 are used to divide P_j into several line point cloud segments, so P_j can be expressed as P_j = {P_j^k | 1 ≤ k ≤ m_k^j}, wherein P_j^k is the k-th line point cloud segment in the j-th line point cloud and m_k^j is the number of line point cloud segments in the j-th line point cloud;
(b) a straight line is fitted to the k-th line point cloud segment P_j^k in the j-th line point cloud by the least square method; the fitted straight line is called a laser scanning line segment l_j^k, and L_j = {l_j^k | 1 ≤ k ≤ m_k^j}, wherein l_j^k is the k-th laser scanning line segment in the j-th line point cloud and L_j is the set of laser scanning line segments in the j-th line point cloud;
(c) for the sets of laser scanning line segments L_j and L_{j+1} of adjacent line point clouds, the coplanarity criterion of formula (8) is used to determine which laser scanning line segments are coplanar; if two laser scanning line segments are coplanar, the corresponding line point cloud segments are coplanar and consist of laser scanning points on the same sector, called a sector point cloud P_k, and P = {P_k | 1 ≤ k ≤ m_k};
(o_1 − o_2) · (l_1 × l_2) = 0    (8)
wherein o_1 and o_2 are points on the two straight lines, and l_1 and l_2 are the direction vectors of the two straight lines;
step 5, estimating the central point and the end point of the folding fan calibration plate, and performing plane fitting on the sector point cloud under a laser coordinate system to obtain a fitting plane, wherein the fitting plane is called a sector; calculating the intersecting line of the adjacent sectors, wherein the intersecting line is the fold line of the folding fan calibration plate; estimating the center point of the folding fan calibration plate by utilizing the intersection of the folding lines, and estimating the end point of the folding fan calibration plate according to the direction vector of the folding lines and the folding line length of the folding fan calibration plate, wherein the method specifically comprises the following substeps:
(a) a plane is fitted to each sector point cloud P_k by the least square method; the fitted plane is called a sector F_k, and F = {F_k | 1 ≤ k ≤ m_k}, wherein F_k is the k-th sector and F is the set of all sectors;
(b) for adjacent sectors F_k and F_{k+1}, solving the two plane equations simultaneously gives the intersection line S_k with direction vector d_k, and S = {S_k | 1 ≤ k ≤ m_k − 1} and D = {d_k | 1 ≤ k ≤ m_k − 1}; the intersection lines are the fold lines of the folding fan calibration plate, wherein S is the set of fold lines of the folding fan calibration plate, S_k is the k-th fold line of the folding fan calibration plate, D is the set of direction vectors of the fold lines of the folding fan calibration plate, and d_k is the direction vector of the k-th fold line of the folding fan calibration plate;
(c) for the fold line set S = {S_k | 1 ≤ k ≤ m_k − 1} of the folding fan calibration plate, the common intersection point of the m_k − 1 fold lines is calculated; this intersection point is the center point p_c of the folding fan calibration plate;
(d) the end points p_e^k of the folding fan calibration plate are estimated by formula (9), and P_e = {p_e^k | 1 ≤ k ≤ m_k − 1}:
p_e^k = p_c + r_a·d_k    (9)
wherein r_a = 50 cm is the fold line length of the folding fan calibration plate, P_e is the set of end points of the folding fan calibration plate, and p_e^k is the k-th end point of the folding fan calibration plate;
step 6, optimizing the center point and end points of the folding fan calibration plate: for the end point set P_e of the folding fan calibration plate in the laser coordinate system, according to the alternating arrangement of peak pleats and valley pleats, P_e is divided into the peak pleat end points P_p = {p_p^k | 1 ≤ k ≤ m_p} and the valley pleat end points P_v = {p_v^k | 1 ≤ k ≤ m_v}, with m_p + m_v = m_k − 1, wherein P_p is the set of peak pleat end points of the folding fan calibration plate, p_p^k is the k-th peak pleat end point, m_p is the number of peak pleat end points, P_v is the set of valley pleat end points of the folding fan calibration plate, p_v^k is the k-th valley pleat end point, and m_v is the number of valley pleat end points; an optimization problem is constructed using the constraints that the included angle between adjacent peak pleats is 30 degrees and the included angle between adjacent valley pleats is 30 degrees:
[optimization problem of step 6: the center point and end points are adjusted so that the included angle between adjacent peak pleats and the included angle between adjacent valley pleats are both 30 degrees]
the center point p_c, peak pleat end points p_p^k and valley pleat end points p_v^k of the folding fan calibration plate estimated in step 5 are taken as the initial values of the optimization problem; solving it yields the optimized center point p_c*, peak pleat end points P_p* = {p_p^k* | 1 ≤ k ≤ m_p} and valley pleat end points P_v* = {p_v^k* | 1 ≤ k ≤ m_v} of the folding fan calibration plate, wherein p_c* is the optimized center point of the folding fan calibration plate in the laser coordinate system, P_p* is the set of optimized peak pleat end points of the folding fan calibration plate in the laser coordinate system, p_p^k* is the k-th peak pleat end point in that set, P_v* is the set of optimized valley pleat end points of the folding fan calibration plate in the laser coordinate system, and p_v^k* is the k-th valley pleat end point in that set;
step 7, extracting the center point and end points of the folding fan calibration plate in the image coordinate system: in the image coordinate system [O_a; u, v], a corner extraction algorithm is used to obtain the center point q_c, the peak pleat end points I_p = {q_p^k | 1 ≤ k ≤ m_p} and the valley pleat end points I_v = {q_v^k | 1 ≤ k ≤ m_v} of the folding fan calibration plate, wherein q_c is the center point of the folding fan calibration plate in the image coordinate system, I_p is the set of peak pleat end points of the folding fan calibration plate in the image coordinate system, q_p^k is the k-th peak pleat end point of the folding fan calibration plate in the image coordinate system, I_v is the set of valley pleat end points of the folding fan calibration plate in the image coordinate system, and q_v^k is the k-th valley pleat end point of the folding fan calibration plate in the image coordinate system;
step 8, obtaining the three-dimensional point cloud and two-dimensional image of the calibration plate under different poses: the pose of the folding fan calibration plate is changed and the calibration plate is scanned and photographed again, obtaining the three-dimensional point clouds and two-dimensional images of the calibration plate under m_a different poses; steps 3 to 7 are repeated to obtain the center point and end points of the folding fan calibration plate in the laser coordinate system under each pose and the corresponding center point and end points of the folding fan calibration plate in the image coordinate system;
step 9, calculating the geometric mapping relation between the point cloud and the image: an over-determined equation set is constructed by using the camera pinhole model and solved to obtain the geometric mapping relation between the point cloud and the image, completing the calibration of the three-dimensional laser scanner and the camera, wherein the method specifically comprises the following substeps:
(a) constructing a geometric mapping relation model between the center point and end points of the folding fan calibration plate in the laser coordinate system and the center point and end points of the folding fan calibration plate in the image coordinate system according to the camera pinhole model, the spatial rotation matrix and the spatial translation vector, described by formula (10):
s·[u, v, 1]^T = A·[R t]·[x, y, z, 1]^T    (10)
wherein s is the camera magnification factor, e = (u, v) is a center point or end point of the folding fan calibration plate in the image coordinate system, A is the camera internal reference matrix, [R t] is the external reference matrix, R is a 3 × 3 rotation matrix, t is a 3 × 1 translation vector, and c = (x, y, z) is the corresponding center point or end point of the folding fan calibration plate in the laser coordinate system;
(b) let
ẽ = [u, v, 1]^T,
c̃ = [x, y, z, 1]^T,
H = A·[R t],
h_1^T, h_2^T, h_3^T be the three rows of H,
h = [h_1^T, h_2^T, h_3^T]^T,
wherein T is the vector transposition symbol; formula (11) then follows from formula (10):
s·ẽ = H·c̃    (11)
(c) by the equality of corresponding matrix elements, formula (12) follows from formula (11):
u = (h_1^T·c̃)/(h_3^T·c̃),  v = (h_2^T·c̃)/(h_3^T·c̃)    (12)
(d) rearranging formula (12) gives, for one point pair, the equation set of formula (13):
h_1^T·c̃ − u·(h_3^T·c̃) = 0
h_2^T·c̃ − v·(h_3^T·c̃) = 0    (13)
(e) step 8 provides the center points, peak pleat end points and valley pleat end points of the folding fan calibration plate in the laser coordinate system under the different poses; their total number is
n = Σ_{j=1}^{m_a} (1 + m_p^j + m_v^j)
wherein m_a is the number of calibration plate poses, m_p^j is the number of peak pleat end points of the folding fan calibration plate at the j-th pose, m_v^j is the number of valley pleat end points of the folding fan calibration plate at the j-th pose, and each pose contributes exactly one center point; the center points, peak pleat end points and valley pleat end points of the folding fan calibration plate in the image coordinate system under the same poses are likewise n in number; substituting the n laser/image point pairs (c̃_i, (u_i, v_i)), i = 1, …, n, into formula (13) yields the over-determined equation set of formula (14):
h_1^T·c̃_i − u_i·(h_3^T·c̃_i) = 0
h_2^T·c̃_i − v_i·(h_3^T·c̃_i) = 0,  i = 1, …, n    (14)
Let F be the coefficient matrix of this equation set, each point pair contributing the two rows [c̃_i^T, 0_{1×4}, −u_i·c̃_i^T] and [0_{1×4}, c̃_i^T, −v_i·c̃_i^T]; F is a 2n × 12 matrix, and the over-determined equation set F·h = 0 is solved by the least square method to obtain the geometric mapping relation H, completing the calibration of the three-dimensional laser scanner and the camera.
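For illustration only (outside the claim), the following minimal Python sketch covers the geometric tests of steps 4 and 5: the coplanarity criterion of formula (8) for laser scanning line segments, the fold line obtained as the intersection of two fitted sector planes, and the end-point estimate of formula (9) (reconstructed here as center plus fold-line length times unit direction). A numerical tolerance replaces the exact zero test, and all names are illustrative assumptions.

import numpy as np

def lines_coplanar(o1, l1, o2, l2, tol=1e-6):
    # Formula (8): lines (o1, l1) and (o2, l2) are coplanar iff (o1 - o2) . (l1 x l2) = 0.
    # Real scan data never satisfies this exactly, so a small tolerance is used here.
    return abs(np.dot(o1 - o2, np.cross(l1, l2))) < tol

def sector_intersection(n1, d1, n2, d2):
    # Fold line of two adjacent fitted sector planes n.x = d: its direction is n1 x n2;
    # a point on the line is pinned by requiring zero component along that direction.
    direction = np.cross(n1, n2)
    direction = direction / np.linalg.norm(direction)
    point = np.linalg.solve(np.vstack([n1, n2, direction]),
                            np.array([d1, d2, 0.0]))
    return point, direction

def fan_end_points(center, fold_directions, r_a=0.5):
    # Formula (9): end point = center + r_a * unit fold-line direction, with r_a = 50 cm (0.5 m).
    return [center + r_a * d / np.linalg.norm(d) for d in fold_directions]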
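Likewise outside the claim, the sketch below only hints at the step-6 refinement: starting from the estimated center and end points, a generic nonlinear optimizer adjusts them so that the included angle between adjacent peak pleats and between adjacent valley pleats approaches 30 degrees, with a small penalty for drifting from the initial estimates. The claim's exact objective and constraints are not reproduced, so the weighting and structure here are assumptions.

import numpy as np
from scipy.optimize import minimize

def refine_fan(center0, endpoints0, w=0.1):
    # endpoints0 is ordered as the fan's alternating peak/valley end points, so end points of
    # the same type are two apart; their directions from the center should subtend 30 degrees.
    e0 = np.asarray(endpoints0, dtype=float)
    n = len(e0)
    x0 = np.hstack([np.asarray(center0, dtype=float), e0.ravel()])

    def objective(x):
        c, e = x[:3], x[3:].reshape(n, 3)
        d = e - c
        d = d / np.linalg.norm(d, axis=1, keepdims=True)
        cosang = np.clip(np.sum(d[:-2] * d[2:], axis=1), -1.0, 1.0)
        ang = np.degrees(np.arccos(cosang))        # angles between same-type neighbours
        return np.sum((ang - 30.0) ** 2) + w * np.sum((x - x0) ** 2)

    res = minimize(objective, x0)                  # generic quasi-Newton refinement
    return res.x[:3], res.x[3:].reshape(n, 3)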
CN202010267549.2A 2020-04-08 2020-04-08 Three-dimensional laser scanner and camera calibration method based on sector features Active CN111612844B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010267549.2A CN111612844B (en) 2020-04-08 2020-04-08 Three-dimensional laser scanner and camera calibration method based on sector features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010267549.2A CN111612844B (en) 2020-04-08 2020-04-08 Three-dimensional laser scanner and camera calibration method based on sector features

Publications (2)

Publication Number Publication Date
CN111612844A true CN111612844A (en) 2020-09-01
CN111612844B CN111612844B (en) 2022-10-21

Family

ID=72202276

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010267549.2A Active CN111612844B (en) 2020-04-08 2020-04-08 Three-dimensional laser scanner and camera calibration method based on sector features

Country Status (1)

Country Link
CN (1) CN111612844B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112710235A (en) * 2020-12-21 2021-04-27 北京百度网讯科技有限公司 Calibration method and device of structured light measuring sensor
CN117095065A (en) * 2023-09-18 2023-11-21 合肥埃科光电科技股份有限公司 Calibration method, system and equipment for linear spectrum copolymerization Jiao Weiyi sensor

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108389233A (en) * 2018-02-23 2018-08-10 大连理工大学 The laser scanner and camera calibration method approached based on boundary constraint and mean value
CN109029284A (en) * 2018-06-14 2018-12-18 大连理工大学 A kind of three-dimensional laser scanner based on geometrical constraint and camera calibration method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Xing et al., "Automatic registration of point cloud and image data based on Point-to-Plane ICP", Computer and Digital Engineering (《计算机与数字工程》) *

Also Published As

Publication number Publication date
CN111612844B (en) 2022-10-21

Similar Documents

Publication Publication Date Title
JP4785880B2 (en) System and method for 3D object recognition
EP0526881B1 (en) Three-dimensional model processing method, and apparatus therefor
US20050140670A1 (en) Photogrammetric reconstruction of free-form objects with curvilinear structures
Stamos et al. Integration of range and image sensing for photo-realistic 3D modeling
CN100468457C (en) Method for matching depth image
CN108389233B (en) Laser scanner and camera calibration method based on boundary constraint and mean value approximation
CN107767440A (en) Historical relic sequential images subtle three-dimensional method for reconstructing based on triangulation network interpolation and constraint
CN111612844B (en) Three-dimensional laser scanner and camera calibration method based on sector features
Teutsch Model-based analysis and evaluation of point sets from optical 3D laser scanners
IL178299A (en) Fine stereoscopic image matching and dedicated instrument having a low stereoscopic coefficient
CN112927302B (en) Calibration plate and calibration method for combined calibration of multi-line laser radar and camera
Jung et al. Range image registration based on 2D synthetic images
CN111815710A (en) Automatic calibration method for fisheye camera
Lin et al. Vision system for fast 3-D model reconstruction
Pacheco et al. Reconstruction of high resolution 3D objects from incomplete images and 3D information
Lee et al. Interactive 3D building modeling using a hierarchical representation
Li et al. Using laser measuring and SFM algorithm for fast 3D reconstruction of objects
Wu et al. Photogrammetric reconstruction of free-form objects with curvilinear structures
CN116402904A (en) Combined calibration method based on laser radar inter-camera and monocular camera
Mokhtarian et al. Multi-Scale 3-D Free-Form Surface Smoothing.
Becker et al. Lidar inpainting from a single image
Haala et al. Combining Laser Scanning and Photogrammetry-A Hybrid Approach for Heritage Documentation.
Alshawabkeh et al. 2D-3D feature extraction and registration of real world scenes
Eskandari et al. Covariance Based Differential Geometry Segmentation Techniques for Surface Representation Using Vector Field Framework
Briese Structure line modelling based on terrestrial laserscanner data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant