CN117392268A - Laser scanning mapping method and system based on self-adaption combined CPD and ICP algorithm - Google Patents
- Publication number
- CN117392268A (application CN202311495896.0A)
- Authority
- CN
- China
- Legal status: Pending
Classifications
- G06T11/206 — Drawing of charts or graphs
- G06T7/33 — Determination of transform parameters for the alignment of images (image registration) using feature-based methods
- G06T2207/10028 — Range image; Depth image; 3D point clouds
- G06T2207/10032 — Satellite or aerial image; Remote sensing
- G06T2207/10044 — Radar image
Abstract
The invention discloses a laser scanning mapping method and system based on adaptive combination of the CPD and ICP algorithms, aiming at the problem that the prior art cannot adaptively select a suitable algorithm for point cloud matching across different scene structures. The invention realizes the adaptive combination of the CPD and ICP algorithms and reduces the amount of calculation and the time consumption.
Description
Technical Field
The invention relates to a laser scanning mapping method and system based on adaptive combination of the CPD and ICP algorithms, and belongs to the technical field of autonomous driving.
Background
Autonomous driving technology is developing rapidly, and laser SLAM (Simultaneous Localization and Mapping) plays a vital role in the autonomous driving field. Laser SLAM aims to localize a vehicle and build a map of its environment simultaneously using a laser radar, so that the vehicle can accurately perceive its surroundings and achieve accurate localization and path planning.
In 2014, Ji Zhang and Sanjiv Singh proposed the LOAM (Lidar Odometry and Mapping) algorithm, which is based on point-to-line and point-to-plane feature matching and uses the Iterative Closest Point (ICP) algorithm for point cloud registration. ICP aligns the point cloud data of different frames into the same coordinate system through continuous iterative optimization to achieve accurate matching and fusion. However, when ICP processes unstructured scenes such as squares and streets, degradation occurs because reliable, stable features are difficult to extract; ICP is also easily affected by noise, occlusion, and surface variation, which increases the difficulty of point cloud matching and reduces localization accuracy and mapping reliability.
In contrast, the Coherent Point Drift (CPD) algorithm, based on a variable Gaussian mixture model, extracts features from the point cloud data and optimizes matching with a probabilistic model; it can handle non-rigid deformation, partial occlusion, and similar problems, and has certain advantages in registration accuracy. However, using the CPD algorithm for coarse registration of every frame is unreasonable: although CPD provides more accurate and robust registration results, it also incurs a much higher computational cost. When the scene is structured, using the ICP algorithm alone is advantageous for efficiency and accuracy.
Therefore, in practical applications, how to adaptively select an appropriate algorithm to handle different types of scenes remains a challenge.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, and provides a laser scanning mapping method and system based on adaptive combination of the CPD and ICP algorithms, which realize the adaptive combination of the two algorithms and reduce the amount of calculation and the time consumption.
In order to achieve the above purpose, the invention is realized by adopting the following technical scheme:
in one aspect, the invention provides a laser scanning mapping method based on self-adaptive combination of CPD and ICP algorithms, which comprises the following steps:
s1, acquiring laser radar data;
s2, preprocessing laser radar data to obtain characteristic point cloud data;
s3, performing inter-frame matching between the characteristic point cloud data of the current frame and the characteristic point cloud data of the previous frame:
in the first iteration of inter-frame matching, calculating the scene feature variable and the surface curvature, and judging from them whether to adopt the CPD algorithm for coarse registration of the point cloud;
in the second iteration of inter-frame matching, registering the point cloud with the ICP algorithm;
when the iteration number of inter-frame matching exceeds 2, updating the characteristic point cloud data of the previous frame with the registered characteristic point cloud data of the current frame, and obtaining the laser odometer data of the frame from the updated characteristic point cloud data of the previous frame;
s4, taking the characteristic point cloud data of the current frame as the previous frame of a new round and the characteristic point cloud data of the next frame as the current frame of the new round, performing inter-frame matching between them to obtain the updated point cloud data of the previous frame of the new round, and obtaining the laser odometer data of that frame accordingly;
s5, repeating steps S3-S4 until the current frame of the new round is the last frame, and combining the laser odometer data of all frames into the total laser odometer data;
s6, performing coarse registration on the total laser odometer data with the CPD algorithm and fine registration with the ICP algorithm to obtain the registered total laser odometer data;
and S7, fusing the vehicle pose and constructing the map from the registered total laser odometer data, and outputting the vehicle track and the map.
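The S1-S7 loop above can be sketched as a minimal frame-processing skeleton. The helper names and the stubbed preprocessing and registration steps are illustrative assumptions, not the patent's implementation:

```python
# Minimal sketch of the S1-S7 frame loop; registration is stubbed out.

def preprocess(frame):
    # S2: de-distortion, outlier removal, and voxel filtering would go here.
    return frame

def match_frames(prev, curr, use_cpd):
    # S3: optional CPD coarse registration followed by ICP refinement.
    return curr  # stub: treat the current frame as already registered

def build_odometry(frames, scene_is_unstructured):
    odometry = []
    prev = preprocess(frames[0])
    for raw in frames[1:]:
        curr = preprocess(raw)
        registered = match_frames(prev, curr, use_cpd=scene_is_unstructured)
        odometry.append(registered)  # per-frame laser odometer data (S3)
        prev = registered            # current frame becomes previous frame (S4)
    return odometry                  # S5: combined total laser odometer data

print(len(build_odometry([[0], [1], [2], [3]], scene_is_unstructured=False)))
```

With four input frames the loop yields three inter-frame results, one per consecutive frame pair.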
Further, preprocessing the laser radar data to obtain feature point cloud data, including:
performing motion de-distortion on the laser radar data to obtain de-distorted laser radar data;
removing outliers from the de-distorted laser radar data to obtain laser radar data from which the outliers are removed;
and performing voxel filtering preprocessing on the laser radar data with outliers removed, downsampling it to obtain the characteristic point cloud data.
Further, performing motion de-distortion on the laser radar data to obtain de-distorted laser radar data includes:
all points in a frame are transformed into a unified coordinate system by estimating the radar pose at the acquisition moment of each point in the frame; time interpolation of the estimated inter-frame motion yields the radar pose at the acquisition moment of every point, so that the point cloud is transformed into a unified coordinate system to obtain the de-distorted laser radar data. The expression is:

T_{k-1,i} = ((t_{k-1,i} - t_{k-1}) / (t_k - t_{k-1})) * T_{k-1}

wherein T_{k-1,i} is the estimated radar pose at the acquisition moment of the i-th point in the (k-1)-th frame point cloud data, t_{k-1,i} is the acquisition moment of the i-th point in the (k-1)-th frame point cloud data, T_{k-1} is the inter-frame motion of the (k-1)-th frame, t_{k-1} is the start moment of the inter-frame motion of the (k-1)-th frame, and t_k is the end moment of the inter-frame motion of the (k-1)-th frame.
Further, performing outlier removal on the de-distorted laser radar data to obtain laser radar data from which outliers are removed, including:
the curvature of each point in the de-distorted laser radar data is calculated; the expression is:

c = ‖ Σ_{j∈S, j≠i} (X_{(k,i)}^L - X_{(k,j)}^L) ‖ / ( |S| · ‖X_{(k,i)}^L‖ )

wherein c is the curvature, L denotes the laser radar coordinate system, X_{(k,i)}^L is the i-th point of the k-th frame in the laser radar coordinate system, X_{(k,j)}^L is the j-th point of the k-th frame in the laser radar coordinate system, and S is the set of neighbor points of the i-th point;
judging whether each point is an outlier according to its curvature: if the curvature is greater than a set value the point is removed, otherwise it is retained, obtaining the laser radar data with outliers removed.
Further, performing voxel filtering preprocessing on the laser radar data from which the outliers are removed, and performing downsampling on the laser radar data to obtain feature point cloud data, wherein the method comprises the following steps:
dividing the laser radar data with outliers removed into cube units, wherein each cube unit is used as a voxel;
and taking the nearest point from the voxel center as a sampling point, and combining all the sampling points as characteristic point cloud data.
Further, the inter-frame matching includes point-to-line feature matching and point-to-plane feature matching; the point-to-line feature matching includes:
selecting a point i from the edge point set of the characteristic point cloud data of the previous frame, and selecting, from the edge point set of the characteristic point cloud data of the current frame, the point j nearest to point i and the nearest point l on the same laser scanning beam as point j; the three points are denoted X_{(k+1,i)}^L, X_{(k,j)}^L and X_{(k,l)}^L respectively;
connecting j and l, the shortest distance from point i to the line jl is calculated as:

d_L = ‖ (X_{(k+1,i)}^L - X_{(k,j)}^L) × (X_{(k+1,i)}^L - X_{(k,l)}^L) ‖ / ‖ X_{(k,j)}^L - X_{(k,l)}^L ‖

wherein d_L is the shortest distance from point i to the line jl;
the point-to-plane feature matching includes:
selecting a point i from the plane point set of the characteristic point cloud data of the previous frame, and selecting, from the plane point set of the characteristic point cloud data of the current frame, the point j nearest to point i, the point l nearest to point j in a scanning line adjacent to point j, and the point m nearest to point j in the adjacent frame; the four points are denoted X_{(k+1,i)}^L, X_{(k,j)}^L, X_{(k,l)}^L and X_{(k,m)}^L respectively;
connecting j, l and m, the shortest distance from point i to the plane jlm is calculated as:

d_p = | (X_{(k+1,i)}^L - X_{(k,j)}^L) · ((X_{(k,j)}^L - X_{(k,l)}^L) × (X_{(k,j)}^L - X_{(k,m)}^L)) | / ‖ (X_{(k,j)}^L - X_{(k,l)}^L) × (X_{(k,j)}^L - X_{(k,m)}^L) ‖

wherein d_p is the shortest distance from point i to the plane jlm.
Further, calculating the scene feature variable and the surface curvature in the first iteration of inter-frame matching, and judging from them whether to adopt the CPD algorithm for coarse registration of the point cloud, includes:
calculating the scene feature variable, wherein SFV is the scene feature variable, K_line denotes the number of laser radar beams used, F_n is the number of matched features between the characteristic point cloud data of the current frame and that of the previous frame, and PC_n is the number of points in the frame point cloud data;
comparing the scene feature variable with a threshold V_0 to obtain a comparison result;
the comparison result is verified by calculating the surface curvature:

G = λ_1 · λ_2

wherein C is the covariance matrix of the characteristic point cloud data of the current frame, N is the normal direction vector of the characteristic point cloud data of the current frame, n is the number of nearest neighbor points of the characteristic point cloud data of the current frame, λ_1 and λ_2 are the eigenvalues of the covariance matrix, and G is the surface curvature;
the surface curvature is compared with a threshold G_0:
if the scene feature variable is smaller than the threshold V_0 and the surface curvature is greater than the threshold G_0, the scene is unstructured and the CPD algorithm is introduced for coarse registration; in all other cases the scene is structured and the CPD algorithm is not introduced for coarse registration.
Further, the method for performing coarse registration by using the CPD algorithm comprises the following steps:
let the characteristic point cloud data point set of the previous frame be X = {x_1, x_2, x_3, ..., x_N} and the characteristic point cloud data point set of the current frame be Y = {y_1, y_2, y_3, ..., y_M}, with the transformation relation between the two point sets being X = T(Y, θ); the current frame's point set serves as the components of the Gaussian mixture model, and the data points in the previous frame's point set are regarded as data generated by these components; the probability density function of the Gaussian mixture model is:

p(x) = Σ_{m=1}^{M} (1/M) · p(x|m),  p(x|m) = (1 / (2πσ²)^{D/2}) · exp(−‖x − y_m‖² / (2σ²))

wherein p(x) is the probability distribution function of the previous frame's point set, p(x|m) is the conditional density, x is a data point of the previous frame's point set, m indexes the m-th data point of the current frame's point set, M is the number of data points in the current frame's point set, ‖x − y_m‖ is the distance between a data point of the current frame's point set and a data point of the previous frame's point set, σ² is the covariance value, D is the dimension of the point set, and exp denotes the exponential function;
the posterior probability distribution is calculated from the distribution model parameters; the expression is:

p_old(m|x_n) = exp(−‖x_n − τ(y_m, θ_old)‖² / (2σ_old²)) / ( Σ_{k=1}^{M} exp(−‖x_n − τ(y_k, θ_old)‖² / (2σ_old²)) + (2πσ_old²)^{D/2} · (ω/(1−ω)) · (M/N) )

wherein p_old(m|x_n) is the posterior probability distribution function, x_n is the n-th data point in the previous frame's point set, τ(y_m, θ_old) is the affine transformation function, θ_old are the previous transformation parameters, y_m is the m-th data point in the current frame's point set, ω is the weight of the uniform distribution of the point cloud, M is the number of data points in the current frame's point set, and N is the number of data points in the previous frame's point set;
the model parameters (θ, σ²) are obtained by minimizing the loss function:

cost(θ, σ²) = (1/(2σ²)) Σ_{n=1}^{N} Σ_{m=1}^{M} p_old(m|x_n) · ‖x_n − τ(y_m, θ)‖² + (N_P · D / 2) · ln σ²,  N_P = Σ_{n=1}^{N} Σ_{m=1}^{M} p_old(m|x_n)

wherein cost(θ, σ²) denotes the minimized loss function;
and applying the model parameters to the characteristic point cloud data point set of the current frame to realize coarse registration.
Further, in the second iteration of inter-frame matching, registering the point cloud with the ICP algorithm includes:
the point-to-line and point-to-plane shortest distance functions are unified into a single constraint:

d = loss[R, t]

wherein R and t denote the relative pose transformation between the characteristic point cloud data of the current frame and that of the previous frame;
the increments of R and t are solved by singular value decomposition to realize the ICP point cloud registration:

(R, t) = argmin_{R,t} Σ_{i=1}^{F_n} d_i²

wherein F_n is the number of matched features between the characteristic point cloud data of the previous frame and that of the current frame, and argmin denotes the variable values at which the objective function attains its minimum.
On the other hand, the invention provides a laser scanning mapping system based on self-adaptive combination of CPD and ICP algorithms, which comprises a processor and a storage medium;
the storage medium is used for storing instructions;
the processor is operative to perform the steps of any one of the methods described above in accordance with the instructions.
Compared with the prior art, the invention has the beneficial effects that:
according to the method, whether the scene is unstructured is judged, and whether to use the CPD algorithm for coarse registration of the point cloud is selected adaptively; the CPD algorithm has good robustness and registration accuracy when processing unstructured scenes, so registration accuracy is effectively improved;
according to the invention, through an intelligent algorithm selection mechanism, the scene feature variable is calculated before each frame of point cloud registration and then verified by calculating the surface curvature, so as to decide whether to perform coarse registration with the CPD algorithm; this effectively prevents the CPD algorithm from being applied to structured scenes, reduces the introduction of errors, and reduces the amount of calculation and the time consumption.
Drawings
FIG. 1 is a flow chart of a laser scanning mapping method based on adaptive CPD and ICP algorithms in an embodiment of the invention;
FIG. 2 is a schematic point-to-line calculation diagram of a laser scanning mapping method based on adaptive combination of CPD and ICP algorithms in an embodiment of the present invention;
fig. 3 is a schematic point-to-plane calculation diagram of a laser scanning mapping method based on adaptive combination of CPD and ICP algorithms according to an embodiment of the invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings. The following examples are only for more clearly illustrating the technical aspects of the present invention, and are not intended to limit the scope of the present invention.
Example 1:
as shown in fig. 1, an embodiment of the present invention provides a laser scanning mapping method based on adaptive combination of CPD and ICP algorithm, including the following steps:
s1, acquiring laser radar data.
S2, preprocessing the laser radar data to obtain characteristic point cloud data.
The pretreatment is specifically as follows:
first, the laser radar data is subjected to motion de-distortion:
after laser radar data are acquired from the laser radar, the radar pose at the acquisition moment of each point is estimated, and all points in a frame are transformed into a unified coordinate system. After the inter-frame motion estimation is completed, time interpolation of the estimate yields the radar pose at the acquisition moment of every point, so that the original point cloud data are converted into a unified coordinate system to obtain the de-distorted laser radar data. The expression is:

T_{k-1,i} = ((t_{k-1,i} - t_{k-1}) / (t_k - t_{k-1})) * T_{k-1}

wherein T_{k-1,i} is the estimated radar pose at the acquisition moment of the i-th point in the (k-1)-th frame point cloud data, t_{k-1,i} is the acquisition moment of the i-th point in the (k-1)-th frame point cloud data, T_{k-1} is the inter-frame motion of the (k-1)-th frame, t_{k-1} is the start moment of the inter-frame motion of the (k-1)-th frame, and t_k is the end moment of the inter-frame motion of the (k-1)-th frame.
Each point in the (k-1)-th frame point cloud data can then be transformed to the start moment of the (k-1)-th frame:

P̃_i = T_{k-1,i} · P_i

wherein P̃_i is the i-th point expressed at the start moment of the (k-1)-th frame, T_{k-1,i} is the estimated radar pose at the acquisition moment of the i-th point in the (k-1)-th frame point cloud data, and P_i is the i-th point in the (k-1)-th frame point cloud data.
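The per-point time interpolation above can be sketched as follows; for clarity this assumes a translation-only inter-frame motion (the patent interpolates the full radar pose), and all names are illustrative:

```python
# Sketch of motion de-distortion by linear time interpolation of the
# inter-frame motion (translation-only simplification).
import numpy as np

def dedistort(points, stamps, t_start, t_end, motion):
    """Project each point back to the sweep start using its time ratio.

    points: (N, 3) array, stamps: (N,) acquisition times within the sweep,
    motion: (3,) sensor translation over [t_start, t_end].
    """
    s = (stamps - t_start) / (t_end - t_start)    # per-point interpolation ratio
    return points - s[:, None] * motion[None, :]  # undo each point's motion share

pts = np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
ts = np.array([0.0, 0.1])                         # sweep spans 0.0 .. 0.1 s
fixed = dedistort(pts, ts, 0.0, 0.1, motion=np.array([0.5, 0.0, 0.0]))
print(fixed)  # the last-acquired point is shifted back by the full motion
```

The first point (acquired at the sweep start) is untouched, while the last one is compensated by the whole inter-frame translation.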
Secondly, outlier removal is carried out on the laser radar data after the de-distortion:
Because points with large errors, caused by the measuring laser beam striking a plane at a large incidence angle or by the scanning line passing the edge of an occluded region, introduce large errors into matching, such points are unsuitable as feature points; therefore each data point in the de-distorted laser radar data must be judged to be a plane point or an edge point.
Edge points: when the target point lies at an edge or a corner, the difference between it and the surrounding points is naturally large, and the computed curvature is large. Plane points: conversely, when the target point lies on a plane, the coordinates of the surrounding points are close to those of the target point, and the computed curvature is naturally small.
Accordingly, plane smoothness can be expressed by calculating the curvature of the data points as an index for extracting the feature information of the current frame; the calculation expression is:

c = ‖ Σ_{j∈S, j≠i} (X_{(k,i)}^L - X_{(k,j)}^L) ‖ / ( |S| · ‖X_{(k,i)}^L‖ )

wherein c is the curvature, L denotes the laser radar coordinate system, X_{(k,i)}^L is the i-th point of the k-th frame in the laser radar coordinate system, X_{(k,j)}^L is the j-th point of the k-th frame in the laser radar coordinate system, and S is the set of neighbor points of the i-th point.
Judging whether each point is an outlier according to the curvature of each point, if the curvature is larger than a set value, removing, otherwise, reserving, and obtaining laser radar data after the outlier is removed.
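The curvature test above can be sketched directly; the function and example points are illustrative, not the patent's implementation:

```python
# Sketch of the curvature index used to separate plane points from edge points:
# c = ||sum_{j in S}(X_i - X_j)|| / (|S| * ||X_i||)
import numpy as np

def curvature(cloud, i, neighbors):
    diff = cloud[i] - cloud[neighbors]  # (|S|, 3) differences to each neighbor
    return np.linalg.norm(diff.sum(axis=0)) / (
        len(neighbors) * np.linalg.norm(cloud[i])
    )

cloud = np.array([[1.0, 0, 0], [2, 0, 0], [3, 0, 0], [2, 1, 0]])
flat = curvature(cloud, 1, [0, 2])  # neighbors straddle a straight segment -> ~0
edge = curvature(cloud, 1, [0, 3])  # one neighbor bends away -> larger value
print(flat, edge)
```

Points on a straight or planar patch cancel out in the neighbor sum and score near zero, while edge or corner points score high and are filtered by the threshold.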
Finally, performing voxel filtering pretreatment on the laser radar data with outliers removed, and performing downsampling on the laser radar data:
dividing the laser radar data with outliers removed into small cube units, each cube unit being a voxel; for each voxel, the point closest to the voxel center is selected as a sampling point, and the selected sampling points are combined to form the characteristic point cloud data.
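The voxel filtering step can be sketched as a small nearest-to-center downsampler; the function name and example values are illustrative:

```python
# Sketch of voxel-grid downsampling that keeps, per occupied voxel, the
# point nearest to that voxel's center.
import numpy as np

def voxel_downsample(points, voxel):
    keys = np.floor(points / voxel).astype(int)   # integer voxel index per point
    best = {}
    for p, k in zip(points, map(tuple, keys)):
        center = (np.array(k) + 0.5) * voxel      # geometric center of the voxel
        d = np.linalg.norm(p - center)
        if k not in best or d < best[k][0]:
            best[k] = (d, p)                      # keep the closest point so far
    return np.array([p for _, p in best.values()])

pts = np.array([[0.1, 0.1, 0.1], [0.2, 0.2, 0.2], [0.9, 0.9, 0.9]])
print(len(voxel_downsample(pts, 0.5)))  # two occupied voxels -> two points
```

The first two points share a voxel, so only the one nearer the voxel center survives, giving the downsampled characteristic point cloud.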
S3, carrying out inter-frame matching on the characteristic point cloud data of the current frame and the characteristic point cloud data of the previous frame:
the inter-frame matching is to extract straight line and plane characteristics from the point cloud, and then perform point-to-line and point-to-plane characteristic matching.
First, point-to-line feature matching is performed, as shown in FIG. 2. Let Ē denote the set of edge points in the characteristic point cloud data of the previous frame and E_k the set of edge points in the characteristic point cloud data of the current frame. A point i is selected from Ē; from E_k, the point j nearest to point i and the nearest point l on the same laser scanning beam as point j are selected; the three points are denoted X_{(k+1,i)}^L, X_{(k,j)}^L and X_{(k,l)}^L respectively. The pose transformation problem is thus converted into the shortest distance from point i to the line jl, expressed as:

d_L = ‖ (X_{(k+1,i)}^L - X_{(k,j)}^L) × (X_{(k+1,i)}^L - X_{(k,l)}^L) ‖ / ‖ X_{(k,j)}^L - X_{(k,l)}^L ‖

wherein d_L is the shortest distance from point i to the line jl.
Next, point-to-plane feature matching is performed, as shown in FIG. 3. Let H̄ denote the set of plane points in the characteristic point cloud data of the previous frame and H_k the set of plane points in the characteristic point cloud data of the current frame. A point i is selected from H̄; from H_k, the point j nearest to point i and the point l nearest to point j in a scanning line adjacent to point j are selected, and the point m nearest to point j in the adjacent frame is selected; the four points are denoted X_{(k+1,i)}^L, X_{(k,j)}^L, X_{(k,l)}^L and X_{(k,m)}^L respectively.
Connecting j, l and m, the shortest distance from point i to the plane jlm is calculated as:

d_p = | (X_{(k+1,i)}^L - X_{(k,j)}^L) · ((X_{(k,j)}^L - X_{(k,l)}^L) × (X_{(k,j)}^L - X_{(k,m)}^L)) | / ‖ (X_{(k,j)}^L - X_{(k,l)}^L) × (X_{(k,j)}^L - X_{(k,m)}^L) ‖

wherein d_p is the shortest distance from point i to the plane jlm.
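Both residuals are standard vector-geometry distances and can be sketched as follows; the example coordinates are illustrative:

```python
# Sketch of the point-to-line (d_L) and point-to-plane (d_p) distances used
# for the inter-frame feature matching residuals.
import numpy as np

def point_to_line(i, j, l):
    # d_L = |(i - j) x (i - l)| / |j - l|
    return np.linalg.norm(np.cross(i - j, i - l)) / np.linalg.norm(j - l)

def point_to_plane(i, j, l, m):
    # d_p = |(i - j) . n| / |n|, with plane normal n = (j - l) x (j - m)
    n = np.cross(j - l, j - m)
    return abs(np.dot(i - j, n)) / np.linalg.norm(n)

i = np.array([0.0, 1.0, 0.0])
j = np.array([0.0, 0.0, 0.0])
l = np.array([1.0, 0.0, 0.0])
m = np.array([0.0, 0.0, 1.0])
d_line = point_to_line(i, j, l)      # distance 1.0 from i to the x-axis
d_plane = point_to_plane(i, j, l, m)  # distance 1.0 from i to the xz-plane
print(d_line, d_plane)
```

These two scalars are exactly what the ICP stage later minimizes over all matched features.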
Since the number of features is related not only to the structure of the scene but also to the scan line beams of the radar, the degree of overlap of the point clouds, and the geometric features of the scene, it is necessary, after the first iteration, to calculate the scene feature variable SFV and then verify it by calculating the surface curvature G of the current frame to judge the geometric features of the scene. In the SFV calculation, SFV is the scene feature variable, K_line denotes the number of laser radar beams used, F_n is the number of matched features between the characteristic point cloud data of the current frame and that of the previous frame, and PC_n is the number of points in the frame point cloud data. Because unstructured scenes exhibit diversity, irregularity, occlusion and noise, the number of extractable feature points is far smaller than in structured scenes, and the SFV value is correspondingly smaller.
Assuming the same degree of downsampling for each frame scan and a constant carrier motion speed, there theoretically exists a threshold V_0. After SFV is compared with V_0, the surface curvature G is calculated for verification: if G is smaller than the set threshold G_0, the scene is smooth; if G is greater than or equal to G_0, the scene is irregular.
The surface curvature G is calculated as follows: the covariance matrix C of the characteristic point cloud data of the current frame is computed over its n nearest neighbor points, and the eigenvalues of C, i.e. the maximum and minimum curvature values at the point cloud data P, yield the surface curvature:

G = λ_1 · λ_2

wherein C is the covariance matrix of the characteristic point cloud data of the current frame, N is the normal direction vector of the characteristic point cloud data of the current frame, n is the number of nearest neighbor points of the characteristic point cloud data of the current frame, λ_1 and λ_2 are the eigenvalues of the covariance matrix, and G is the surface curvature.
If the scene feature variable is smaller than the threshold V_0 and the surface curvature is greater than the threshold G_0, the motion speed is high and ICP easily converges to a local extremum in the unstructured scene; the CPD algorithm is then introduced for coarse registration to reduce the absolute error of system operation. In all other cases the scene is structured and the CPD algorithm is not introduced for coarse registration.
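The selection rule above reduces to one boolean check; the threshold values below are hypothetical placeholders, since the patent does not fix V_0 or G_0:

```python
# Sketch of the adaptive CPD/ICP selector (thresholds are illustrative).

def needs_cpd(sfv, surface_curvature, v0=0.5, g0=0.1):
    """True when the scene looks unstructured: few matchable features
    (SFV below V0) confirmed by irregular geometry (G above G0).
    Every other combination keeps the plain ICP path."""
    return sfv < v0 and surface_curvature > g0

print(needs_cpd(0.2, 0.3))   # unstructured: introduce CPD coarse registration
print(needs_cpd(0.8, 0.05))  # structured: ICP only
```

Requiring both conditions is what prevents CPD from running on structured frames, which is where its extra computational cost buys nothing.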
The basic idea of the CPD algorithm is to estimate the registration transformation between the point clouds by maximizing a probability density function. Let the characteristic point cloud data point set of the previous frame be X = {x_1, x_2, x_3, ..., x_N} and the characteristic point cloud data point set of the current frame be Y = {y_1, y_2, y_3, ..., y_M}, with the transformation relation between the two point sets being X = T(Y, θ). The current frame's point set serves as the components of the Gaussian mixture model, and the data points in the previous frame's point set are regarded as data generated by these components. The probability density function of the Gaussian mixture model is:

p(x) = Σ_{m=1}^{M} (1/M) · p(x|m),  p(x|m) = (1 / (2πσ²)^{D/2}) · exp(−‖x − y_m‖² / (2σ²))

wherein p(x) is the probability distribution function of the previous frame's point set, p(x|m) is the conditional density, x is a data point of the previous frame's point set, m indexes the m-th data point of the current frame's point set, M is the number of data points in the current frame's point set, ‖x − y_m‖ is the distance between a data point of the current frame's point set and a data point of the previous frame's point set, σ² is the covariance value, D is the dimension of the point set, and exp denotes the exponential function.
The centers of the Gaussian mixture model are coupled to the transformation parameter θ during registration; finding the desired parameters (θ, σ²) comprises two steps:
E-Step: the posterior probability distribution is calculated by the old distribution model parameters, and the expression is as follows:
wherein p_old(m|x_n) is the posterior probability distribution function, x_n is the n-th data point in the characteristic point cloud data point set of the previous frame, τ(y_m, θ_old) is the affine transformation function, θ_old are the transformation parameters of the previous iteration, y_m is the m-th data point in the characteristic point cloud data point set of the current frame, ω is the weight of the uniform distribution of the point clouds, M is the number of data points in the characteristic point cloud data point set of the current frame, and N is the number of data points in the characteristic point cloud data point set of the previous frame.
M-Step: minimizing the loss function to obtain new model parameters (θ, σ) 2 ) The expression is as follows:
wherein cost(θ, σ²) is the loss function to be minimized.
The model parameters are then applied to the characteristic point cloud data point set of the current frame to realize the final coarse registration.
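As a concrete illustration, the E-step/M-step loop described above can be sketched for the rigid case in NumPy. This is a minimal sketch, not the patent's implementation: the function name, the fixed iteration count, the outlier weight w, and the closed-form rigid M-step (an SVD of the weighted cross-covariance) are illustrative assumptions.

```python
import numpy as np

def cpd_rigid(X, Y, w=0.1, iters=30):
    """Minimal rigid-CPD sketch: align the moving set Y to the fixed set X.

    E-step: GMM posteriors p(m|x_n) with a uniform-outlier term of weight w.
    M-step: closed-form R, t from an SVD of the weighted cross-covariance,
    then an update of the shared isotropic variance sigma^2.
    """
    N, D = X.shape
    M, _ = Y.shape
    # Initialize sigma^2 as the mean squared distance over all point pairs.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)       # N x M
    sigma2 = d2.sum() / (D * N * M)
    R, t = np.eye(D), np.zeros(D)
    for _ in range(iters):
        TY = Y @ R.T + t                                      # moved point set
        d2 = ((X[:, None, :] - TY[None, :, :]) ** 2).sum(-1)
        num = np.exp(-d2 / (2.0 * sigma2))
        c = (2.0 * np.pi * sigma2) ** (D / 2.0) * w * M / ((1.0 - w) * N)
        P = num / (num.sum(axis=1, keepdims=True) + c)        # posteriors, N x M
        Np = P.sum()
        mu_x = P.sum(axis=1) @ X / Np                         # weighted centroids
        mu_y = P.sum(axis=0) @ Y / Np
        A = (X - mu_x).T @ P @ (Y - mu_y)                     # cross-covariance
        U, _, Vt = np.linalg.svd(A)
        S = np.diag([1.0] * (D - 1) + [np.linalg.det(U @ Vt)])
        R = U @ S @ Vt                                        # proper rotation
        t = mu_x - R @ mu_y
        TY = Y @ R.T + t
        sigma2 = max(
            (P * ((X[:, None, :] - TY[None, :, :]) ** 2).sum(-1)).sum() / (Np * D),
            1e-9,
        )
    return R, t
```

On clean data this converges to the rigid transform aligning Y onto X; in the pipeline above, the coarse-aligned cloud would then be handed to ICP for fine registration.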
After coarse registration with the CPD algorithm, a second iteration is performed using the ICP algorithm, whose principle is the same as in LOAM. The two distance-function constraints obtained in step one of the inter-frame matching stage are unified:
loss(R, t) = Σ d_L + Σ d_p
wherein R and t represent the relative pose transformation between the characteristic point cloud data of the current frame and the characteristic point cloud data of the previous frame. After the distance functions are unified, the problem is converted into the minimization problem:

(R, t) = argmin_{R,t} Σ_{n=1}^{F_n} loss_n(R, t)
wherein F_n is the number of matches between the characteristic point cloud data of the current frame and the characteristic point cloud data of the previous frame, and argmin denotes the values of the variables at which the objective function attains its minimum.
The traditional method solves the increments of R and t by Gauss-Newton (G-N) iteration, but G-N iteration can only find a local optimum and cannot guarantee the global optimum; moreover, when the matrix has small singular values, G-N iteration suffers from poor numerical stability and the accuracy drops. Here, the spatial transformation matrix of each iteration step is instead solved directly by singular value decomposition (SVD), which ensures the globally optimal solution, avoids the interference caused by small singular values of the matrix, improves numerical stability and accuracy, and finally yields the increments of R and t.
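For matched point pairs, the SVD solve referred to above is the classical closed-form best-fit rigid transform (the Kabsch solution). The following is a sketch under the assumption of known correspondences; the function name and array layout are illustrative choices, not the patent's code:

```python
import numpy as np

def svd_rigid_fit(P, Q):
    """Closed-form rigid transform (R, t) minimizing sum ||R p_i + t - q_i||^2
    for matched point arrays P, Q of shape (n, D), via SVD of the
    cross-covariance matrix. The determinant guard avoids reflections."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)     # centroids
    H = (P - cp).T @ (Q - cq)                   # cross-covariance, D x D
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # reflection guard
    S = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ S @ U.T
    t = cq - R @ cp
    return R, t
```

Unlike Gauss-Newton, this step needs no initial guess, which is in line with the numerical-stability argument the text makes for preferring SVD.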
When the number of iterations exceeds 2, the point cloud data of the reference frame is updated, yielding the laser odometer data of the frame, which is used in the back-end optimization stage.
The characteristic point cloud data of the current frame is then taken as the characteristic point cloud data of the previous frame in a new round, the characteristic point cloud data of the next frame is taken as the characteristic point cloud data of the current frame in the new round, and inter-frame matching is performed between them. This operation is repeated until the current frame of the new round is the last frame, and the laser odometer data of all frames are combined to obtain the total laser odometer data.
S4, back-end optimization (scan-to-map) stage
Due to the accumulated error of the odometer, a large deviation exists between the estimated pose and the actual pose of the current scanning frame. This makes the initial value of the ICP matching too far off, so that a convergence state cannot be reached. The CPD algorithm has a wider convergence domain, so the CPD algorithm is first used to perform a preliminary non-rigid deformation registration on the total laser odometer data, and the ICP algorithm is then used for fine rigid-transformation optimization. By combining the two algorithms, their respective advantages are fully exploited and the drift of the odometer is better corrected, thereby improving the efficiency and accuracy of point cloud registration.
S5, fusing the output pose and updating the map, and finally outputting the trajectory and the map to provide support for subsequent path planning and navigation tasks.
Example 2:
On the basis of embodiment 1, this embodiment provides a laser scanning mapping system based on adaptively combined CPD and ICP algorithms, which comprises a processor and a storage medium.
The storage medium is for storing instructions.
The processor is operative to perform the steps of the method of embodiment 1 in accordance with the instructions.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that modifications and variations could be made by those skilled in the art without departing from the technical principles of the present invention, and such modifications and variations should also be regarded as being within the scope of the invention.
Claims (10)
1. The laser scanning mapping method based on the self-adaptive CPD and ICP algorithm is characterized by comprising the following steps:
S1, acquiring laser radar data;
S2, preprocessing the laser radar data to obtain characteristic point cloud data;
S3, performing inter-frame matching between the characteristic point cloud data of the current frame and the characteristic point cloud data of the previous frame:
at the first iteration of the inter-frame matching, calculating the scene characteristic variable and the surface curvature, and judging from them whether to adopt the CPD algorithm for coarse registration of the point cloud;
at the second iteration of the inter-frame matching, registering the point cloud with the ICP algorithm;
when the number of iterations of the inter-frame matching is greater than 2, updating the characteristic point cloud data of the previous frame with the registered characteristic point cloud data of the current frame, and obtaining the laser odometer data of the frame from the updated characteristic point cloud data of the previous frame;
S4, taking the characteristic point cloud data of the current frame as the characteristic point cloud data of the previous frame in a new round, taking the characteristic point cloud data of the next frame as the characteristic point cloud data of the current frame in the new round, performing inter-frame matching between them to obtain the updated point cloud data of the previous frame in the new round, and obtaining the laser odometer data of the frame from the updated point cloud data of the previous frame in the new round;
S5, repeating steps S3-S4 until the current frame of the new round is the last frame, and combining the laser odometer data of all frames to obtain the total laser odometer data;
S6, performing coarse registration on the total laser odometer data with the CPD algorithm and fine registration with the ICP algorithm to obtain the registered total laser odometer data;
S7, fusing the vehicle pose and constructing the map from the registered total laser odometer data, and outputting the vehicle trajectory and the map.
2. The laser scanning mapping method based on the adaptive combination of CPD and ICP algorithms according to claim 1, wherein preprocessing the laser radar data to obtain the characteristic point cloud data comprises the following steps:
performing motion de-distortion on the laser radar data to obtain de-distorted laser radar data;
removing outliers from the de-distorted laser radar data to obtain laser radar data from which the outliers are removed;
and performing voxel filtering preprocessing on the laser radar data from which the outliers are removed, and downsampling the laser radar data to obtain the characteristic point cloud data.
3. The laser scanning mapping method based on the adaptive combination of the CPD and the ICP algorithm according to claim 2, wherein the performing the motion de-distortion on the laser radar data to obtain de-distorted laser radar data, includes:
all points in a frame are transformed into a unified coordinate system by estimating the radar pose at the acquisition moment of each point in the frame: time interpolation of the estimated inter-frame motion yields the radar pose at the acquisition moment of every point, so that the point cloud can be transformed into the unified coordinate system to obtain the de-distorted laser radar data, with the expression:

T_{k-1,i} = ((t_{k-1,i} - t_{k-1}) / (t_k - t_{k-1})) · T_{k-1}
wherein T_{k-1,i} is the estimated radar pose at the acquisition moment of the i-th point in the (k-1)-th frame of point cloud data, t_{k-1,i} is the acquisition moment of the i-th point in the (k-1)-th frame of point cloud data, T_{k-1} is the inter-frame motion of the (k-1)-th frame, t_{k-1} is the start moment of the inter-frame motion of the (k-1)-th frame, and t_k is the end moment of the inter-frame motion of the (k-1)-th frame.
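A sketch of the linear time interpolation in this claim, for the planar case. The helper name is hypothetical, and the inter-frame motion is reduced to a yaw angle plus 2-D translation for brevity; a full implementation would interpolate a 3-D pose on SE(3).

```python
import numpy as np

def deskew_point(p, t_i, t_start, t_end, motion):
    """Move point p (captured at time t_i during the sweep) into the
    frame-start coordinate system by applying the fraction of the
    inter-frame motion elapsed at t_i: s = (t_i - t_start) / (t_end - t_start).
    `motion` is (yaw, tx, ty), the estimated motion over the whole sweep."""
    s = (t_i - t_start) / (t_end - t_start)   # elapsed fraction, 0..1
    yaw, tx, ty = motion
    a = s * yaw                               # interpolated rotation angle
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return R @ p + s * np.array([tx, ty])
```

A point captured at the sweep start (s = 0) is left unchanged, and one captured at the sweep end (s = 1) receives the full inter-frame motion.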
4. The laser scanning mapping method based on the adaptive combination of the CPD and the ICP algorithm according to claim 2, wherein the performing outlier removal on the de-distorted laser radar data to obtain the laser radar data from which the outlier is removed includes:
the curvature of each point in the de-distorted laser radar data is calculated, with the expression:

c = (1 / (|S| · ‖X^L_{(k,i)}‖)) · ‖ Σ_{j∈S, j≠i} (X^L_{(k,i)} - X^L_{(k,j)}) ‖
wherein c is the curvature, L denotes the laser radar coordinate system, X^L_{(k,i)} is the i-th point of the k-th frame in the laser radar coordinate system, X^L_{(k,j)} is the j-th point of the k-th frame in the laser radar coordinate system, and S is the set of neighbor points of the i-th point;
judging whether the point is an outlier according to the curvature of each point, if the curvature is larger than a set value, removing, otherwise, reserving, and obtaining laser radar data after the outlier is removed.
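The curvature test of this claim can be sketched with the LOAM-style measure: the norm of the summed differences between a point and its neighbors on the same scan line, normalized by the neighbor count and the point's range. The window size and threshold below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def scan_curvature(scan, i, half_window=5):
    """Curvature of point i in an ordered scan line: ||sum_j (p_i - p_j)||
    over the 2*half_window neighbors, divided by (|S| * ||p_i||).
    Near zero on smooth surfaces, large at edges and outliers."""
    p = scan[i]
    nbrs = np.vstack([scan[i - half_window:i], scan[i + 1:i + 1 + half_window]])
    return np.linalg.norm((p - nbrs).sum(axis=0)) / (len(nbrs) * np.linalg.norm(p))

def remove_outliers(scan, half_window=5, c_max=0.5):
    """Claim 4's rule: drop points whose curvature exceeds the set value;
    border points without a full neighbor window are kept unchanged."""
    n = len(scan)
    keep = [i < half_window or i >= n - half_window
            or scan_curvature(scan, i, half_window) <= c_max
            for i in range(n)]
    return scan[np.array(keep)]
```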
5. The laser scanning mapping method based on the adaptive combination of the CPD and the ICP algorithm according to claim 2, wherein performing voxel filtering preprocessing on the laser radar data from which the outliers are removed, and performing downsampling on the laser radar data to obtain the feature point cloud data comprises:
dividing the laser radar data with outliers removed into cube units, wherein each cube unit is used as a voxel;
and taking the nearest point from the voxel center as a sampling point, and combining all the sampling points as characteristic point cloud data.
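The voxel downsampling of this claim, bucketing points into cubes and keeping, per voxel, the point nearest the voxel center, can be sketched as follows. The edge length is an illustrative parameter, not a value from the patent.

```python
import numpy as np

def voxel_downsample(points, size=0.2):
    """Divide space into cubes of edge `size`; for each occupied voxel keep
    the single point closest to that voxel's center, and return the kept
    points as the feature point cloud."""
    keys = np.floor(points / size).astype(np.int64)
    best = {}                                   # voxel key -> (distance, point)
    for p, k in zip(points, keys):
        kk = tuple(k)
        center = (k + 0.5) * size               # geometric center of the voxel
        d = np.linalg.norm(p - center)
        if kk not in best or d < best[kk][0]:
            best[kk] = (d, p)
    return np.array([p for _, p in best.values()])
```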
6. The laser scanning mapping method based on adaptive combination CPD and ICP algorithms of claim 1, wherein the inter-frame matching includes point-to-line feature matching and point-to-face feature matching;
the point-to-line feature matching includes:
selecting a point i from the edge point set in the characteristic point cloud data of the previous frame, and selecting, from the edge point set of the characteristic point cloud data of the current frame, the point j nearest to point i and the nearest point l on the same laser scanning beam as point j;
connecting j and l, and calculating the shortest distance from point i to line jl, expressed as:

d_L = ‖(p_i - p_j) × (p_i - p_l)‖ / ‖p_j - p_l‖

wherein d_L is the shortest distance from point i to line jl, and p_i, p_j, p_l are the coordinates of points i, j, l;
the point-to-face feature matching includes:
selecting a point i from the plane point set of the characteristic point cloud data of the previous frame, and selecting, from the plane point set of the characteristic point cloud data of the current frame, the point j nearest to point i, the point l nearest to point j in a scanning line adjacent to point j, and the point m nearest to point j in the adjacent frame;
connecting j, l and m, and calculating the shortest distance from point i to plane jlm, expressed as:

d_p = |(p_i - p_j) · ((p_j - p_l) × (p_j - p_m))| / ‖(p_j - p_l) × (p_j - p_m)‖

wherein d_p is the shortest distance from point i to plane jlm, and p_i, p_j, p_l, p_m are the coordinates of points i, j, l, m.
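The two distances in this claim have the standard cross-product forms, reconstructed here from the definitions in the claim; the coordinate arrays are assumed to be 3-D.

```python
import numpy as np

def point_to_line(pi, pj, pl):
    """d_L: distance from point i to the line through j and l --
    twice the triangle area divided by the base length."""
    return np.linalg.norm(np.cross(pi - pj, pi - pl)) / np.linalg.norm(pj - pl)

def point_to_plane(pi, pj, pl, pm):
    """d_p: distance from point i to the plane through j, l, m --
    projection of (p_i - p_j) onto the plane normal."""
    n = np.cross(pj - pl, pj - pm)
    return abs(np.dot(pi - pj, n)) / np.linalg.norm(n)
```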
7. The laser scanning mapping method based on the adaptive combination of the CPD and ICP algorithm according to claim 1, wherein calculating the scene feature variable and the surface curvature at the first iteration of the inter-frame matching, and determining whether to perform the coarse registration of the point cloud by using the CPD algorithm according to the scene feature variable and the surface curvature comprises:
calculating scene characteristic variables, wherein the expression is as follows:
wherein SFV is the scene characteristic variable, K_line denotes the laser radar beam used, F_n is the number of matches between the characteristic point cloud data of the current frame and the characteristic point cloud data of the previous frame, and PC_n is the number of points in the frame of point cloud data;
comparing the scene characteristic variable with the threshold V0 to obtain a comparison result;
the comparison result is verified by calculating the surface curvature, and the expression is as follows:
C = (1/n) Σ_{i=1}^{n} (p_i - p̄)(p_i - p̄)^T  (p_i being the nearest neighbor points and p̄ their mean)

λ_1, λ_2 = λ(C)

G = λ_1 · λ_2
wherein C is the covariance matrix of the characteristic point cloud data of the current frame, N is the normal direction vector of the characteristic point cloud data of the current frame, n is the number of nearest neighbor points of the characteristic point cloud data of the current frame, λ_1 and λ_2 are eigenvalues of the covariance matrix, and G is the surface curvature;
comparing the surface curvature with the threshold G0:
if the scene characteristic variable is smaller than the threshold V0 and the surface curvature is greater than the threshold G0, the scene is unstructured and the CPD algorithm is introduced to perform coarse registration; in all other cases the scene is structured and the CPD algorithm is not introduced for coarse registration.
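The eigenvalue test in this claim can be sketched as follows. The claim does not fully specify which two eigenvalues of the 3-D covariance are multiplied; this sketch takes the two smallest, which makes G near zero on planar (structured) patches and larger in cluttered (unstructured) ones. That choice, and the helper names, are interpretations rather than the patent's exact rule.

```python
import numpy as np

def surface_curvature(neighborhood):
    """G = product of the two smallest eigenvalues of the local covariance
    of a point's neighborhood (array of shape (n, 3)). Near zero for planes."""
    centered = neighborhood - neighborhood.mean(axis=0)
    C = centered.T @ centered / len(neighborhood)   # covariance matrix
    lam = np.linalg.eigvalsh(C)                     # eigenvalues, ascending
    return lam[0] * lam[1]

def is_unstructured(sfv, G, v0, g0):
    """Claim 7's decision: introduce CPD coarse registration only when the
    scene characteristic variable is below V0 AND the curvature exceeds G0."""
    return sfv < v0 and G > g0
```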
8. The laser scanning mapping method based on the adaptive combination of the CPD and ICP algorithm according to claim 7, wherein the method for performing coarse registration by the CPD algorithm comprises:
let the characteristic point cloud data point set of the previous frame be X = {x_1, x_2, x_3, …, x_N} and the characteristic point cloud data point set of the current frame be Y = {y_1, y_2, y_3, …, y_M}, with the transformation relation between the two point sets X = T(Y, θ); the characteristic point cloud data point set of the current frame is taken as the sub-models of a Gaussian mixture model, and the data points in the characteristic point cloud data point set of the previous frame are regarded as data points generated by those sub-models, the probability density function of the Gaussian mixture model being:

p(x) = Σ_{m=1}^{M} (1/M) p(x|m),  with  p(x|m) = (2πσ²)^(-D/2) exp(-‖x - y_m‖² / (2σ²))
wherein p(x) is the probability distribution function of the characteristic point cloud data point set of the previous frame, p(x|m) is the conditional probability density, x is a data point in the characteristic point cloud data point set of the previous frame, m denotes the m-th data point in the characteristic point cloud data point set of the current frame, M is the number of data points in the characteristic point cloud data point set of the current frame, ‖x - y_m‖ is the distance between a data point in the characteristic point cloud data point set of the current frame and a data point in the characteristic point cloud data point set of the previous frame, σ² is the covariance value, D is the dimension of the point set, and exp denotes the exponential function;
the posterior probability distribution is calculated from the distribution model parameters, with the expression:

p_old(m|x_n) = exp(-‖x_n - τ(y_m, θ_old)‖² / (2σ²)) / ( Σ_{k=1}^{M} exp(-‖x_n - τ(y_k, θ_old)‖² / (2σ²)) + (2πσ²)^(D/2) · ωM / ((1-ω)N) )
wherein p_old(m|x_n) is the posterior probability distribution function, x_n is the n-th data point in the characteristic point cloud data point set of the previous frame, τ(y_m, θ_old) is the affine transformation function, θ_old are the transformation parameters of the previous iteration, y_m is the m-th data point in the characteristic point cloud data point set of the current frame, ω is the weight of the uniform distribution of the point clouds, M is the number of data points in the characteristic point cloud data point set of the current frame, and N is the number of data points in the characteristic point cloud data point set of the previous frame;
the model parameters (θ, σ²) are obtained by minimizing the loss function, with the expression:

cost(θ, σ²) = (1/(2σ²)) Σ_{n=1}^{N} Σ_{m=1}^{M} p_old(m|x_n) ‖x_n - τ(y_m, θ)‖² + (N_P · D / 2) ln σ²,  with  N_P = Σ_{n=1}^{N} Σ_{m=1}^{M} p_old(m|x_n)
wherein cost(θ, σ²) is the loss function to be minimized;
and applying the model parameters to the characteristic point cloud data point set of the current frame to realize coarse registration.
9. The laser scanning mapping method based on the adaptive combination of CPD and ICP algorithms according to claim 1, wherein registering the point cloud with the ICP algorithm at the second iteration of inter-frame matching comprises:
unifying the point-to-line shortest-distance function and the point-to-plane shortest-distance function into a single constraint, with the expression:
loss(R, t) = Σ d_L + Σ d_p
wherein R and t represent the relative pose transformation between the characteristic point cloud data of the current frame and the characteristic point cloud data of the previous frame;
solving the increments of R and t by singular value decomposition to realize the point cloud registration of the ICP algorithm, with the expression:

(R, t) = argmin_{R,t} Σ_{n=1}^{F_n} loss_n(R, t)
wherein F_n is the number of matches between the characteristic point cloud data of the previous frame and the characteristic point cloud data of the current frame, and argmin denotes the values of the variables at which the objective function attains its minimum.
10. The laser scanning mapping system based on the self-adaptive combination CPD and ICP algorithm is characterized by comprising a processor and a storage medium;
the storage medium is used for storing instructions;
the processor being operative according to the instructions to perform the steps of the method according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311495896.0A CN117392268A (en) | 2023-11-10 | 2023-11-10 | Laser scanning mapping method and system based on self-adaption combined CPD and ICP algorithm |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117392268A true CN117392268A (en) | 2024-01-12 |
Family
ID=89440858
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311495896.0A Pending CN117392268A (en) | 2023-11-10 | 2023-11-10 | Laser scanning mapping method and system based on self-adaption combined CPD and ICP algorithm |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117392268A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117635896A (en) * | 2024-01-24 | 2024-03-01 | 吉林大学 | Point cloud splicing method based on automobile body point cloud motion prediction |
CN117635896B (en) * | 2024-01-24 | 2024-04-05 | 吉林大学 | Point cloud splicing method based on automobile body point cloud motion prediction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||