CN111932614B - Laser radar instant positioning and mapping method based on clustering center characteristics - Google Patents
Laser radar instant positioning and mapping method based on clustering center characteristics
- Publication number: CN111932614B (application CN202010638139.4A, CN202010638139A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06F18/2135 — Feature extraction based on approximation criteria, e.g. principal component analysis
- G06F18/22 — Matching criteria, e.g. proximity measures
- G06F18/23 — Clustering techniques
- G06T7/11 — Region-based segmentation
- G06T2207/10028 — Range image; depth image; 3D point clouds
- G06T2207/10032 — Satellite or aerial image; remote sensing
- G06T2207/10044 — Radar image
Abstract
The invention provides a laser radar instant positioning and mapping method based on clustering center characteristics, belonging to the field of autonomous positioning and navigation. The method first performs segmentation and clustering on the local environment point cloud acquired by the laser radar to obtain the local ground cluster and the local non-ground clusters in the point cloud; for each non-ground sub-cluster a cluster center is extracted to form a non-ground cluster center point set, and flat points are extracted from the ground cluster as plane points. Constraints are then established separately from the plane points, the non-ground cluster center point set and the non-ground cluster normal vectors to realize instant positioning of the laser radar and updating of the global map. The method reduces the interference of point cloud noise with positioning and mapping accuracy, does not depend on corner features, is generally applicable, can be used directly for instant positioning and mapping with a laser radar, and has broad application prospects in the fields of automatic driving, mobile robots and virtual reality.
Description
Technical Field
The invention belongs to the field of autonomous positioning and navigation, and particularly relates to a laser radar instant positioning and mapping method based on clustering center characteristics.
Background
Simultaneous localization and mapping (SLAM) is one of the most basic and important technologies in the field of positioning and navigation: a robot or other sensor carrier determines its own pose in the environment from the sensor's perception of the surroundings while simultaneously building a map of that environment. The laser radar is a high-precision, high-efficiency three-dimensional measuring device; owing to its high mapping accuracy, accurate scale information and insensitivity to ambient illumination, instant positioning and mapping based on the laser radar's perception data — the point cloud — is widely applied in all-weather automatic driving, high-precision three-dimensional surveying and mapping, and high-precision mobile robots. In 2014, Ji Zhang et al. proposed the well-known LOAM (LiDAR Odometry and Mapping) algorithm, which realizes laser radar real-time positioning and mapping by decomposing the problem into a high-frequency, low-precision lidar pose estimation problem and a low-frequency, high-precision lidar mapping problem. On this basis, later work further used camera images as an additional source of environment perception data and fused them with the laser radar point cloud for real-time positioning and mapping. In 2018, Tixiao Shan et al. improved on the LOAM algorithm and proposed a lightweight, ground-optimized laser radar instant positioning and mapping method, LeGO-LOAM. Xingliang Ji et al. added loop closure detection to LOAM in 2019 and proposed the LLOAM algorithm to further improve positioning and mapping accuracy.
The above methods use corner points and flat points in the point cloud data to establish data association and distance constraints during laser radar instant positioning and mapping, thereby realizing positioning and mapping. However, detecting and extracting corner points in the point cloud requires computing flatness or curvature, a calculation that is strongly disturbed by noise; missed and false corner detections then produce wrong data associations and degrade positioning and mapping accuracy.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a laser radar instant positioning and mapping method based on clustering center characteristics. The method reduces the interference of point cloud noise with positioning and mapping accuracy, does not depend on corner features, is generally applicable, can be used directly for instant positioning and mapping with a laser radar, and has broad application prospects in the fields of automatic driving, mobile robots and virtual reality.
The invention provides a laser radar instant positioning and mapping method based on clustering center characteristics, which is characterized by comprising the following steps of:
1) Define the laser radar body coordinate system and the world coordinate system as follows: denote the laser radar body coordinate system as {L}, with the X axis pointing in the motion direction of the laser radar, the Y axis pointing to the left, and the Z axis pointing upward, so that the XOY plane of the laser radar body coordinate system is parallel to the local ground; the world coordinate system coincides with the laser radar body coordinate system at the start time of the 1st frame;
2) Let the state quantity of the laser radar be Ψ = {t_x, t_y, t_z, θ_pitch, θ_yaw, θ_roll}, where {t_x, t_y, t_z} is the laser radar displacement: t_x along the X axis of the world coordinate system, t_y along the Y axis, and t_z along the Z axis; {θ_pitch, θ_yaw, θ_roll} is the laser radar rotation: θ_pitch about the X axis of the world coordinate system, θ_yaw about the Y axis, and θ_roll about the Z axis;
3) Obtain the environmental local point cloud image P_k, k = 1, 2, ..., with the laser radar, where P_k denotes the k-th frame image acquired by the laser radar; denote the pose estimation result of the laser radar at the end time of the k-th frame as T_k. Take P_k as the current frame; let k = 1, set T_1 to the identity as the current T_k, and set Ψ = {0, 0, 0, 0, 0, 0} as the current Ψ;
4) Use the breadth-first search algorithm (BFS) to perform cluster segmentation on the environmental local point cloud of the current frame P_k, dividing the segmented point cloud into the k-th frame local ground cluster G_k and the k-th frame local non-ground cluster NG_k. The specific steps are:
4-1) Project P_k onto a corresponding depth image I_k;
4-2) Set the threshold TH_G on the horizontal angle between the ground and the laser radar;
4-3) Traverse the depth image I_k with the BFS algorithm and compute the angle between each point and its adjacent points; if the angle is smaller than TH_G, the point corresponding in P_k is judged to be a ground point and is added to the k-th frame local ground cluster G_k; otherwise, the point corresponding in P_k is judged to be a non-ground point and is added to the k-th frame local non-ground cluster NG_k;
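The ground test of step 4-3) can be sketched as follows. This is an illustrative Python sketch only, not the patented implementation: the column layout, the function names and the simplified rule that classifies each point by the angle of the segment to its upper-ring neighbour are assumptions.

```python
import math

def ground_angle(p_low, p_high):
    """Angle (degrees) between the segment joining two vertically adjacent
    scan points and the horizontal plane; small angles suggest ground."""
    dx = p_high[0] - p_low[0]
    dy = p_high[1] - p_low[1]
    dz = p_high[2] - p_low[2]
    return math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))

def split_ground(columns, th_g=10.0):
    """columns: list of scan columns, each a list of (x, y, z) points ordered
    from bottom ring to top ring.  Returns (ground, non_ground) point lists."""
    ground, non_ground = [], []
    for col in columns:
        for low, high in zip(col, col[1:]):
            if ground_angle(low, high) < th_g:
                ground.append(low)
            else:
                non_ground.append(low)
        # the topmost point of a column is classified with its lower neighbour
        if len(col) >= 2:
            (ground if ground_angle(col[-2], col[-1]) < th_g
             else non_ground).append(col[-1])
    return ground, non_ground
```

On a column containing a flat stretch of road followed by a vertical pole, the road points fall below the angle threshold and the pole points exceed it.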
5) Use BFS to perform cluster segmentation on all the points in NG_k. The specific steps are:
5-1) Set the clustering angle threshold TH_NG;
5-2) Traverse with the BFS algorithm the points of the depth image I_k that belong to the k-th frame local non-ground cluster NG_k, computing the angle between each traversed point and its adjacent points, where an adjacent point is the point in I_k corresponding to a point of NG_k; if the angle is larger than TH_NG, the traversed point and the adjacent point are judged to belong to the same sub-cluster;
5-3) After the traversal is finished, NG_k is divided into L local non-ground sub-clusters NG_k^1, ..., NG_k^L, where L is the total number of sub-clusters;
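Steps 5-1) to 5-3) amount to connected-component labelling of the depth image under an angle criterion. The following is a minimal BFS sketch under stated assumptions: the grid representation, the generic `same_cluster` predicate (standing in for the TH_NG angle test) and the horizontal wrap-around are illustrative choices, not the patent's exact procedure.

```python
from collections import deque

def bfs_clusters(depth, same_cluster, invalid=0.0):
    """Cluster the cells of a depth image by breadth-first search.
    depth: 2D list of depth values; same_cluster(d1, d2) is the criterion
    deciding whether two adjacent cells belong to the same sub-cluster."""
    rows, cols = len(depth), len(depth[0])
    label = [[-1] * cols for _ in range(rows)]
    n = 0
    for r0 in range(rows):
        for c0 in range(cols):
            if depth[r0][c0] == invalid or label[r0][c0] != -1:
                continue
            label[r0][c0] = n
            q = deque([(r0, c0)])
            while q:
                r, c = q.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, (c + dc) % cols  # columns wrap: 360° scan
                    if 0 <= rr < rows and label[rr][cc] == -1 \
                            and depth[rr][cc] != invalid \
                            and same_cluster(depth[r][c], depth[rr][cc]):
                        label[rr][cc] = n
                        q.append((rr, cc))
            n += 1
    return label, n
```

With a depth-difference predicate, two depth plateaus separate into two sub-clusters.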
6) Let the scan start time of the k-th frame be t_{k-1} and its end time be t_k. According to the current T_k, for each point i ∈ P_k in the k-th frame point cloud the pose estimate T_{(k,i)} of the point is obtained as shown in formula (1):

T_{(k,i)} = ((t_{(k,i)} − t_{k-1}) / (t_k − t_{k-1})) · T_k,   (1)

where t_{(k,i)} ∈ [t_{k-1}, t_k] is the timestamp of the point;
If the current frame P_k is frame 1, use T_1 to calculate T_{(1,i)} for each point of P_1 according to formula (1); then, according to the calculated T_{(1,i)}, convert each point of P_1 into the world coordinate system to form a global map as the current global map M; then let k = k + 1, take T_1 as the current T_k of frame 2, and return to step 4). If the current frame P_k is not frame 1, go to step 7);
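The per-point pose estimate of formula (1) can be illustrated as linear interpolation of the six state parameters over the sweep. This sketch assumes constant-velocity motion within a frame and a state vector Ψ expressed as a 6-tuple; both are assumptions for illustration.

```python
def interpolate_pose(psi_k, t, t_start, t_end):
    """Linearly interpolate the end-of-frame state Psi = (tx, ty, tz,
    pitch, yaw, roll) to a point's timestamp t within [t_start, t_end],
    assuming constant velocity over the sweep."""
    s = (t - t_start) / (t_end - t_start)
    return tuple(s * v for v in psi_k)
```

A point stamped halfway through the sweep receives half of the end-of-frame motion.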
7) Extract the k-th frame flat feature point set H_k from the ground cluster G_k; use each point of H_k as a plane point and establish a distance constraint from the plane point to its corresponding projection plane. The specific steps are:
7-1) Set a plane threshold TH_p and calculate the flatness S_i of each point of G_k using formula (2):

S_i = || Σ_{j ∈ Nb_i} (X_i − X_j) || / (|Nb_i| · ||X_i||),   (2)

where Nb_i is a local neighborhood of the i-th point of G_k, X_i is the coordinate vector of the i-th point of G_k, and X_j is the coordinate vector of the j-th point inside the neighborhood Nb_i.

Then judge S_i: if S_i is less than the threshold TH_p, the i-th point of G_k is a flat feature point; all flat feature points of G_k form the k-th frame flat feature point set H_k;
7-2) Project each point of H_k, according to the T_{(k,i)} corresponding to that point, into the laser radar coordinate system at the scan start time of the k-th frame; after the projection is finished, the points corresponding to H_k in this coordinate system form the k-th frame start plane point set, denoted H_k^s;
7-3) Project each point of the (k−1)-th frame flat feature point set H_{k−1}, according to the T_{(k−1,i)} corresponding to that point, into the laser radar coordinate system at the scan end time of the (k−1)-th frame; after the projection is finished, the points corresponding to H_{k−1} in this coordinate system form the (k−1)-th frame end plane point set, denoted H_{k−1}^e;
7-4) From H_{k−1}^e, search for the three nearest, non-collinear points X_a, X_b, X_c to each point X_i of H_k^s; X_a, X_b, X_c form the projection plane corresponding to X_i, and the point-to-plane distance constraint of formula (3) is established:

d_i^H = |(X_i − X_a) · ((X_a − X_b) × (X_a − X_c))| / ||(X_a − X_b) × (X_a − X_c)||,   (3)

where d_i^H is the distance constraint between the i-th point of H_k^s and its corresponding projection plane;
8) Extract the cluster center point set Ct_k from NG_k and establish cluster-center point-to-point distance constraints. The specific steps are:
8-1) Project each point of NG_k, according to the T_{(k,i)} corresponding to that point, into the laser radar coordinate system at the scan start time of the k-th frame; after the projection is finished, the points corresponding to NG_k in this coordinate system form the k-th frame start local non-ground cluster, denoted NG_k^s;
8-2) Extract the k-th frame start local non-ground cluster center point set according to formula (4):

c_i^k = (1 / N_i) Σ_{X ∈ NG_{k,i}^s} X,   (4)

where c_i^k is the center point coordinate of the i-th start local non-ground sub-cluster NG_{k,i}^s, X is a point inside NG_{k,i}^s, and N_i is the number of points inside NG_{k,i}^s; all c_i^k form the k-th frame start local non-ground cluster center point set, denoted Ct_k^s;
8-3) Project each point of the (k−1)-th frame non-ground cluster center point set Ct_{k−1}, according to the T_{(k−1,i)} corresponding to that point, into the laser radar coordinate system at the scan end time of the (k−1)-th frame; after the projection is finished, the corresponding points form the (k−1)-th frame end local non-ground cluster; repeating step 8-2) yields the (k−1)-th frame end local non-ground cluster center point set, denoted Ct_{k−1}^e;
8-4) From Ct_{k−1}^e, search for the point ĉ_i^{k−1} closest to each point c_i^k of Ct_k^s; establish the point-to-point data association between c_i^k and ĉ_i^{k−1}, and establish the point-to-point distance constraint according to formula (5):

d_i^C = || c_i^k − ĉ_i^{k−1} ||;   (5)
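Steps 8-2) and 8-4) — centroid extraction (formula (4)) and nearest-center association with the point-to-point residual (formula (5)) — can be sketched together. The brute-force nearest-neighbour search below stands in for whatever search structure an implementation would use; names are assumptions.

```python
def centroid(cluster):
    """Center point of a sub-cluster: the mean of its points."""
    n = len(cluster)
    return tuple(sum(p[d] for p in cluster) / n for d in range(3))

def associate_centers(curr, prev):
    """For each current-frame center, find the closest previous-frame
    center and the point-to-point distance residual."""
    def dist(p, q):
        return sum((p[d] - q[d]) ** 2 for d in range(3)) ** 0.5
    pairs = []
    for c in curr:
        j = min(range(len(prev)), key=lambda j: dist(c, prev[j]))
        pairs.append((c, prev[j], dist(c, prev[j])))
    return pairs
```

Averaging over a whole sub-cluster is what gives the center point its noise-suppressing property.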
9) Extract the normal vector n_i^k of each sub-cluster of NG_k^s and establish the sub-cluster center normal vector rotation constraint. The specific steps are:
9-1) Perform principal component analysis on each sub-cluster NG_{k,i}^s, taking the principal component direction of the sub-cluster as its center normal vector n_i^k; perform principal component analysis on each sub-cluster NG_{k−1,i}^e, taking the principal component direction of the sub-cluster as its center normal vector n_i^{k−1};
9-2) According to the point-to-point data association between c_i^k and ĉ_i^{k−1} obtained in step 8-4), establish the sub-cluster center normal vector angle constraint, as shown in formula (6):

δθ_i = arccos( (n_i^k · n_i^{k−1}) / (||n_i^k|| · ||n_i^{k−1}||) ),   (6)

where δθ_i is the normal vector angle constraint of the i-th sub-cluster center of the current frame;
10) Iteratively optimize the constraints established in steps 7)-9) with a nonlinear optimization method, updating the current Ψ and the current T_k to realize the positioning of the laser radar for the current frame.
If the iteration has not converged, return to step 6) and update T_{(k,i)} with the updated current T_k; if the iteration has converged, go to step 11);
11) Using the current Ψ and current T_k updated in step 10), convert the k-th frame environmental local point cloud into the world coordinate system, and perform iterative closest point (ICP) optimization between the k-th frame local non-ground cluster projected into the world coordinate system and the current global map, to further optimize the current Ψ and current T_k and update the current global map. The specific steps are:
11-1) Project each point X of the k-th frame local non-ground cluster NG_k, according to the current T_{(k,i)} corresponding to that point as updated in step 10), into the world coordinate system, and denote the projected point X^W, i.e. X^W = T_{(k,i)} · X; after the projection is finished, all the projected points form the k-th frame global non-ground point cluster, denoted NG_k^W;
11-2) Perform the ICP operation between the non-ground points in the current global map M and NG_k^W to further optimize the current Ψ, obtaining the final laser radar pose state quantity Ψ and the final pose matrix T_k corresponding to the current frame;
11-3) Project each point of the k-th frame local non-ground cluster NG_k, according to the final T_k of step 11-2), into the world coordinate system to update NG_k^W, and perform voxel-filter-based point fusion between the updated NG_k^W and the non-ground points in the current global map M;
11-4) Project each point of the k-th frame local ground cluster G_k, according to the final T_k of step 11-2), into the world coordinate system; after the projection is finished, all projected points form the current frame global ground point cluster, denoted G_k^W; perform voxel-filter-based point fusion between G_k^W and the ground points in the current global map M, completing the update of the current global map M;
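The voxel-filter-based fusion of steps 11-3) and 11-4) can be sketched as averaging all points that fall into the same voxel cell, keeping one representative per cell. The cell-hashing scheme and function name are assumptions for illustration.

```python
def voxel_fuse(points, voxel=0.2):
    """Voxel-filter fusion: bucket points into cubic cells of side `voxel`
    and replace each cell's contents with their mean point."""
    cells = {}
    for p in points:
        key = tuple(int(c // voxel) for c in p)  # integer voxel index
        cells.setdefault(key, []).append(p)
    return [tuple(sum(q[d] for q in ps) / len(ps) for d in range(3))
            for ps in cells.values()]
```

Two near-duplicate map points collapse into one fused point, while a distant point survives unchanged, which keeps the global map's density bounded as frames accumulate.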
12) When the laser radar acquires a new frame of environmental local point cloud, let k = k + 1, take the final Ψ and the final T_k obtained in step 11-2) as the new current Ψ and current T_k respectively, and return to step 4).
The invention has the characteristics and beneficial effects that:
the invention distinguishes the ground points and various non-ground points in the point cloud by clustering and dividing the laser radar point cloud, further, establishes point-to-point distance constraint by directly using the center point of the non-ground point cluster, realizes the laser radar instant positioning and mapping without angular point extraction, reduces the noise interference by the average characteristic of the cluster center, and avoids the error angular point extraction and the following error data association caused by the noise interference because of no need of extracting the angular point. The method can be directly used for carrying out instant positioning and mapping scene through the laser radar, and has a wide application prospect in the fields of automatic driving, mobile robots and virtual reality.
Drawings
FIG. 1 is an overall flow diagram of the method of the present invention.
Detailed Description
The invention provides a laser radar instant positioning and mapping method based on clustering center characteristics, which is further described in detail below in conjunction with the accompanying drawing and a specific embodiment.
In this embodiment of the laser radar instant positioning and mapping method based on clustering center characteristics, a Velodyne VLP-16 laser radar is used as the environment perception sensor. At an acquisition frequency of 10 Hz, this laser radar has a 360° horizontal field of view with 0.2° horizontal resolution and a 30° vertical field of view with 2° vertical resolution, giving 16 laser scan lines in the vertical direction. The laser radar data consist of environmental sampling points and their depth values from the laser radar center; a depth value can be converted into a three-dimensional coordinate {X, Y, Z} in the laser radar coordinate system according to the horizontal and vertical angles between the sampling point and the laser radar. The overall flow of the method is shown in Fig. 1 and comprises the following steps:
1) Defining a laser radar body coordinate system and a world coordinate system as follows: the coordinate system of the laser radar body is recorded as { L }, the X axis points to the movement direction of the laser radar, the Y axis points to the left side, and the Z axis points to the upper side, so that the XOY plane of the laser radar body coordinate system is parallel to the local ground where the XOY plane is located. The world coordinate system coincides with the lidar body coordinate system at the start time of frame 1.
2) The state quantity estimated in the laser radar real-time positioning and mapping process is Ψ = {t_x, t_y, t_z, θ_pitch, θ_yaw, θ_roll}, where {t_x, t_y, t_z} is the laser radar displacement: t_x along the X axis of the world coordinate system, t_y along the Y axis, and t_z along the Z axis; {θ_pitch, θ_yaw, θ_roll} is the laser radar rotation: θ_pitch about the X axis of the world coordinate system, θ_yaw about the Y axis, and θ_roll about the Z axis.
3) Obtain the environmental local point cloud image P_k (k = 1, 2, ...) with the laser radar, where P_k denotes the k-th frame image acquired by the laser radar and k is the frame sequence number; take P_k as the current frame and denote the pose estimation result of the laser radar at the end time of the k-th frame as T_k. Let k = 1, set T_1 to the identity as the current T_k, and set Ψ = {0, 0, 0, 0, 0, 0} as the current Ψ;
4) Apply the Breadth First Search algorithm (BFS) to perform cluster segmentation on the environmental local point cloud of the current frame P_k, dividing the segmented point cloud into the k-th frame local ground cluster G_k and the k-th frame local non-ground cluster NG_k. The specific steps are:
4-1) Project the k-th frame environmental local point cloud P_k onto a corresponding depth image I_k;
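Step 4-1) can be illustrated as a spherical projection using the VLP-16 geometry quoted above (0.2° horizontal resolution, 2° vertical resolution, 30° vertical field of view, 16 rings). The rounding scheme and row/column conventions below are illustrative assumptions, not the patent's specification.

```python
import math

def project_to_depth_image(points, h_res=0.2, v_res=2.0, v_fov=30.0, rings=16):
    """Project lidar points (x, y, z) onto a rings x (360/h_res) depth image:
    azimuth selects the column, elevation selects the ring (row)."""
    cols = int(360.0 / h_res)
    img = [[0.0] * cols for _ in range(rings)]
    for x, y, z in points:
        depth = math.sqrt(x * x + y * y + z * z)
        yaw = math.degrees(math.atan2(y, x)) % 360.0
        pitch = math.degrees(math.atan2(z, math.sqrt(x * x + y * y)))
        row = int(round((pitch + v_fov / 2.0) / v_res))  # -15 deg -> ring 0
        col = int(yaw / h_res) % cols
        if 0 <= row < rings:
            img[row][col] = depth
    return img
```

A point straight ahead at 10 m on the horizontal plane lands in column 0 of a 16 x 1800 image.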
4-2) Set the threshold TH_G on the horizontal angle between the ground and the laser radar according to the laser radar mounting angle (in this example the laser radar is parallel to the ground, so TH_G = 10° is set);
4-3) Traverse the depth image I_k with the BFS algorithm and compute the angle between each point and its adjacent points; if the angle is smaller than TH_G, the point corresponding in P_k is judged to be a ground point and is added to the k-th frame local ground cluster G_k; otherwise, the point corresponding in P_k is judged to be a non-ground point and is added to the k-th frame local non-ground cluster NG_k;
5) Use BFS to perform cluster segmentation on all the points of the k-th frame local non-ground cluster NG_k. The specific steps are:
5-1) Set the clustering angle threshold TH_NG according to the laser point cloud density and noise level (in this example TH_NG = 60°);
5-2) Traverse with the BFS algorithm the points of the depth image I_k that belong to the k-th frame local non-ground cluster NG_k, computing the angle between each traversed point and its adjacent points, where an adjacent point is the point in I_k corresponding to a point of NG_k (if an adjacent point does not correspond to any point of NG_k, it is not computed); if the angle is larger than TH_NG, the traversed point and the adjacent point are judged to belong to the same sub-cluster;
5-3) After the traversal, the k-th frame local non-ground cluster NG_k is divided into L local non-ground sub-clusters NG_k^1, ..., NG_k^L, where L is the total number of sub-clusters; its value is related to TH_NG and to the number of non-ground objects present in the k-th frame point cloud P_k;
6) Let the scan start time of the k-th frame be t_{k-1} and its end time be t_k. According to the current T_k, for each point i ∈ P_k in the point cloud obtained by the k-th frame scan, the pose estimate T_{(k,i)} of the point is obtained as shown in formula (1), where t_{(k,i)} ∈ [t_{k-1}, t_k] is the timestamp of the point.
If the current frame is frame 1, use T_1 to calculate T_{(1,i)} for each point of P_1 according to formula (1); then, according to the calculated T_{(1,i)}, convert each point of P_1 into the world coordinate system to form a global map as the current global map M; then let k = k + 1, take T_1 as the current T_k of frame 2 (i.e., the current T_k at the start time of frame 2), and return to step 4). If the current frame is not frame 1, go to step 7).
7) Extract the k-th frame flat feature point set H_k from the local ground cluster G_k; use each point of H_k as a plane point and establish the distance constraint from the plane point to its corresponding projection plane. The specific steps are:
7-1) Set the plane threshold TH_p according to the laser point cloud density and noise level (in this example TH_p = 0.1). Calculate the flatness S_i of each point of the ground cluster G_k using formula (2), where Nb_i is a local neighborhood of the i-th point (the local neighborhood may be formed by the m points nearest in Euclidean distance to the point in question, or by the points inside a sphere of radius r centered at that point; in this example it is formed by the 10 nearest points in Euclidean distance), X_i is the coordinate vector of the i-th point of G_k, and X_j is the coordinate vector of the j-th point inside the neighborhood Nb_i.

Then judge S_i: if S_i is less than the threshold TH_p, the point of G_k corresponding to S_i is a flat feature point; all flat feature points of G_k form the k-th frame flat feature point set H_k;
7-2) Project each point of H_k, according to the T_{(k,i)} corresponding to that point, into the laser radar coordinate system at the scan start time of the k-th frame; after the projection is finished, the corresponding points form the k-th frame start plane point set, denoted H_k^s;
7-3) Project each point of the previous ((k−1)-th) frame flat feature point set H_{k−1}, according to the T_{(k−1,i)} corresponding to that point, into the laser radar coordinate system at the scan end time of the previous frame; after the projection is finished, the corresponding points form the (k−1)-th frame end plane point set, denoted H_{k−1}^e;
7-4) From H_{k−1}^e, search for the three nearest, non-collinear points X_a, X_b, X_c to each point X_i of H_k^s; X_a, X_b, X_c form the projection plane corresponding to X_i, and the point-to-plane distance constraint of formula (3) is established, where d_i^H is the distance constraint between the i-th point of H_k^s and its corresponding projection plane;
8) Extract the cluster center point set Ct_k from the local non-ground cluster NG_k and establish cluster-center point-to-point distance constraints. The specific steps are:
8-1) Project each point of the local non-ground cluster NG_k, according to the T_{(k,i)} corresponding to that point, into the laser radar coordinate system at the scan start time of the k-th frame; after the projection is finished, the corresponding points form the k-th frame start local non-ground cluster, denoted NG_k^s;
8-2) Extract the k-th frame start local non-ground cluster center point set according to formula (4), where the subscript i is the start local non-ground sub-cluster index, c_i^k is the center point coordinate of the i-th start local non-ground sub-cluster NG_{k,i}^s, X is a point inside NG_{k,i}^s, and N_i is the number of points inside NG_{k,i}^s; all c_i^k form the k-th frame start local non-ground cluster center point set, denoted Ct_k^s;
8-3) Project each point of the previous ((k−1)-th) frame non-ground cluster center point set Ct_{k−1}, according to the T_{(k−1,i)} corresponding to that point, into the laser radar coordinate system at the scan end time of the previous frame; after the projection is finished, the corresponding points form the (k−1)-th frame end local non-ground cluster; repeating step 8-2) yields the (k−1)-th frame end local non-ground cluster center point set, denoted Ct_{k−1}^e;
8-4) From Ct_{k−1}^e, search for the point ĉ_i^{k−1} closest to each point c_i^k of Ct_k^s; establish the point-to-point data association between c_i^k and ĉ_i^{k−1}, and establish the point-to-point distance constraint according to formula (5):
9) Extract the normal vector of each sub-cluster of the k-th frame starting local non-ground cluster and establish a sub-cluster center normal vector rotation constraint, with the following specific steps:
9-1) Perform Principal Component Analysis (PCA) on each starting local non-ground sub-cluster of the k-th frame and take the principal component direction of the sub-cluster as its center normal vector n_i; likewise, perform PCA on each terminating local non-ground sub-cluster of the (k-1)-th frame and take its principal component direction as its center normal vector n̂_i;
9-2) Using the point-to-point data association obtained in step 8-4), establish the sub-cluster center normal vector included-angle constraint, as shown in formula (6):
θ_i = arccos( (n_i · n̂_i) / (||n_i|| · ||n̂_i||) )   (6)
wherein θ_i is the normal vector included-angle constraint of the i-th sub-cluster center of the current frame.
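The PCA of step 9-1) can be sketched as an eigen-decomposition of each sub-cluster's covariance matrix; the patent takes the principal component direction (largest eigenvalue) as the center normal vector. The names below are illustrative:

```python
import numpy as np

def principal_direction(points):
    """PCA of a sub-cluster: the eigenvector of the covariance matrix with
    the largest eigenvalue, used as the sub-cluster center normal vector."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, -1]                   # direction of largest variance

def included_angle(n1, n2):
    """Included angle between two center normal vectors, as in formula (6)."""
    c = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

# A sub-cluster stretched along the X axis: its principal direction is ±X.
cluster = [[0, 0, 0], [1, 0.01, 0], [2, -0.01, 0], [3, 0, 0.01], [4, 0, 0]]
n = principal_direction(cluster)
```

The `np.clip` guards `arccos` against floating-point values just outside [-1, 1].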
10) Iteratively optimize the constraints established in steps 7)-9) by a non-linear optimization method, updating the current Ψ and the current pose estimate, realizing the localization of the laser radar for the current frame.
If the iteration has not converged, return to step 6) and use the updated current pose estimate to update the per-point poses; if the iteration has converged, enter step 11).
11) Using the current Ψ and current pose estimate updated in step 10), convert the environment local point cloud of the k-th frame into the world coordinate system, and perform Iterative Closest Point (ICP) optimization between the k-th frame local non-ground cluster projected into the world coordinate system and the current global map, to further optimize the current Ψ and pose and update the current global map; the specific steps are as follows:
11-1) Project each point of the k-th frame local non-ground cluster, according to the current pose updated in step 10), into the world coordinate system; after projection, all the projected points form the k-th frame global non-ground point cluster;
11-2) Perform an ICP operation between the non-ground points in the current global map M and the k-th frame global non-ground point cluster to further optimize the current pose, obtaining the final laser radar pose state quantity Ψ and the final pose matrix corresponding to the current frame after map optimization;
11-3) Project each point of the k-th frame local non-ground cluster, according to the final pose of step 11-2), into the world coordinate system to update the global non-ground point cluster, and perform voxel-filtering-based point fusion between the updated cluster and the non-ground points in the current global map M;
11-4) Project each point of the k-th frame local ground cluster, according to the final pose of step 11-2), into the world coordinate system; after projection, the projected points form the current frame global ground point cluster; perform voxel-filtering-based point fusion between it and the ground points in the current global map M, completing the update of the current global map M.
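The voxel-filtering-based point fusion of steps 11-3) and 11-4) can be sketched as: merge the newly projected points into the map, bucket points by integer voxel index, and keep one centroid per occupied voxel. The 0.2 m voxel size and the function name are illustrative assumptions:

```python
import numpy as np

def voxel_fuse(map_points, new_points, voxel=0.2):
    """Fuse newly projected points into the map, keeping one point
    (the centroid) per occupied voxel, as a simple voxel filter."""
    pts = np.vstack([np.asarray(map_points, float), np.asarray(new_points, float)])
    keys = np.floor(pts / voxel).astype(np.int64)  # integer voxel index per point
    buckets = {}
    for key, p in zip(map(tuple, keys), pts):
        buckets.setdefault(key, []).append(p)      # group points by voxel
    return np.array([np.mean(b, axis=0) for b in buckets.values()])

# Two points fall in the same voxel and collapse to one centroid.
fused = voxel_fuse([[0.0, 0.0, 0.0]], [[0.01, 0.0, 0.0], [1.0, 1.0, 1.0]])
```

Bounding map growth this way is what keeps the repeated fusion of steps 11-3) and 11-4) from inflating the global map without limit.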
Claims (1)
1. A laser radar instant positioning and mapping method based on clustering center features is characterized by comprising the following steps:
1) Define the laser radar body coordinate system and the world coordinate system as follows: denote the laser radar body coordinate system as {L}, with the X axis pointing in the direction of motion of the laser radar, the Y axis pointing to the left, and the Z axis pointing upward; the XOY plane of the laser radar body coordinate system is parallel to the local ground; the world coordinate system coincides with the laser radar body coordinate system at the start time of frame 1;
2) Let the state quantity of the laser radar be Ψ = {t_x, t_y, t_z, θ_pitch, θ_yaw, θ_roll}, wherein {t_x, t_y, t_z} is the laser radar displacement: t_x along the X axis of the world coordinate system, t_y along the Y axis, and t_z along the Z axis; {θ_pitch, θ_yaw, θ_roll} is the laser radar rotation: θ_pitch about the X axis of the world coordinate system, θ_yaw about the Y axis, and θ_roll about the Z axis;
3) Acquire environment local point cloud frames P_k, k = 1, 2, ..., with the laser radar, wherein P_k denotes the k-th frame acquired by the laser radar, and the pose estimation result of the laser radar at the end time of the k-th frame is taken as the current pose; take P_k as the current frame, let k = 1, set the initial pose as the current pose, and set Ψ = {0, 0, 0, 0, 0, 0} as the current Ψ;
4) Adopt the breadth-first search algorithm (BFS) to perform cluster segmentation on the environment local point cloud of the current frame P_k, dividing the segmented point cloud into the k-th frame local ground cluster and the k-th frame local non-ground cluster, with the following specific steps:
4-1) Project P_k onto the corresponding depth image I_k;
4-2) Set the threshold TH_G of the horizontal included angle between the ground and the laser radar;
4-3) Traverse the depth image I_k with the BFS algorithm and calculate the included angle between each point of I_k and its adjacent points; if the included angle is smaller than TH_G, the corresponding point in P_k is judged to be a ground point and added to the k-th frame local ground cluster; otherwise, the corresponding point in P_k is judged to be a non-ground point and added to the k-th frame local non-ground cluster;
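The ground test of step 4-3) compares the inclination of the segment between adjacent range-image points against the horizontal threshold TH_G. A sketch under that reading (the point format and the 10° default are assumptions, not values from the patent):

```python
import math

def is_ground(p_lower, p_upper, th_g_deg=10.0):
    """Angle between the segment joining two vertically adjacent range-image
    points and the horizontal plane; a near-horizontal segment means ground."""
    dx = p_upper[0] - p_lower[0]
    dy = p_upper[1] - p_lower[1]
    dz = p_upper[2] - p_lower[2]
    # Inclination of the segment above the XY (horizontal) plane, in degrees.
    angle = math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))
    return angle < th_g_deg
```

A nearly flat pair of points passes the test, while a vertical structure (e.g. a wall) fails it.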
5) Use BFS to cluster and segment all points of the local non-ground cluster, with the following specific steps:
5-1) Set the clustering included-angle threshold TH_NG;
5-2) Traverse the k-th frame local non-ground cluster over the depth image I_k with the BFS algorithm, calculating the included angle between each traversed point and its adjacent points, an adjacent point being the point in I_k corresponding to a neighbour of the traversed point; if the included angle is larger than TH_NG, the traversed point and the adjacent point are judged to belong to the same sub-cluster;
5-3) After the traversal, the local non-ground cluster is divided into L local non-ground sub-clusters, L being the total number of sub-clusters;
6) Let the scan of the k-th frame start at time t_{k-1} and end at time t_k; according to the current pose, compute for each point i ∈ P_k of the k-th frame point cloud the pose estimation result of that point, as shown in formula (1):
Ψ_(k,i) = ((t_(k,i) − t_{k-1}) / (t_k − t_{k-1})) · Ψ   (1)
wherein t_(k,i) ∈ [t_{k-1}, t_k] is the timestamp corresponding to the point;
if the current frame P_k is frame 1, compute the pose corresponding to each point of P_1 according to formula (1), convert each point of P_1 into the world coordinate system according to the computed poses to form the global map, take it as the current global map M, then set k = k + 1, take the resulting pose as the current pose of frame 2, and return to step 4); if the current frame P_k is not frame 1, enter step 7);
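Formula (1) in step 6) assigns each point a pose according to its timestamp within the scan. A linear interpolation of the six state components is the natural reading of this extract (the linearity itself is an assumption, since the formula image is not reproduced here):

```python
def interpolate_pose(psi, t_point, t_start, t_end):
    """Per-point pose: scale the frame's six pose components
    {t_x, t_y, t_z, theta_pitch, theta_yaw, theta_roll} by the
    fraction of the scan elapsed at the point's timestamp."""
    s = (t_point - t_start) / (t_end - t_start)  # s in [0, 1]
    return [s * v for v in psi]

# A point sampled halfway through the scan gets half the frame motion.
mid_pose = interpolate_pose([1.0, 2.0, 3.0, 0.0, 0.0, 0.2], 0.5, 0.0, 1.0)
```

This is the usual motion-compensation step: each point is de-skewed with its own pose before the projections of steps 7-2), 8-1), and 8-3).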
7) Extract the k-th frame flat feature point set H_k from the local ground cluster, take each point of H_k as a plane point, and establish distance constraints from each plane point to its corresponding projection plane; the specific steps are as follows:
7-1) Set the flatness threshold TH_p and calculate the flatness S_i of each point using formula (2):
S_i = || Σ_{j ∈ Nb_i} (x_i − x_j) || / (|Nb_i| · ||x_i||)   (2)
wherein Nb_i is a local neighborhood of the i-th point, x_i is the coordinate of the i-th point, and x_j is the coordinate of the j-th point in the neighborhood Nb_i;
Judge S_i: if S_i is smaller than the threshold TH_p, the i-th point is a flat feature point; all the flat feature points form the k-th frame flat feature point set H_k;
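The flatness score of step 7-1) sums coordinate differences over a point's neighborhood: on a locally planar patch the differences cancel and S_i is near zero. A sketch (the exact normalization used in the patent's formula image is assumed):

```python
import numpy as np

def flatness(points, i, neighborhood):
    """Smoothness S_i of point i: norm of the summed differences to its
    neighbors, normalized by neighborhood size and point range."""
    x_i = np.asarray(points[i], dtype=float)
    diff = sum(x_i - np.asarray(points[j], dtype=float) for j in neighborhood)
    return float(np.linalg.norm(diff) / (len(neighborhood) * np.linalg.norm(x_i)))

# Collinear points: the differences to the two neighbors cancel exactly.
pts = [[1.0, 0.0, 0.0], [2.0, 0.0, 0.0], [3.0, 0.0, 0.0]]
s = flatness(pts, 1, [0, 2])
```

Points with S_i below TH_p are kept as flat feature points; a corner or edge point would leave a large uncancelled residual instead.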
7-2) Project each point of H_k, using the pose corresponding to that point, into the laser radar coordinate system at the scanning start time of the k-th frame; after projection, the projected points form the k-th frame starting flat point set;
7-3) Project each point of the (k-1)-th frame flat feature point set H_{k-1}, using the pose corresponding to that point, into the laser radar coordinate system at the scanning end time of the (k-1)-th frame; after projection, the projected points form the (k-1)-th frame terminating flat point set;
7-4) From the terminating flat point set of the (k-1)-th frame, search for the three points that are nearest to each point in the starting flat point set of the k-th frame and are not collinear; the three points span the projection plane corresponding to that point, and formula (3) constrains the distance from each point to its corresponding projection plane:
d_i = |(x_i − p_1) · n_p|   (3)
wherein d_i is the distance constraint from the i-th point of the starting flat point set to the projection plane corresponding to that point, x_i is the i-th point, p_1, p_2, p_3 are the three searched points, and n_p is the unit normal of the plane they span;
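The constraint of step 7-4)/formula (3) is a standard point-to-plane distance, with the plane spanned by the three nearest non-collinear points. A minimal sketch:

```python
import numpy as np

def point_to_plane(x, p1, p2, p3):
    """Distance from point x to the plane through p1, p2, p3."""
    x, p1, p2, p3 = (np.asarray(v, dtype=float) for v in (x, p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)  # plane normal from the two edge vectors
    n = n / np.linalg.norm(n)       # requires the points to be non-collinear
    return float(abs(np.dot(x - p1, n)))

# Point 5 m above the z = 0 plane spanned by three points on that plane.
d = point_to_plane([0, 0, 5], [0, 0, 0], [1, 0, 0], [0, 1, 0])
```

The non-collinearity requirement in step 7-4) is exactly what keeps the cross product, and hence the normal, from degenerating to zero.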
8) Extract the cluster center point set Ct_k from the local non-ground cluster and establish center-point-to-point distance constraints, with the following specific steps:
8-1) Project each point of the k-th frame local non-ground cluster, using the pose corresponding to that point, into the laser radar coordinate system at the scanning start time of the k-th frame; after projection, the projected points form the k-th frame starting local non-ground cluster;
8-2) Extract the k-th frame starting local non-ground cluster center point set according to formula (4):
c_i = (1/N_i) · Σ_{x ∈ C_i} x   (4)
wherein c_i is the center point coordinate of the i-th starting local non-ground sub-cluster C_i, x runs over the points of C_i, and N_i is the number of points in C_i; all the c_i form the k-th frame starting local non-ground cluster center point set;
8-3) Project each point of the (k-1)-th frame local non-ground cluster, using the pose corresponding to that point, into the laser radar coordinate system at the scanning end time of the (k-1)-th frame; after projection, the projected points form the (k-1)-th frame terminating local non-ground cluster; repeat step 8-2) on it to obtain the (k-1)-th frame terminating local non-ground cluster center point set;
8-4) From the (k-1)-th frame terminating local non-ground cluster center point set, search for the point ĉ_i nearest to each point c_i of the k-th frame starting local non-ground cluster center point set, establishing a point-to-point data association between the two sets, and establish the point-to-point distance constraint according to formula (5):
r_i = ||c_i − ĉ_i||   (5)
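Formula (4) in step 8-2) is a per-sub-cluster centroid; a minimal sketch:

```python
import numpy as np

def cluster_centers(sub_clusters):
    """Center point c_i of each sub-cluster: the mean of its N_i points,
    i.e. c_i = (1/N_i) * sum of the points (formula (4))."""
    return [np.asarray(c, dtype=float).mean(axis=0) for c in sub_clusters]

# Two sub-clusters: one with two points, one with a single point.
centers = cluster_centers([[[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]],
                           [[1.0, 1.0, 1.0]]])
```

Matching centroids rather than raw points is what makes the association of step 8-4) cheap and robust to partial occlusion of a sub-cluster.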
9) Extract the normal vector of each sub-cluster of the k-th frame starting local non-ground cluster and establish a sub-cluster center normal vector rotation constraint, with the following specific steps:
9-1) Perform principal component analysis on each starting local non-ground sub-cluster of the k-th frame and take its principal component direction as the sub-cluster center normal vector n_i; perform principal component analysis on each terminating local non-ground sub-cluster of the (k-1)-th frame and take its principal component direction as the sub-cluster center normal vector n̂_i;
9-2) Using the point-to-point data association obtained in step 8-4), establish the sub-cluster center normal vector included-angle constraint, as shown in formula (6):
θ_i = arccos( (n_i · n̂_i) / (||n_i|| · ||n̂_i||) )   (6)
wherein θ_i is the normal vector included-angle constraint of the i-th sub-cluster center of the current frame;
10) Iteratively optimize the constraints established in steps 7)-9) by a non-linear optimization method, updating the current Ψ and the current pose estimate, realizing the localization of the laser radar for the current frame;
if the iteration has not converged, return to step 6) and use the updated current pose estimate to update the per-point poses; if the iteration has converged, enter step 11);
11) Using the current Ψ and current pose estimate updated in step 10), convert the environment local point cloud of the k-th frame into the world coordinate system, and perform iterative closest point (ICP) optimization between the k-th frame local non-ground cluster projected into the world coordinate system and the current global map, to further optimize the current Ψ and pose and update the current global map; the specific steps are as follows:
11-1) Project each point of the k-th frame local non-ground cluster, according to the current pose updated in step 10), into the world coordinate system; after projection, all the projected points form the k-th frame global non-ground point cluster;
11-2) Perform an ICP operation between the non-ground points in the current global map M and the k-th frame global non-ground point cluster to further optimize the current pose, obtaining the final laser radar pose state quantity Ψ and the final pose matrix corresponding to the current frame;
11-3) Project each point of the k-th frame local non-ground cluster, according to the final pose of step 11-2), into the world coordinate system to update the global non-ground point cluster, and perform voxel-filtering-based point fusion between the updated cluster and the non-ground points in the current global map M;
11-4) Project each point of the k-th frame local ground cluster, according to the final pose of step 11-2), into the world coordinate system; after projection, the projected points form the current frame global ground point cluster; perform voxel-filtering-based point fusion between it and the ground points in the current global map M, completing the update of the current global map M.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010638139.4A CN111932614B (en) | 2020-07-06 | 2020-07-06 | Laser radar instant positioning and mapping method based on clustering center characteristics |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111932614A CN111932614A (en) | 2020-11-13 |
CN111932614B true CN111932614B (en) | 2022-10-14 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110223379A (en) * | 2019-06-10 | 2019-09-10 | 于兴虎 | Three-dimensional point cloud method for reconstructing based on laser radar |
CN110221603A (en) * | 2019-05-13 | 2019-09-10 | 浙江大学 | A kind of long-distance barrier object detecting method based on the fusion of laser radar multiframe point cloud |
CN111045017A (en) * | 2019-12-20 | 2020-04-21 | 成都理工大学 | Method for constructing transformer substation map of inspection robot by fusing laser and vision |
Non-Patent Citations (2)
Title |
---|
A linear distance feature extraction algorithm based on structured environments; Kuang Bing et al.; Science Technology and Engineering; 2020-02-28 (No. 06); full text * |
Research on a laser and vision mapping method for mobile robots with an improved ICP algorithm; Zhang Jie et al.; Journal of Mechanical & Electrical Engineering; 2017-12-18 (No. 12); full text * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||