CN111932614B - Laser radar instant positioning and mapping method based on clustering center characteristics - Google Patents


Info

Publication number: CN111932614B
Application number: CN202010638139.4A
Authority: CN (China)
Prior art keywords: point, frame, ground, cluster, coordinate system
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN111932614A
Inventors: 邓博文, 孟子阳
Original and current assignee: Tsinghua University
Application filed by Tsinghua University on 2020-07-06
Publication of CN111932614A: 2020-11-13
Publication of CN111932614B (grant): 2022-10-14

Classifications

    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06F 18/2135 — Feature extraction by transforming the feature space, based on approximation criteria, e.g. principal component analysis
    • G06F 18/22 — Matching criteria, e.g. proximity measures
    • G06F 18/23 — Clustering techniques
    • G06T 7/11 — Region-based segmentation
    • G06T 2207/10028 — Range image; depth image; 3D point clouds
    • G06T 2207/10044 — Radar image


Abstract

The invention provides a laser radar instant positioning and mapping (SLAM) method based on cluster center features, belonging to the field of autonomous positioning and navigation. The method first performs segmentation and clustering on the local environment point cloud acquired by the lidar to obtain the local ground cluster and the local non-ground clusters in the point cloud. It then extracts a cluster center from each non-ground sub-cluster to form a non-ground cluster center point set, extracts flat points from the ground cluster as planar points, and establishes constraints from the planar points, the non-ground cluster center point set, and the non-ground cluster normal vectors, thereby realizing real-time localization of the lidar and updating of the global map. The method reduces the interference of point cloud noise with localization and mapping accuracy, does not depend on corner features, and is therefore general; it can be used directly for simultaneous localization and mapping with a lidar and has broad application prospects in autonomous driving, mobile robots, and virtual reality.

Description

Laser radar instant positioning and mapping method based on clustering center characteristics
Technical Field
The invention belongs to the field of autonomous positioning and navigation, and particularly relates to a laser radar instant positioning and mapping method based on clustering center characteristics.
Background
Simultaneous Localization and Mapping (SLAM) is one of the most basic and important technologies in the field of positioning and navigation. Its essence is that a robot or other sensor carrier determines its pose in the environment from the sensor's perception of the surroundings while simultaneously building a map of that environment. The lidar is a high-precision, high-efficiency three-dimensional measuring device; owing to its high mapping precision, accurate scale information, and low sensitivity to ambient illumination, SLAM methods based on the lidar's perception data of the external environment — the point cloud — are widely applied in all-weather autonomous driving, high-precision three-dimensional surveying and mapping, and high-precision mobile robots. In 2014, Ji Zhang et al. proposed the well-known LOAM (LiDAR Odometry and Mapping) algorithm, which realizes lidar real-time localization and mapping by decomposing the problem into a high-frequency, low-precision lidar pose estimation problem and a low-frequency, high-precision lidar mapping problem. Building on this, images acquired by a camera were further used as additional environment perception data and fused with the lidar point cloud for real-time localization and mapping. In 2018, Tixiao Shan et al. improved on the LOAM algorithm and proposed a lightweight, ground-optimized lidar SLAM method, LeGO-LOAM. In 2019, Xingliang Ji et al. added loop closure detection to LOAM and proposed the LLOAM algorithm, further improving localization and mapping accuracy.
The above methods use the corner points and flat points in the point cloud data to establish data association and distance constraints in the lidar localization and mapping process, thereby realizing positioning and mapping. Detecting and extracting corner points in the point cloud requires computing smoothness or curvature, a computation heavily disturbed by noise; missed and false corner detections then occur, causing wrong data associations and degrading localization and mapping accuracy.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a laser radar instant positioning and mapping method based on cluster center features. The method reduces the interference of point cloud noise with localization and mapping accuracy, does not depend on corner features, and is therefore general; it can be used directly for simultaneous localization and mapping with a lidar and has broad application prospects in autonomous driving, mobile robots, and virtual reality.
The invention provides a laser radar instant positioning and mapping method based on clustering center characteristics, which is characterized by comprising the following steps of:
1) Define the lidar body coordinate system and the world coordinate system as follows: denote the lidar body coordinate system as {L}, with the X axis pointing in the lidar's direction of motion, the Y axis pointing left, and the Z axis pointing up, so that the XOY plane of the lidar body coordinate system is parallel to the local ground; the world coordinate system coincides with the lidar body coordinate system at the start time of frame 1;
2) Let the state quantity of the lidar be $\Psi = \{t_x, t_y, t_z, \theta_{pitch}, \theta_{yaw}, \theta_{roll}\}$, where $\{t_x, t_y, t_z\}$ is the lidar displacement: $t_x$ the displacement along the X axis of the world coordinate system, $t_y$ along the Y axis, and $t_z$ along the Z axis; and $\{\theta_{pitch}, \theta_{yaw}, \theta_{roll}\}$ is the lidar rotation: $\theta_{pitch}$ the rotation about the X axis of the world coordinate system, $\theta_{yaw}$ about the Y axis, and $\theta_{roll}$ about the Z axis;
3) Obtain the environmental local point cloud images $P_k$, $k = 1, 2, \ldots$, from the lidar, where $P_k$ denotes the k-th frame acquired by the lidar; denote the lidar pose estimate at the end time of frame k by $T_k$. Take $P_k$ as the current frame; let $k = 1$, set $T_1 = I_{4\times4}$ as the current $T_k$, and set $\Psi = \{0, 0, 0, 0, 0, 0\}$ as the current $\Psi$;
4) Apply the breadth-first search algorithm (BFS) to perform cluster segmentation on the current frame $P_k$ of the environmental local point cloud, dividing the segmented point cloud into the k-th frame local ground cluster $G_k$ and the k-th frame local non-ground cluster $NG_k$. The specific steps are:

4-1) Project $P_k$ into the corresponding depth image $I_k$;

4-2) Set the threshold $TH_G$ on the horizontal angle between the ground and the lidar;

4-3) Traverse the depth image $I_k$ with the BFS algorithm and compute the angle between each point and its adjacent points; if the angle is smaller than $TH_G$, the point corresponding to that point in $P_k$ is judged a ground point and added to the k-th frame local ground cluster $G_k$; otherwise the corresponding point in $P_k$ is judged a non-ground point and added to the k-th frame local non-ground cluster $NG_k$;
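As an illustrative sketch of the geometric test in step 4-3) — not the patent's reference implementation — the angle criterion can be evaluated over an organized range image as follows; the grid layout (one laser ring per row), the choice of the vertically adjacent point as the neighbor, and the helper name ground_mask are assumptions:

```python
import numpy as np

def ground_mask(xyz, th_g_deg=10.0):
    """Label ground points of an organized point cloud (step 4-3).

    xyz: (rings, cols, 3) array in the lidar body frame, rows r and
    r + 1 being vertically adjacent laser rings; th_g_deg: the
    horizontal-angle threshold TH_G. Returns a boolean ground mask.
    """
    diff = xyz[1:] - xyz[:-1]                       # vector to the ring above
    horiz = np.linalg.norm(diff[..., :2], axis=-1)  # XY-plane component
    vert = np.abs(diff[..., 2])                     # Z component
    angle = np.degrees(np.arctan2(vert, horiz))     # angle to the horizontal
    mask = np.zeros(xyz.shape[:2], dtype=bool)
    near_horizontal = angle < th_g_deg              # smaller than TH_G
    mask[:-1][near_horizontal] = True               # both endpoints of a
    mask[1:][near_horizontal] = True                # near-horizontal segment
    return mask
```

The BFS traversal of the patent adds connectivity on top of this purely geometric criterion; the vectorized form above only reproduces the per-point angle test.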
5) Using BFS, perform cluster segmentation on all points in $NG_k$. The specific steps are:

5-1) Set the clustering angle threshold $TH_{NG}$;

5-2) Traverse with the BFS algorithm the points of the depth image $I_k$ that belong to the k-th frame local non-ground cluster $NG_k$, computing the angle between each traversed point and its adjacent points, where an adjacent point is the point in $I_k$ corresponding to some point of $NG_k$; if the angle is larger than $TH_{NG}$, the traversed point and the adjacent point are judged to belong to the same sub-cluster;

5-3) After the traversal, $NG_k$ is divided into L local non-ground sub-clusters $C_k^1, \ldots, C_k^L$, where L is the total number of sub-clusters;
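The patent does not spell out how the inter-point angle of step 5-2) is defined; a common choice for range-image clustering is the Bogoslavskyi–Stachniss criterion, which is used below as an assumption. A BFS sketch over the 4-neighborhood of the depth image, with VLP-16 angular steps and all names hypothetical:

```python
import numpy as np
from collections import deque

def bfs_clusters(xyz, non_ground, th_ng_deg=60.0,
                 h_res_deg=0.2, v_res_deg=2.0):
    """Split the non-ground points of an organized cloud into
    sub-clusters (step 5) by breadth-first search over the depth image."""
    rows, cols = non_ground.shape
    labels = -np.ones((rows, cols), dtype=int)
    th = np.radians(th_ng_deg)
    n_clusters = 0
    for r0 in range(rows):
        for c0 in range(cols):
            if not non_ground[r0, c0] or labels[r0, c0] >= 0:
                continue
            labels[r0, c0] = n_clusters
            queue = deque([(r0, c0)])
            while queue:
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rn, cn = r + dr, (c + dc) % cols   # azimuth wraps around
                    if not (0 <= rn < rows) or not non_ground[rn, cn] \
                            or labels[rn, cn] >= 0:
                        continue
                    d1 = np.linalg.norm(xyz[r, c])
                    d2 = np.linalg.norm(xyz[rn, cn])
                    far, near = max(d1, d2), min(d1, d2)
                    alpha = np.radians(h_res_deg if dc else v_res_deg)
                    # angle between the farther beam and the segment
                    # joining the two returns
                    beta = np.arctan2(near * np.sin(alpha),
                                      far - near * np.cos(alpha))
                    if beta > th:                      # larger than TH_NG:
                        labels[rn, cn] = n_clusters    # same sub-cluster
                        queue.append((rn, cn))
            n_clusters += 1
    return labels, n_clusters
```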
6) Let the scan start time of frame k be $t_{k-1}$ and the end time be $t_k$. According to the current $T_k$, for each point $i \in P_k$ of the k-th frame point cloud, the pose estimate $T_{(k,i)}$ of that point is given by formula (1), where $t_{(k,i)} \in [t_{k-1}, t_k]$ is the timestamp of the point:

$T_{(k,i)} = \dfrac{t_{(k,i)} - t_{k-1}}{t_k - t_{k-1}} \, T_k$    (1)

with the scalar factor acting on the six-degree-of-freedom parameterization of $T_k$, i.e. linear interpolation of the pose over the frame time.

If the current frame $P_k$ is frame 1, use $T_1$ to compute the $T_{(1,i)}$ of each point of $P_1$ according to formula (1); then, according to the computed $T_{(1,i)}$, transform each point of $P_1$ into the world coordinate system to form a global map, taken as the current global map M; then let $k = k + 1$, take $T_1$ as the current $T_2$ of frame 2, and return to step 4). If the current frame $P_k$ is not frame 1, go to step 7);
7) From $G_k$, extract the k-th frame flat feature point set $H_k$; take each point of $H_k$ as a planar point and establish a constraint on the distance from the planar point to its corresponding projection plane. The specific steps are:

7-1) Set the flatness threshold $TH_p$ and compute the flatness $S_i$ of each point of $G_k$ with formula (2):

$S_i = \dfrac{1}{|Nb_i| \cdot \|p_i\|} \left\| \sum_{j \in Nb_i} (p_i - p_j) \right\|$    (2)

where $Nb_i$ is the local neighborhood of the i-th point of $G_k$, $p_i$ is the coordinate of the i-th point of $G_k$, and $p_j$ is the coordinate of the j-th point within the neighborhood $Nb_i$.

Judge $S_i$: if $S_i$ is smaller than the threshold $TH_p$, the i-th point of $G_k$ is a flat feature point; all flat feature points of $G_k$ form the k-th frame flat feature point set $H_k$;
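A sketch of the flatness computation of formula (2) as reconstructed above; the k-nearest-neighbor construction of the neighborhood $Nb_i$ follows the embodiment below (10 nearest points), and the function name is an assumption:

```python
import numpy as np
from scipy.spatial import cKDTree

def flatness(points, m=10):
    """Flatness S_i of every ground point (formula (2)): norm of the
    summed difference vectors to the m nearest neighbors, normalized by
    the neighborhood size and the point's range."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=m + 1)   # first hit is the point itself
    nb = points[idx[:, 1:]]                # (N, m, 3) neighborhoods
    diff_sum = (points[:, None, :] - nb).sum(axis=1)
    return np.linalg.norm(diff_sum, axis=1) / (m * np.linalg.norm(points, axis=1))

# flat feature points: those with S_i below TH_p (0.1 in the embodiment)
```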
7-2) Project each point of $H_k$, according to the $T_{(k,i)}$ corresponding to that point, into the lidar coordinate system at the scan start time of frame k; after projection, the corresponding points form the k-th frame start planar point set, denoted $\tilde H_k$;

7-3) Project each point of the (k-1)-th frame flat feature point set $H_{k-1}$, according to the $T_{(k-1,i)}$ corresponding to that point, into the lidar coordinate system at the scan end time of frame k-1; after projection, the corresponding points form the (k-1)-th frame end planar point set, denoted $\bar H_{k-1}$;

7-4) From $\bar H_{k-1}$, search for the three points $p_{j_1}, p_{j_2}, p_{j_3}$ that are nearest to and not collinear with each point $\tilde h_i \in \tilde H_k$; $p_{j_1}, p_{j_2}, p_{j_3}$ form the projection plane corresponding to $\tilde h_i$. Establish, via formula (3), the constraint on the distance from each point of $\tilde H_k$ to its corresponding projection plane:

$d^H_i = \dfrac{\left| (\tilde h_i - p_{j_1}) \cdot \left( (p_{j_1} - p_{j_2}) \times (p_{j_1} - p_{j_3}) \right) \right|}{\left\| (p_{j_1} - p_{j_2}) \times (p_{j_1} - p_{j_3}) \right\|}$    (3)

where $d^H_i$ is the constraint on the distance from the i-th point of $\tilde H_k$ to its corresponding projection plane;
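Formula (3) as reconstructed above is the usual scalar-triple-product form of the point-to-plane distance; a direct transcription (numerical guards against degenerate, near-collinear triples omitted):

```python
import numpy as np

def point_to_plane_distance(h, p1, p2, p3):
    """Distance from planar point h to the projection plane through the
    three associated points p1, p2, p3 (formula (3))."""
    n = np.cross(p1 - p2, p1 - p3)               # plane normal
    return abs(np.dot(h - p1, n)) / np.linalg.norm(n)
```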
8) From $NG_k$, extract the cluster center point set $Ct_k$ and establish the distance constraints from cluster center point to point. The specific steps are:

8-1) Project each point of $NG_k$, according to the $T_{(k,i)}$ corresponding to that point, into the lidar coordinate system at the scan start time of frame k; after projection, the corresponding points form the k-th frame start local non-ground cluster, denoted $\tilde{NG}_k$, with start sub-clusters $\tilde C_k^1, \ldots, \tilde C_k^L$;

8-2) Extract the k-th frame start local non-ground cluster center point set according to formula (4):

$ct_k^i = \dfrac{1}{N_i} \sum_{p \in \tilde C_k^i} p$    (4)

where $ct_k^i$ is the center point coordinate of the i-th start local non-ground sub-cluster $\tilde C_k^i$, $p$ is a point of $\tilde C_k^i$, and $N_i$ is the number of points of $\tilde C_k^i$; all the $ct_k^i$ form the k-th frame start local non-ground cluster center point set, denoted $\tilde{Ct}_k$;
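Formula (4) is the arithmetic mean of each sub-cluster; a sketch over the label image produced by the clustering step (names assumed):

```python
import numpy as np

def cluster_centers(points, labels):
    """Center point ct_k^i of every sub-cluster (formula (4)): the mean
    of the N_i points carrying label i. points: (N, 3); labels: (N,)."""
    return {i: points[labels == i].mean(axis=0)
            for i in np.unique(labels[labels >= 0])}
```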
8-3) Project each point of the (k-1)-th frame local non-ground cluster $NG_{k-1}$, according to the $T_{(k-1,i)}$ corresponding to that point, into the lidar coordinate system at the scan end time of frame k-1; after projection, the corresponding points form the (k-1)-th frame end local non-ground cluster, denoted $\bar{NG}_{k-1}$; repeat step 8-2) on it to obtain the (k-1)-th frame end local non-ground cluster center point set, denoted $\bar{Ct}_{k-1}$;

8-4) From $\bar{Ct}_{k-1}$, search for the point $\bar{ct}_{k-1}^{\,j}$ nearest to each point $ct_k^i \in \tilde{Ct}_k$; establish the point-to-point data association between $ct_k^i$ and $\bar{ct}_{k-1}^{\,j}$, and establish the point-to-point distance constraint according to formula (5):

$d^{Ct}_i = \left\| ct_k^i - \bar{ct}_{k-1}^{\,j} \right\|$    (5)

where $d^{Ct}_i$ is the distance constraint from $ct_k^i$ to $\bar{ct}_{k-1}^{\,j}$;
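A sketch of the nearest-neighbor association and the point-to-point residual of formula (5), using a KD-tree for the search (an implementation choice, not prescribed by the patent):

```python
import numpy as np
from scipy.spatial import cKDTree

def associate_centers(ct_k, ct_prev):
    """Associate each current start center with the nearest end center
    of the previous frame and return the residuals of formula (5).
    ct_k, ct_prev: (N, 3) and (M, 3) center arrays."""
    dist, idx = cKDTree(ct_prev).query(ct_k, k=1)
    return idx, dist    # data association and distance constraints
```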
9) From $\tilde{NG}_k$, extract the normal vector $n_k^i$ of each sub-cluster $\tilde C_k^i$ and establish the sub-cluster center normal vector rotation constraints. The specific steps are:

9-1) Perform principal component analysis on each sub-cluster $\tilde C_k^i$ and take the principal component direction of the sub-cluster as its center normal vector $n_k^i$; perform principal component analysis on each sub-cluster $\bar C_{k-1}^{\,j}$ and take the principal component direction of the sub-cluster as its center normal vector $\bar n_{k-1}^{\,j}$;

9-2) According to the point-to-point data association between $ct_k^i$ and $\bar{ct}_{k-1}^{\,j}$ obtained in step 8-4), establish the sub-cluster center normal vector angle constraint, as in formula (6):

$d^{\,n}_i = \arccos \dfrac{n_k^i \cdot \bar n_{k-1}^{\,j}}{\|n_k^i\| \, \|\bar n_{k-1}^{\,j}\|}$    (6)

where $d^{\,n}_i$ is the normal vector angle constraint of the i-th sub-cluster center of the current frame;
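The patent takes the principal component direction of a sub-cluster as its center normal vector, so the sketch below returns the largest-eigenvalue eigenvector of the covariance matrix; formula (6) then follows as the angle between associated normals (function names assumed):

```python
import numpy as np

def cluster_normal(points):
    """Center normal vector of a sub-cluster via PCA (step 9-1):
    the eigenvector of the covariance with the largest eigenvalue."""
    centered = points - points.mean(axis=0)
    _, eigvec = np.linalg.eigh(np.cov(centered.T))
    return eigvec[:, -1]                  # principal component direction

def normal_angle(n1, n2):
    """Normal vector angle constraint of formula (6)."""
    c = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return np.arccos(np.clip(c, -1.0, 1.0))
```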
10) Iteratively optimize the constraints established in steps 7)-9) by a nonlinear optimization method, updating the current $\Psi$ and the current $T_k$ and realizing the localization of the lidar for the current frame.

If the iteration has not converged, return to step 6) and use the updated current $T_k$ to update the $T_{(k,i)}$; if the iteration has converged, go to step 11);
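The patent names no particular solver for step 10); as one possibility, the stacked residuals of formulas (3), (5) and (6) can be handed to a Levenberg–Marquardt routine. A hedged sketch, with residual_fn a user-supplied closure over the current associations:

```python
from scipy.optimize import least_squares

def localize(psi0, residual_fn):
    """One nonlinear optimization pass over the constraints of steps
    7)-9). residual_fn(psi) must return the concatenated plane-distance,
    center-distance and normal-angle residuals for the state psi."""
    result = least_squares(residual_fn, psi0, method="lm")
    return result.x     # updated {t_x, t_y, t_z, pitch, yaw, roll}
```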
11) Using the current $\Psi$ and current $T_k$ updated in step 10), transform the k-th frame environmental local point cloud into the world coordinate system, and perform iterative closest point (ICP) optimization between the k-th frame local non-ground cluster projected into the world coordinate system and the current global map, to further optimize the current $\Psi$ and current $T_k$ and update the current global map. The specific steps are:

11-1) Project each point $p_i$ of the k-th frame local non-ground cluster $NG_k$, according to the current $T_{(k,i)}$ updated in step 10) and corresponding to that point, into the world coordinate system; denote the projected point $p_i^W$. After projection, all the projected points form the k-th frame global non-ground point cluster, denoted $NG_k^W$, with the expression:

$NG_k^W = \left\{ p_i^W \mid p_i^W = T_{(k,i)}\, p_i, \ p_i \in NG_k \right\}$

11-2) Perform the ICP operation between the non-ground points in the current global map M and $NG_k^W$ to further optimize the current $T_k$, obtaining the final lidar pose state quantity $\Psi$ and the final pose matrix $T_k$ corresponding to the current frame;

11-3) Project each point of the k-th frame local non-ground cluster $NG_k$, according to the final $T_{(k,i)}$ of step 11-2) corresponding to that point, into the world coordinate system to update $NG_k^W$; perform voxel-filter-based point fusion between the updated $NG_k^W$ and the non-ground points in the current global map M;

11-4) Project each point of the k-th frame local ground cluster $G_k$, according to the final $T_{(k,i)}$ of step 11-2) corresponding to that point, into the world coordinate system; denote the projected points $g_i^W$. After projection, all the projected points $g_i^W$ form the current frame global ground point cluster, denoted $G_k^W$; perform voxel-filter-based point fusion between $G_k^W$ and the ground points in the current global map M, completing the update of the current global map M;
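A sketch of the voxel-filter-based point fusion of steps 11-3) and 11-4): frame and map points are pooled and every occupied voxel is reduced to its centroid. The voxel size of 0.2 m and the function name are assumptions; the patent does not state the filter parameters:

```python
import numpy as np

def voxel_fuse(map_points, new_points, voxel=0.2):
    """Fuse projected frame points into the global map by voxel
    filtering: one centroid per occupied voxel."""
    pts = np.vstack([map_points, new_points])
    keys = np.floor(pts / voxel).astype(np.int64)   # voxel index per point
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inv).astype(float)
    fused = np.stack([np.bincount(inv, weights=pts[:, d]) / counts
                      for d in range(3)], axis=1)
    return fused
```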
12 Let k = k +1 when the lidar acquires a new frame of ambient local point cloud, and combine the final Ψ and the final Ψ obtained in step 11-2)
Figure BDA0002570206270000061
As new current Ψ and current, respectively
Figure BDA0002570206270000062
And then returns to step 4) again.
The invention has the following characteristics and beneficial effects:

By cluster segmentation of the lidar point cloud, the invention distinguishes the ground points from the various non-ground points in the point cloud and then establishes point-to-point distance constraints directly with the center points of the non-ground clusters, realizing lidar localization and mapping without corner extraction. The averaging character of the cluster centers reduces noise interference, and because no corners need to be extracted, the erroneous corner extraction caused by noise, and the erroneous data association that follows it, are avoided. The method can be used directly for simultaneous localization and mapping with a lidar and has broad application prospects in autonomous driving, mobile robots, and virtual reality.
Drawings
FIG. 1 is an overall flow diagram of the method of the present invention.
Detailed Description
The invention provides a laser radar instant positioning and mapping method based on cluster center features, which is further described in detail below with reference to the accompanying drawings and specific embodiments.
In this embodiment a Velodyne VLP-16 lidar serves as the environment perception sensor: at an acquisition frequency of 10 Hz, it has a 360° horizontal field of view with 0.2° horizontal resolution and a 30° vertical field of view with 2° vertical resolution, i.e. 16 laser scan lines in the vertical direction. Each lidar datum consists of an environmental sample point and its depth from the lidar center; from the horizontal and vertical angles between the sample point and the lidar, the depth can be converted into a three-dimensional coordinate in the lidar coordinate system, i.e. an {X, Y, Z} value (a conversion sketch is given after step 1) below). The overall flow of the method is shown in Fig. 1 and comprises the following steps:
1) Defining a laser radar body coordinate system and a world coordinate system as follows: the coordinate system of the laser radar body is recorded as { L }, the X axis points to the movement direction of the laser radar, the Y axis points to the left side, and the Z axis points to the upper side, so that the XOY plane of the laser radar body coordinate system is parallel to the local ground where the XOY plane is located. The world coordinate system coincides with the lidar body coordinate system at the start time of frame 1.
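As referenced above, a minimal sketch of the depth-to-coordinate conversion in the body frame of step 1); the spherical convention (vertical angle measured from the XOY plane) and the function name are assumptions:

```python
import numpy as np

def polar_to_xyz(depth, h_angle_deg, v_angle_deg):
    """Convert a lidar return (depth plus its horizontal and vertical
    angles to the sensor) into an {X, Y, Z} coordinate in the body
    frame: X forward, Y left, Z up."""
    h, v = np.radians(h_angle_deg), np.radians(v_angle_deg)
    return np.array([depth * np.cos(v) * np.cos(h),   # X
                     depth * np.cos(v) * np.sin(h),   # Y
                     depth * np.sin(v)])              # Z
```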
2) The state quantity estimated in the lidar real-time localization and mapping process is $\Psi = \{t_x, t_y, t_z, \theta_{pitch}, \theta_{yaw}, \theta_{roll}\}$, where $\{t_x, t_y, t_z\}$ is the lidar displacement: $t_x$ the displacement along the X axis of the world coordinate system, $t_y$ along the Y axis, and $t_z$ along the Z axis; and $\{\theta_{pitch}, \theta_{yaw}, \theta_{roll}\}$ is the lidar rotation: $\theta_{pitch}$ the rotation about the X axis of the world coordinate system, $\theta_{yaw}$ about the Y axis, and $\theta_{roll}$ about the Z axis.
3) Obtain the environmental local point cloud images $P_k$ ($k = 1, 2, \ldots$) from the lidar, where $P_k$ denotes the k-th frame acquired by the lidar and k is the frame sequence number. Take $P_k$ as the current frame and denote the lidar pose estimate at the end time of frame k by $T_k$. Let $k = 1$, set $T_1 = I_{4\times4}$ as the current $T_k$, and set $\Psi = \{0, 0, 0, 0, 0, 0\}$ as the current $\Psi$;
4) Apply a Breadth First Search algorithm (BFS) to perform cluster segmentation on the current frame $P_k$ of the environmental local point cloud, dividing the segmented point cloud into the k-th frame local ground cluster $G_k$ and the k-th frame local non-ground cluster $NG_k$. The specific steps are:

4-1) Project the k-th frame environmental local point cloud $P_k$ into the corresponding depth image $I_k$;

4-2) Set the threshold $TH_G$ on the horizontal angle between the ground and the lidar according to the lidar's mounting angle (in this example the lidar is parallel to the ground, so $TH_G = 10°$ is set);

4-3) Traverse the depth image $I_k$ with the BFS algorithm and compute the angle between each point and its adjacent points; if the angle is smaller than $TH_G$, the point corresponding to that point in $P_k$ is judged a ground point and added to the k-th frame local ground cluster $G_k$; otherwise the corresponding point in $P_k$ is judged a non-ground point and added to the k-th frame local non-ground cluster $NG_k$;
5) Using BFS, perform cluster segmentation on all points of the k-th frame local non-ground cluster $NG_k$. The specific steps are:

5-1) Set the clustering angle threshold $TH_{NG}$ according to the laser point cloud density and noise level (this example takes $TH_{NG} = 60°$);

5-2) Traverse with the BFS algorithm the points of the depth image $I_k$ that belong to the k-th frame local non-ground cluster $NG_k$, computing the angle between each traversed point and its adjacent points, where an adjacent point is the point in $I_k$ corresponding to some point of $NG_k$ (if an adjacent point does not correspond to any point of $NG_k$, no angle is computed for it); if the angle is larger than $TH_{NG}$, the traversed point and the adjacent point are judged to belong to the same sub-cluster;

5-3) After the traversal, the k-th frame local non-ground cluster $NG_k$ is divided into L local non-ground sub-clusters $C_k^1, \ldots, C_k^L$, where L is the total number of sub-clusters; its value depends on $TH_{NG}$ and on the number of non-ground objects present in the k-th frame point cloud $P_k$;
6) Let the scan start time of frame k be $t_{k-1}$ and the end time be $t_k$. According to the current $T_k$, for each point $i \in P_k$ of the point cloud obtained by the k-th frame scan, the pose estimate $T_{(k,i)}$ of that point is given by formula (1), where $t_{(k,i)} \in [t_{k-1}, t_k]$ is the timestamp of the point:

$T_{(k,i)} = \dfrac{t_{(k,i)} - t_{k-1}}{t_k - t_{k-1}} \, T_k$    (1)

If the current frame is frame 1, use $T_1$ to compute the $T_{(1,i)}$ of each point of $P_1$ according to formula (1); then, according to the computed $T_{(1,i)}$, transform each point of $P_1$ into the world coordinate system to form a global map, taken as the current global map M; then let $k = k + 1$ and take $T_1$ as the current $T_2$ of frame 2 (i.e., the current pose at the initial time of frame 2), and return to step 4). If the current frame is not frame 1, go to step 7).
7) From the local ground cluster $G_k$, extract the k-th frame flat feature point set $H_k$; take each point of $H_k$ as a planar point and establish a constraint on the distance from the planar point to its corresponding projection plane. The specific steps are:

7-1) Set the flatness threshold $TH_p$ according to the laser point cloud density and noise level (this example takes $TH_p = 0.1$). Compute the flatness $S_i$ of each point of the ground cluster $G_k$ with formula (2):

$S_i = \dfrac{1}{|Nb_i| \cdot \|p_i\|} \left\| \sum_{j \in Nb_i} (p_i - p_j) \right\|$    (2)

where $Nb_i$ is the local neighborhood of the i-th point of $G_k$ (the local neighborhood may be formed from the m points nearest in Euclidean distance to the query point, or from the points inside a sphere of radius r centered at the query point; this example uses the 10 points nearest in Euclidean distance), $p_i$ is the coordinate of the i-th point of $G_k$, and $p_j$ is the coordinate of the j-th point within the neighborhood $Nb_i$.

Judge $S_i$: if $S_i$ is smaller than the threshold $TH_p$, the point of $G_k$ corresponding to $S_i$ is a flat feature point; all flat feature points of $G_k$ form the k-th frame flat feature point set $H_k$;
7-2) Project each point of $H_k$, according to the $T_{(k,i)}$ corresponding to that point, into the lidar coordinate system at the scan start time of frame k; after projection, the corresponding points form the k-th frame start planar point set, denoted $\tilde H_k$;

7-3) Project each point of the previous frame's (frame k-1's) flat feature point set $H_{k-1}$, according to the $T_{(k-1,i)}$ corresponding to that point, into the lidar coordinate system at the scan end time of the previous frame (frame k-1); after projection, the corresponding points form the (k-1)-th frame end planar point set, denoted $\bar H_{k-1}$;

7-4) From $\bar H_{k-1}$, search for the three points $p_{j_1}, p_{j_2}, p_{j_3}$ that are nearest to and not collinear with each point $\tilde h_i \in \tilde H_k$; $p_{j_1}, p_{j_2}, p_{j_3}$ form the projection plane corresponding to $\tilde h_i$. Establish, via formula (3), the constraint on the distance from each point to its corresponding projection plane:

$d^H_i = \dfrac{\left| (\tilde h_i - p_{j_1}) \cdot \left( (p_{j_1} - p_{j_2}) \times (p_{j_1} - p_{j_3}) \right) \right|}{\left\| (p_{j_1} - p_{j_2}) \times (p_{j_1} - p_{j_3}) \right\|}$    (3)

where $d^H_i$ is the constraint on the distance from the i-th point of $\tilde H_k$ to its corresponding projection plane;
8) From the local non-ground cluster $NG_k$, extract the cluster center point set $Ct_k$ and establish the distance constraints from cluster center point to point. The specific steps are:

8-1) Project each point of the local non-ground cluster $NG_k$, according to the $T_{(k,i)}$ corresponding to that point, into the lidar coordinate system at the scan start time of frame k; after projection, the corresponding points form the k-th frame start local non-ground cluster, denoted $\tilde{NG}_k$, with start sub-clusters $\tilde C_k^1, \ldots, \tilde C_k^L$;

8-2) Extract the k-th frame start local non-ground cluster center point set according to formula (4):

$ct_k^i = \dfrac{1}{N_i} \sum_{p \in \tilde C_k^i} p$    (4)

where the index i runs over the start local non-ground sub-clusters, $ct_k^i$ is the center point coordinate of the i-th start local non-ground sub-cluster $\tilde C_k^i$, $p$ is a point of $\tilde C_k^i$, and $N_i$ is the number of points of $\tilde C_k^i$; all the $ct_k^i$ form the k-th frame start local non-ground cluster center point set, denoted $\tilde{Ct}_k$;
8-3) Project each point of the previous frame's (frame k-1's) local non-ground cluster $NG_{k-1}$, according to the $T_{(k-1,i)}$ corresponding to that point, into the lidar coordinate system at the scan end time of the previous frame (frame k-1); after projection, the corresponding points form the (k-1)-th frame end local non-ground cluster, denoted $\bar{NG}_{k-1}$; repeat step 8-2) on it to obtain the (k-1)-th frame end local non-ground cluster center point set, denoted $\bar{Ct}_{k-1}$;

8-4) From $\bar{Ct}_{k-1}$, search for the point $\bar{ct}_{k-1}^{\,j}$ nearest to each point $ct_k^i \in \tilde{Ct}_k$; establish the point-to-point data association between $ct_k^i$ and $\bar{ct}_{k-1}^{\,j}$, and establish the point-to-point distance constraint according to formula (5):

$d^{Ct}_i = \left\| ct_k^i - \bar{ct}_{k-1}^{\,j} \right\|$    (5)

where $d^{Ct}_i$ is the distance constraint from $ct_k^i$ to $\bar{ct}_{k-1}^{\,j}$;
9) From the k-th frame start local non-ground cluster $\tilde{NG}_k$, extract the normal vector $n_k^i$ of each sub-cluster $\tilde C_k^i$ and establish the sub-cluster center normal vector rotation constraints. The specific steps are:

9-1) Perform Principal Component Analysis (PCA) on each start local non-ground sub-cluster $\tilde C_k^i$ of frame k and take the principal component direction of the sub-cluster as its center normal vector $n_k^i$; perform Principal Component Analysis (PCA) on each end local non-ground sub-cluster $\bar C_{k-1}^{\,j}$ of frame k-1 and take the principal component direction of the sub-cluster as its center normal vector $\bar n_{k-1}^{\,j}$;

9-2) According to the point-to-point data association between $ct_k^i$ and $\bar{ct}_{k-1}^{\,j}$ obtained in step 8-4), establish the sub-cluster center normal vector angle constraint, as in formula (6):

$d^{\,n}_i = \arccos \dfrac{n_k^i \cdot \bar n_{k-1}^{\,j}}{\|n_k^i\| \, \|\bar n_{k-1}^{\,j}\|}$    (6)

where $d^{\,n}_i$ is the normal vector angle constraint of the i-th sub-cluster center of the current frame.
10) Iteratively optimize the constraints established in steps 7)-9) by a nonlinear optimization method, updating the current $\Psi$ and the current $T_k$ and realizing the localization of the lidar for the current frame.

If the iteration has not converged, return to step 6) and use the updated current $T_k$ to update the $T_{(k,i)}$; if the iteration has converged, go to step 11).
11) Using the current $\Psi$ and current $T_k$ updated in step 10), transform the k-th frame environmental local point cloud into the world coordinate system, and perform Iterative Closest Point (ICP) optimization between the k-th frame local non-ground cluster projected into the world coordinate system and the current global map, to further optimize the current $\Psi$ and current $T_k$ and update the current global map. The specific steps are:

11-1) Project each point $p_i$ of the k-th frame local non-ground cluster $NG_k$, according to the current $T_{(k,i)}$ updated in step 10) and corresponding to that point, into the world coordinate system; denote the projected point $p_i^W$. After projection, all the projected points form the k-th frame global non-ground point cluster, denoted $NG_k^W$, with the expression:

$NG_k^W = \left\{ p_i^W \mid p_i^W = T_{(k,i)}\, p_i, \ p_i \in NG_k \right\}$

11-2) Perform the ICP operation between the non-ground points in the current global map M and $NG_k^W$ to further optimize the current $T_k$, obtaining, after map optimization, the final lidar pose state quantity $\Psi$ and the final pose matrix $T_k$ corresponding to the current frame;

11-3) Project each point of the k-th frame local non-ground cluster $NG_k$, according to the final $T_{(k,i)}$ of step 11-2) corresponding to that point, into the world coordinate system to update $NG_k^W$; perform voxel-filter-based point fusion between the updated $NG_k^W$ and the non-ground points in the current global map M.

11-4) Project each point of the k-th frame local ground cluster $G_k$, according to the final $T_{(k,i)}$ of step 11-2) corresponding to that point, into the world coordinate system; denote the projected points $g_i^W$. After projection, all the projected points $g_i^W$ form the current frame global ground point cluster, denoted $G_k^W$; perform voxel-filter-based point fusion between $G_k^W$ and the ground points in the current global map M, completing the update of the current global map M;
12 Let k = k +1 when the lidar acquires a new frame of ambient local point cloud, and combine the final Ψ and the final Ψ obtained in step 11-2)
Figure BDA0002570206270000118
As new current Ψ and current, respectively
Figure BDA0002570206270000119
And then returns to step 4) again.

Claims (1)

1. A laser radar instant positioning and mapping method based on clustering center features is characterized by comprising the following steps:
1) Define the lidar body coordinate system and the world coordinate system as follows: denote the lidar body coordinate system as {L}, with the X axis pointing in the lidar's direction of motion, the Y axis pointing left, and the Z axis pointing up, so that the XOY plane of the lidar body coordinate system is parallel to the local ground; the world coordinate system coincides with the lidar body coordinate system at the start time of frame 1;
2) Let the state quantity of the lidar be $\Psi = \{t_x, t_y, t_z, \theta_{pitch}, \theta_{yaw}, \theta_{roll}\}$, where $\{t_x, t_y, t_z\}$ is the lidar displacement: $t_x$ the displacement along the X axis of the world coordinate system, $t_y$ along the Y axis, and $t_z$ along the Z axis; and $\{\theta_{pitch}, \theta_{yaw}, \theta_{roll}\}$ is the lidar rotation: $\theta_{pitch}$ the rotation about the X axis of the world coordinate system, $\theta_{yaw}$ about the Y axis, and $\theta_{roll}$ about the Z axis;
3) Obtain the environmental local point cloud images $P_k$, $k = 1, 2, \ldots$, from the lidar, where $P_k$ denotes the k-th frame acquired by the lidar; denote the lidar pose estimate at the end time of frame k by $T_k$. Take $P_k$ as the current frame; let $k = 1$, set $T_1 = I_{4\times4}$ as the current $T_k$, and set $\Psi = \{0, 0, 0, 0, 0, 0\}$ as the current $\Psi$;
4) Apply the breadth-first search algorithm BFS to perform cluster segmentation on the current frame $P_k$ of the environmental local point cloud, dividing the segmented point cloud into the k-th frame local ground cluster $G_k$ and the k-th frame local non-ground cluster $NG_k$. The specific steps are:

4-1) Project $P_k$ into the corresponding depth image $I_k$;

4-2) Set the threshold $TH_G$ on the horizontal angle between the ground and the lidar;

4-3) Traverse the depth image $I_k$ with the BFS algorithm and compute the angle between each point and its adjacent points; if the angle is smaller than $TH_G$, the point corresponding to that point in $P_k$ is judged a ground point and added to the k-th frame local ground cluster $G_k$; otherwise the corresponding point in $P_k$ is judged a non-ground point and added to the k-th frame local non-ground cluster $NG_k$;
5) Using BFS, perform cluster segmentation on all points in $NG_k$. The specific steps are:

5-1) Set the clustering angle threshold $TH_{NG}$;

5-2) Traverse with the BFS algorithm the points of the depth image $I_k$ that belong to the k-th frame local non-ground cluster $NG_k$, computing the angle between each traversed point and its adjacent points, where an adjacent point is the point in $I_k$ corresponding to some point of $NG_k$; if the angle is larger than $TH_{NG}$, the traversed point and the adjacent point are judged to belong to the same sub-cluster;

5-3) After the traversal, $NG_k$ is divided into L local non-ground sub-clusters $C_k^1, \ldots, C_k^L$, where L is the total number of sub-clusters;
6) Let the scan start time of frame k be $t_{k-1}$ and the end time be $t_k$. According to the current $T_k$, for each point $i \in P_k$ of the k-th frame point cloud, the pose estimate $T_{(k,i)}$ of that point is given by formula (1), where $t_{(k,i)} \in [t_{k-1}, t_k]$ is the timestamp of the point:

$T_{(k,i)} = \dfrac{t_{(k,i)} - t_{k-1}}{t_k - t_{k-1}} \, T_k$    (1)

If the current frame $P_k$ is frame 1, use $T_1$ to compute the $T_{(1,i)}$ of each point of $P_1$ according to formula (1); then, according to the computed $T_{(1,i)}$, transform each point of $P_1$ into the world coordinate system to form a global map, taken as the current global map M; then let $k = k + 1$, take $T_1$ as the current $T_2$ of frame 2, and return to step 4). If the current frame $P_k$ is not frame 1, go to step 7);
7) From $G_k$, extract the k-th frame flat feature point set $H_k$; take each point of $H_k$ as a planar point and establish a constraint on the distance from the planar point to its corresponding projection plane. The specific steps are:

7-1) Set the flatness threshold $TH_p$ and compute the flatness $S_i$ of each point of $G_k$ with formula (2):

$S_i = \dfrac{1}{|Nb_i| \cdot \|p_i\|} \left\| \sum_{j \in Nb_i} (p_i - p_j) \right\|$    (2)

where $Nb_i$ is the local neighborhood of the i-th point of $G_k$, $p_i$ is the coordinate of the i-th point of $G_k$, and $p_j$ is the coordinate of the j-th point within the neighborhood $Nb_i$.

Judge $S_i$: if $S_i$ is smaller than the threshold $TH_p$, the i-th point of $G_k$ is a flat feature point; all flat feature points of $G_k$ form the k-th frame flat feature point set $H_k$;
7-2) Project each point of $H_k$, according to the $T_{(k,i)}$ corresponding to that point, into the lidar coordinate system at the scan start time of frame k; after projection, the corresponding points form the k-th frame start planar point set, denoted $\tilde H_k$;

7-3) Project each point of the (k-1)-th frame flat feature point set $H_{k-1}$, according to the $T_{(k-1,i)}$ corresponding to that point, into the lidar coordinate system at the scan end time of frame k-1; after projection, the corresponding points form the (k-1)-th frame end planar point set, denoted $\bar H_{k-1}$;

7-4) From $\bar H_{k-1}$, search for the three points $p_{j_1}, p_{j_2}, p_{j_3}$ that are nearest to and not collinear with each point $\tilde h_i \in \tilde H_k$; $p_{j_1}, p_{j_2}, p_{j_3}$ form the projection plane corresponding to $\tilde h_i$. Establish, via formula (3), the constraint on the distance from each point to its corresponding projection plane:

$d^H_i = \dfrac{\left| (\tilde h_i - p_{j_1}) \cdot \left( (p_{j_1} - p_{j_2}) \times (p_{j_1} - p_{j_3}) \right) \right|}{\left\| (p_{j_1} - p_{j_2}) \times (p_{j_1} - p_{j_3}) \right\|}$    (3)

where $d^H_i$ is the constraint on the distance from the i-th point of $\tilde H_k$ to its corresponding projection plane;
8) From $NG_k$, extract the cluster center point set $Ct_k$ and establish the distance constraints from cluster center point to point. The specific steps are:

8-1) Project each point of $NG_k$, according to the $T_{(k,i)}$ corresponding to that point, into the lidar coordinate system at the scan start time of frame k; after projection, the corresponding points form the k-th frame start local non-ground cluster, denoted $\tilde{NG}_k$, with start sub-clusters $\tilde C_k^1, \ldots, \tilde C_k^L$;

8-2) Extract the k-th frame start local non-ground cluster center point set according to formula (4):

$ct_k^i = \dfrac{1}{N_i} \sum_{p \in \tilde C_k^i} p$    (4)

where $ct_k^i$ is the center point coordinate of the i-th start local non-ground sub-cluster $\tilde C_k^i$, $p$ is a point of $\tilde C_k^i$, and $N_i$ is the number of points of $\tilde C_k^i$; all the $ct_k^i$ form the k-th frame start local non-ground cluster center point set, denoted $\tilde{Ct}_k$;
8-3) Project each point of the (k-1)-th frame local non-ground cluster $NG_{k-1}$, according to the $T_{(k-1,i)}$ corresponding to that point, into the lidar coordinate system at the scan end time of frame k-1; after projection, the corresponding points form the (k-1)-th frame end local non-ground cluster, denoted $\bar{NG}_{k-1}$; repeat step 8-2) on it to obtain the (k-1)-th frame end local non-ground cluster center point set, denoted $\bar{Ct}_{k-1}$;

8-4) From $\bar{Ct}_{k-1}$, search for the point $\bar{ct}_{k-1}^{\,j}$ nearest to each point $ct_k^i \in \tilde{Ct}_k$; establish the point-to-point data association between $ct_k^i$ and $\bar{ct}_{k-1}^{\,j}$, and establish the point-to-point distance constraint according to formula (5):

$d^{Ct}_i = \left\| ct_k^i - \bar{ct}_{k-1}^{\,j} \right\|$    (5)

where $d^{Ct}_i$ is the distance constraint from $ct_k^i$ to $\bar{ct}_{k-1}^{\,j}$;
9) From $\tilde{NG}_k$, extract the normal vector $n_k^i$ of each sub-cluster $\tilde C_k^i$ and establish the sub-cluster center normal vector rotation constraints. The specific steps are:

9-1) Perform principal component analysis on each sub-cluster $\tilde C_k^i$ and take the principal component direction of the sub-cluster as its center normal vector $n_k^i$; perform principal component analysis on each sub-cluster $\bar C_{k-1}^{\,j}$ and take the principal component direction of the sub-cluster as its center normal vector $\bar n_{k-1}^{\,j}$;

9-2) According to the point-to-point data association between $ct_k^i$ and $\bar{ct}_{k-1}^{\,j}$ obtained in step 8-4), establish the sub-cluster center normal vector angle constraint, as in formula (6):

$d^{\,n}_i = \arccos \dfrac{n_k^i \cdot \bar n_{k-1}^{\,j}}{\|n_k^i\| \, \|\bar n_{k-1}^{\,j}\|}$    (6)

where $d^{\,n}_i$ is the normal vector angle constraint of the i-th sub-cluster center of the current frame;
10) Iteratively optimize the constraints established in steps 7)-9) by a nonlinear optimization method, updating the current $\Psi$ and the current $T_k$ and realizing the localization of the lidar for the current frame.

If the iteration has not converged, return to step 6) and use the updated current $T_k$ to update the $T_{(k,i)}$; if the iteration has converged, go to step 11);
11) Using the current $\Psi$ and current $T_k$ updated in step 10), transform the k-th frame environmental local point cloud into the world coordinate system, and perform iterative closest point ICP optimization between the k-th frame local non-ground cluster projected into the world coordinate system and the current global map, to further optimize the current $\Psi$ and current $T_k$ and update the current global map. The specific steps are:

11-1) Project each point $p_i$ of the k-th frame local non-ground cluster $NG_k$, according to the current $T_{(k,i)}$ updated in step 10) and corresponding to that point, into the world coordinate system; denote the projected point $p_i^W$. After projection, all the projected points form the k-th frame global non-ground point cluster, denoted $NG_k^W$, with the expression:

$NG_k^W = \left\{ p_i^W \mid p_i^W = T_{(k,i)}\, p_i, \ p_i \in NG_k \right\}$

11-2) Perform the ICP operation between the non-ground points in the current global map M and $NG_k^W$ to further optimize the current $T_k$, obtaining the final lidar pose state quantity $\Psi$ and the final pose matrix $T_k$ corresponding to the current frame;

11-3) Project each point of the k-th frame local non-ground cluster $NG_k$, according to the final $T_{(k,i)}$ of step 11-2) corresponding to that point, into the world coordinate system to update $NG_k^W$; perform voxel-filter-based point fusion between the updated $NG_k^W$ and the non-ground points in the current global map M;

11-4) Project each point of the k-th frame local ground cluster $G_k$, according to the final $T_{(k,i)}$ of step 11-2) corresponding to that point, into the world coordinate system; denote the projected points $g_i^W$. After projection, all the projected points $g_i^W$ form the current frame global ground point cluster, denoted $G_k^W$; perform voxel-filter-based point fusion between $G_k^W$ and the ground points in the current global map M, completing the update of the current global map M;
12) When the lidar acquires a new frame of environmental local point cloud, let $k = k + 1$, take the final $\Psi$ and final $T_k$ obtained in step 11-2) as the new current $\Psi$ and current $T_k$, respectively, and then return to step 4).
CN202010638139.4A — filed 2020-07-06, priority 2020-07-06 — Laser radar instant positioning and mapping method based on clustering center characteristics — Active — CN111932614B (en)

Priority Applications (1)

CN202010638139.4A — priority date 2020-07-06, filing date 2020-07-06 — Laser radar instant positioning and mapping method based on clustering center characteristics

Publications (2)

CN111932614A — published 2020-11-13
CN111932614B — granted 2022-10-14

Family: ID=73312482 — one family application (CN202010638139.4A, Active, country: CN)




Legal Events

PB01 — Publication
SE01 — Entry into force of request for substantive examination
GR01 — Patent grant