CN116385292A - IMU-assisted LiDAR dynamic point cloud eliminating method

Info

Publication number: CN116385292A
Application number: CN202310297076.4A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Inventors: Sui Xin (隋心), Gao Jiaxin (高佳鑫), Wang Changqiang (王长强), Xu Aigong (徐爱功), Chen Zhijian (陈志键), Shi Zhengxu (史政旭), Cui Yixin (崔艺馨), Zhang Shuhao (张书豪)
Current and original assignee: Liaoning Technical University
Application filed by Liaoning Technical University; priority to CN202310297076.4A; publication as CN116385292A


Classifications

    • G06T 5/00 Image enhancement or restoration
    • G06N 7/02 Computing arrangements based on specific mathematical models using fuzzy logic
    • G06V 10/26 Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V 10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using clustering, e.g. of similar faces in social networks
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention provides an IMU-assisted LiDAR dynamic point cloud eliminating method comprising the following steps: first, LiDAR and IMU data are collected, time-synchronized, and the relative spatial transformation parameters between the two sensor coordinate systems are calibrated; second, preliminary point cloud processing is performed with the aid of IMU information, and the point cloud is divided into clusters by a ground point cloud extraction and point cloud cluster segmentation method; third, pairing relations between the point cloud clusters of adjacent frames are established and the feature data of the paired clusters are calculated; fourth, a multi-level fuzzy comprehensive evaluation model of the point cloud motion state is constructed, and the feature data of the paired clusters are input to evaluate the motion state of each cluster; fifth, based on the model evaluation results, the dynamic point cloud clusters are removed from the original point cloud data. The method takes the LiDAR point cloud and the IMU measurements as input, uses the IMU information to remove the point cloud motion distortion and unify the point cloud coordinate systems, and, based on the clustering result, judges the motion state of each point cloud cluster with a multi-level fuzzy comprehensive evaluation model, thereby eliminating the dynamic point cloud.

Description

IMU-assisted LiDAR dynamic point cloud eliminating method
Technical Field
The invention belongs to the technical field of LiDAR-based simultaneous localization and mapping (Light Detection and Ranging Simultaneous Localization and Mapping, LiDAR SLAM), and particularly relates to an IMU (Inertial Measurement Unit)-assisted LiDAR dynamic point cloud eliminating method.
Background
With the rapid development of China's economy and technology, infrastructure such as large shopping malls, airports and hospitals has gradually become an important part of daily life, and the demand for indoor location-based services keeps growing. SLAM (Simultaneous Localization and Mapping) is one of the core technologies in the current indoor positioning and navigation field: it acquires, in real time, the pose of the carrier equipped with the SLAM sensors together with a map of the environment. Taking LiDAR SLAM as an example, the technology offers high reliability, intuitive mapping, and insensitivity to illumination conditions; its core modules are front-end matching, back-end optimization, and mapping. The effects of back-end optimization and mapping depend on the inter-frame registration accuracy of the front-end point clouds, which in turn depends on the quality of the point cloud data. Improving the data quality therefore fundamentally improves both the positioning accuracy and the mapping effect of LiDAR SLAM.
At present, identifying and eliminating the points generated by scanning dynamic targets from the original point cloud data is one of the effective ways to improve data quality. For the dynamic point cloud elimination problem, Wei et al. extract vehicles from the point cloud with an adaptive 3D segmentation technique and estimate their motion state and speed from the motion artifact effect to detect dynamic vehicles; the model can only identify dynamic vehicles and is mainly suitable for sparse point cloud datasets acquired by airborne LiDAR. Lim et al. propose a dynamic target filtering method based on differences in block occupancy: the prior point cloud map is divided into annular blocks and, after point cloud data is input, the blocks containing dynamic targets are screened by computing an occupancy descriptor for each block, so that the dynamic points are removed; the method needs a large data storage space and requires prior map information, which limits its applicability. Zhang Tao et al. propose a LiDAR/IMU-fused odometer for dynamic environments that unifies the coordinate systems of adjacent point cloud frames with IMU information and screens dynamic targets by computing the centroid offset between matched clusters of adjacent frames; experiments show that its screening accuracy is low in complex environments. Li et al. propose a fully convolutional neural network that takes the laser point cloud as the input sample and, combined with data-augmented training, detects dynamic targets. Li Xiang et al. extract features from the data with a Fourier single-pixel undersampling imaging method, so that a deep convolutional network can detect and identify dynamic targets without acquiring all the target information. Although these two methods detect dynamic targets well, the network training requires manually labeled samples and the corresponding hardware cost is high. To balance the efficiency and the accuracy of dynamic point cloud elimination, this patent proposes an IMU-assisted multi-level fuzzy comprehensive evaluation method for dynamic point cloud elimination. Compared with the methods above, it needs neither complicated model training nor prior information, saves data storage space, suits complex working environments, and effectively guarantees both the efficiency and the accuracy of dynamic point cloud rejection.
The Fuzzy Comprehensive Evaluation Method is a comprehensive evaluation method based on fuzzy mathematics: it converts qualitative evaluation into quantitative evaluation according to the membership theory of fuzzy mathematics, i.e., fuzzy mathematics is used to make an overall evaluation of things or objects constrained by multiple factors. The method gives clear results and is strongly systematic; it handles fuzzy, hard-to-quantify problems well and suits all kinds of non-deterministic problems. It is now widely applied in fields such as product rating, enterprise evaluation, resource allocation and equipment control.
The Analytic Hierarchy Process (AHP) is a decision-making method in which the evaluation elements related to a decision are decomposed into levels such as goals, criteria and alternatives, on which qualitative and quantitative analyses are then performed. It organizes a complex decision system into layers and, by comparing the importance of the evaluation factors level by level, provides a quantitative basis for analysis and the final decision; it is commonly used to construct the evaluation weight matrices of the evaluation factors in the fuzzy comprehensive evaluation process.
The Multi-level Fuzzy Comprehensive Evaluation Method organically combines the analytic hierarchy process and the fuzzy comprehensive evaluation method, which ensures the systematicness and rationality of the model, makes full use of objective data patterns and of the experience and judgment of experts or decision makers, and provides a scientific and reasonable solution to nonlinear, fuzzy, multivariable evaluation problems. In the invention, the method performs a multi-level fuzzy comprehensive evaluation of the motion state of each point cloud cluster from three aspects of the paired clusters: centroid offset, triaxial span change, and point number and point cloud density change. The motion state of each cluster is thereby judged, the dynamic point cloud clusters are finally removed from the original point cloud data, and the positioning accuracy and mapping effect of LiDAR SLAM are improved.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides an IMU-assisted LiDAR dynamic point cloud eliminating method. IMU information assists in removing the motion distortion of the point cloud data and provides the position and attitude parameters used to unify the coordinate systems of two adjacent frames of point cloud data, which facilitates constructing pairing combinations between the point cloud clusters of adjacent frames. For the pairing results, a multi-level fuzzy comprehensive evaluation method comprehensively evaluates the motion state of each point cloud cluster from several feature aspects, and the dynamic point cloud is finally removed from the original LiDAR point cloud data.
An IMU-assisted LiDAR dynamic point cloud rejection method comprises the following steps:
step 1, acquiring LiDAR and IMU sensor data, calibrating six-degree-of-freedom rotation and translation parameters between two sensor coordinate systems, and simultaneously realizing time synchronization between the two sensor data by using a time synchronization module;
step 2, after each frame of LiDAR point cloud data is acquired, removing the LiDAR point cloud motion distortion with the aid of the IMU pre-integration information and the LiDAR/IMU extrinsic parameters obtained in step 1, and then further downsampling and noise-filtering the whole point cloud data. Based on the preliminarily processed LiDAR point cloud data, the point clouds generated by scanning different targets in the environment are segmented into corresponding independent point cloud clusters by a rapid ground point cloud extraction and point cloud cluster segmentation method;
Step 3, unifying coordinate systems of two adjacent frames of point clouds by using LiDAR/IMU external parameters and IMU information, establishing a pairing relation between the point clouds of the adjacent frames based on the point cloud cluster segmentation result obtained in the step 2, and calculating corresponding pairing point cloud cluster characteristic data;
step 4, carrying out multi-level fuzzy comprehensive evaluation on the point cloud cluster motion state from three aspects of centroid offset, triaxial span change, point number and point cloud density change by constructing a multi-level fuzzy comprehensive evaluation model of the point cloud motion state based on the pairing relation between the point cloud clusters of the adjacent frames obtained in the step 3;
and 5, judging the motion state of each point cloud cluster based on the evaluation result in the step 4, and finally removing the dynamic point cloud cluster from the original point cloud data, thereby improving the data quality of the original LiDAR point cloud and further improving the positioning accuracy and the mapping effect of the LiDAR SLAM.
In the step 2, the point clouds generated by different target scans in the environment are segmented into corresponding independent point cloud clusters by a rapid ground point cloud extraction and point cloud cluster segmentation method;
the method comprises the following specific steps:
step 2-1, establishing index numbers for each frame of point cloud according to the arrangement order and geometric relation of the corresponding scan lines, so that points on the same scan line share a uniform row number and points in the same column share a uniform column number;
step 2-2, dividing each frame of point cloud into several radial small regions according to the column numbers, based on the point cloud indices obtained in step 2-1, and, in each small region, sequentially calculating in ascending row-number order the included angle rad_(k,k+1) between the line connecting two adjacent laser points k and k+1 and the X-Y plane, as a reference for measuring the fluctuation of the point cloud:

rad_(k,k+1) = arctan( | z_(k+1) − z_k | / d_(k,k+1) ),   d_(k,k+1) = √( (x_(k+1) − x_k)² + (y_(k+1) − y_k)² )   (1)

wherein (x_k, y_k, z_k) represents the three-dimensional coordinates of a point in the Cartesian coordinate system, and d_(k,k+1) is the projected length of the line between the two points on the X-Y plane;
step 2-3, comparing each rad_(k,k+1) calculated in step 2-2 with 0.05: if rad_(k,k+1) is not greater than 0.05, the starting point of the line is marked as a ground point. The first time the angle exceeds 0.05 in a small region, the subsequent calculation of that region is stopped, the starting point of the line is marked as a ground point, and the ending point of the line is marked as a first non-ground point. The remaining points of the region that did not take part in the calculation are also marked as non-ground points. After every small region has finished this calculation, all laser points of the current frame marked as ground points are filtered out;
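For illustration, the angle criterion of steps 2-2 and 2-3 can be sketched as follows. This is a minimal sketch, assuming each radial small region is already available as an array of points ordered by ascending row number; the function and variable names are ours, not the patent's:

```python
import numpy as np

def filter_ground_region(region: np.ndarray, angle_thresh: float = 0.05):
    """Mark ground points in one radial small region (points ordered by row number).

    region: (N, 3) array of Cartesian coordinates.
    Returns boolean masks (is_ground, is_first_nonground), one entry per point.
    """
    n = len(region)
    is_ground = np.zeros(n, dtype=bool)
    is_first_nonground = np.zeros(n, dtype=bool)
    for k in range(n - 1):
        dx, dy, dz = region[k + 1] - region[k]
        d = np.hypot(dx, dy)              # projection of the segment on the X-Y plane
        rad = np.arctan2(abs(dz), d)      # angle between the segment and the X-Y plane
        if rad <= angle_thresh:           # flat enough (0.05 rad is about 2.9 degrees)
            is_ground[k] = True
        else:                             # first fluctuation: stop scanning this region
            is_ground[k] = True
            is_first_nonground[k + 1] = True
            break
    # points never reached by the scan stay marked as non-ground
    return is_ground, is_first_nonground
```

The first non-ground points collected here later seed the region-growing clustering of step 2-4.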
step 2-4, projecting the LiDAR point cloud data processed in the preceding steps onto the X-Y two-dimensional plane and performing a primary clustering of the whole point cloud with a region-growing algorithm based on the two-dimensional Euclidean distance, where the seed point set of the region-growing process consists of all laser points marked as first non-ground points;
step 2-5, the primary clustering of step 2-4 is fast, but two situations can occur: different targets with an obvious difference in the Z direction are divided into one point cloud cluster, or different targets are connected into one point cloud cluster by discrete points. Since the probability of these situations is low, it would be inefficient to re-cluster every primary clustering result, so a criterion is defined for deciding whether a point cloud cluster needs secondary clustering:
DX_n = x_n^max − x_n^min,   DY_n = y_n^max − y_n^min,   DZ_n = z_n^max − z_n^min   (2)

PD_n = PN_n / PV_n = PN_n / ( DX_n · DY_n · DZ_n )   (3)

wherein DX_n, DY_n, DZ_n are the spans of the n-th point cloud cluster of a frame in the X, Y, Z directions; x_n^max, y_n^max, z_n^max and x_n^min, y_n^min, z_n^min are the maximum and minimum X, Y, Z coordinates of the points of the current cluster; PN_n is the number of points in the current cluster; PV_n is the volume of its minimum bounding hexahedron; and PD_n is its point cloud density. If the triaxial span is too large while the number of points is too small, the point distribution of the cluster is not dense enough and an anomaly may exist, so point cloud clusters with a point cloud density smaller than 100 are judged to be abnormal clusters requiring secondary subdivision;
step 2-6, enumerating the abnormal point cloud clusters screened out in step 2-5 and calculating the three-dimensional Euclidean distance from each laser point in a cluster to the other laser points of the same cluster; laser points closer than 0.05 m are regarded as neighboring points. Laser points with fewer than 3 neighboring points are taken as the discrete points connecting different clusters and are removed; after all discrete points have been removed, the abnormal clusters are subjected to a secondary clustering segmentation with a region-growing algorithm based on the three-dimensional Euclidean distance;
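A minimal sketch of the discrete-point removal of step 2-6, under the thresholds stated above (0.05 m radius, fewer than 3 neighbors); the brute-force pairwise distance is for illustration only, and a KD-tree would serve the same purpose in practice:

```python
import numpy as np

def remove_discrete_points(cluster: np.ndarray,
                           radius: float = 0.05,
                           min_neighbors: int = 3) -> np.ndarray:
    """Drop laser points that have fewer than `min_neighbors` points
    within `radius` meters inside the same cluster."""
    diff = cluster[:, None, :] - cluster[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)               # pairwise 3-D Euclidean distances
    neighbor_count = (dist < radius).sum(axis=1) - 1   # exclude the point itself
    return cluster[neighbor_count >= min_neighbors]
```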
step 2-7, after all abnormal point cloud clusters have been subdivided, counting the current number of laser points of each point cloud cluster, calculating the point cloud density, and calculating the centroid coordinates of each cluster:

( x_n, y_n, z_n ) = ( 1 / PN_n ) · Σ_{m=1}^{PN_n} ( x_n^m, y_n^m, z_n^m )   (4)

wherein (x_n, y_n, z_n) represents the three-dimensional centroid coordinates of the n-th point cloud cluster in the current frame of point cloud data, and (x_n^m, y_n^m, z_n^m) represents the three-dimensional coordinates of the m-th laser point in the n-th cluster.
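The per-cluster feature data of equations (2)-(4) could be computed as below; the dictionary keys are illustrative names, not the patent's:

```python
import numpy as np

def cluster_features(cluster: np.ndarray) -> dict:
    """Triaxial span, point count, density and centroid of one cluster."""
    span = cluster.max(axis=0) - cluster.min(axis=0)    # DX, DY, DZ, equation (2)
    pn = len(cluster)                                   # PN: number of points
    pv = float(np.prod(span))                           # PV: bounding-hexahedron volume
    pd = pn / pv if pv > 0 else float("inf")            # PD = PN / (DX*DY*DZ), equation (3)
    centroid = cluster.mean(axis=0)                     # equation (4)
    return {"span": span, "pn": pn, "pd": pd, "centroid": centroid,
            "is_abnormal": pd < 100}                    # density criterion of step 2-5
```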
Step 3, unifying the coordinate systems of two adjacent frames of point clouds with the LiDAR/IMU extrinsic parameters and IMU information, establishing pairing relations between the point cloud clusters of adjacent frames based on the point cloud cluster segmentation result obtained in step 2, and calculating the corresponding paired point cloud cluster feature data;
the method comprises the following specific steps:
step 3-1, taking the IMU coordinate system of the first frame as the global coordinate system, denoted the W system. Based on the rotation parameter R_L^I and translation parameter t_L^I from the LiDAR/IMU extrinsic calibration result, and combining the IMU pre-integration result (R_(I_i)^W, t_(I_i)^W), each frame of LiDAR point cloud is converted into the global coordinate system. Taking the i-th frame point cloud P_i^L as an example, the converted point cloud is denoted P_i^W:

P_i^W = R_(I_i)^W · ( R_L^I · P_i^L + t_L^I ) + t_(I_i)^W   (5)
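A sketch of the coordinate unification of equation (5), assuming the rotation/translation pairs are given as 3x3 matrices and 3-vectors; the chaining order (LiDAR frame to IMU body frame to W system) follows our reading of the reconstructed equation:

```python
import numpy as np

def lidar_to_global(points_L: np.ndarray,
                    R_L_I: np.ndarray, t_L_I: np.ndarray,
                    R_I_W: np.ndarray, t_I_W: np.ndarray) -> np.ndarray:
    """Chain the LiDAR->IMU extrinsics with the IMU pre-integrated pose
    to express an (N, 3) point cloud frame in the global (W) system."""
    points_I = points_L @ R_L_I.T + t_L_I    # LiDAR frame -> IMU body frame
    return points_I @ R_I_W.T + t_I_W        # IMU body frame -> first-frame (W) system
```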
step 3-2, under the W system, a search area is established in the adjacent frame P_(i−1)^W, centered on the centroid of the n-th point cloud cluster C_i^n of P_i^W with 3 m as radius. Taking a point cloud cluster C_(i−1)^m of P_(i−1)^W as an example of a search result, the feature data of the paired point cloud clusters is calculated:

a_1 = ‖ c_i^n − c_(i−1)^m ‖,
a_2 = | DX_i^n − DX_(i−1)^m |,   a_3 = | DY_i^n − DY_(i−1)^m |,   a_4 = | DZ_i^n − DZ_(i−1)^m |,
a_5 = | PN_i^n − PN_(i−1)^m |,   a_6 = | PD_i^n − PD_(i−1)^m |   (6)

wherein c_i^n and c_(i−1)^m are the centroid three-dimensional coordinates of the point cloud clusters C_i^n and C_(i−1)^m respectively, PN_i^n and PN_(i−1)^m are the numbers of laser points in the corresponding clusters, PD_i^n and PD_(i−1)^m are the point cloud densities of the corresponding clusters, and (DX_i^n, DY_i^n, DZ_i^n) and (DX_(i−1)^m, DY_(i−1)^m, DZ_(i−1)^m) are the triaxial spans of the corresponding clusters;

step 3-3, all search results in P_(i−1)^W that can establish a pairing relation with C_i^n form pairing combinations, which serve as the input of the multi-level fuzzy comprehensive evaluation model of the point cloud motion state.
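Steps 3-2 and 3-3 can be sketched as follows, reusing the cluster_features() dictionaries from the sketch above; the 3 m search radius follows the text, and the a1..a6 ordering follows equation (6):

```python
import numpy as np

def pair_clusters(feats_i: list, feats_prev: list, radius: float = 3.0):
    """Pairing combinations between adjacent frames.

    feats_*: per-cluster feature dicts as produced by cluster_features().
    Returns, per cluster of frame i, the feature-difference tuples
    (a1..a6) of every cluster of the adjacent frame whose centroid lies
    within `radius` meters of this cluster's centroid.
    """
    pairings = []
    for f in feats_i:
        candidates = []
        for g in feats_prev:
            a1 = np.linalg.norm(f["centroid"] - g["centroid"])  # centroid offset
            if a1 > radius:
                continue                                        # outside the search area
            a2, a3, a4 = np.abs(f["span"] - g["span"])          # triaxial span changes
            a5 = abs(f["pn"] - g["pn"])                         # point-number change
            a6 = abs(f["pd"] - g["pd"])                         # density change
            candidates.append((a1, a2, a3, a4, a5, a6))
        pairings.append(candidates)
    return pairings
```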
Step 4, performing multi-level fuzzy comprehensive evaluation on the point cloud cluster motion state from three aspects of centroid offset, triaxial span change, point number and point cloud density change by constructing a multi-level fuzzy comprehensive evaluation model of the point cloud motion state;
the method comprises the following specific steps:
step 4-1, taking the centroid offset, triaxial span change, and point number and point cloud density change between paired point cloud clusters as the criteria for judging their motion state, jointly forming the evaluation factor set of the model, and further constructing the factor set U, the evaluation set V and the fuzzy evaluation matrix R:
U = { u_1, u_2, u_3 },  u_2 = { u_21, u_22, u_23 },  u_3 = { u_31, u_32 },
V = { v_1, v_2, v_3 },
R = ( μ(u, v) )   (7)

wherein u_1, u_2, u_3 in U are the first-level evaluation factors, corresponding in turn to the three factors of the evaluation factor set; u_21, u_22, u_23 correspond to the X, Y, Z triaxial span changes between the paired point cloud clusters, and u_31, u_32 correspond to the point-number change and the point cloud density change between the paired clusters; these double-subscript variables are the second-level evaluation factors. v_1, v_2, v_3 in V correspond to the three motion-state evaluation results: static, pending and dynamic. R represents the fuzzy relation between the factor set U and the evaluation set V, where μ(u, v) is the corresponding membership function;
step 4-2, constructing the evaluation matrices R_2 and R_3 corresponding to the second-level evaluation factors:

R_2 = ( μ(u_2j, v_k) ), j = 1, 2, 3;   R_3 = ( μ(u_3j, v_k) ), j = 1, 2;   k = 1, 2, 3   (8)

The corresponding feature data of the point cloud cluster whose motion state is to be determined is acquired: the number of points is denoted PN, the point cloud density PD, and the triaxial spans DX, DY and DZ in turn. The corresponding feature data of the paired point cloud clusters is then calculated: the centroid offset is denoted a_1, the point-number change a_5, the point cloud density change a_6, and the triaxial span changes a_2, a_3, a_4 in turn. According to the characteristics of the feature data, trapezoidal-distribution membership functions are adopted; taking the first row of R_2 as an example, the corresponding membership function model is constructed:

μ(u_21, v_k), k = 1, 2, 3: trapezoidal-distribution membership functions comparing the span change a_2 against the span DX   (9)

The other row elements of R_2 and R_3 are constructed with the same membership-function form, substituting the corresponding point cloud cluster feature data for a_2 and DX in the formula;
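For illustration, a generic descending trapezoidal membership of the kind equation (9) describes; the breakpoints below (fractions of the span DX) are assumptions, since the patent's actual values are given only in the original formula image:

```python
def trapezoid_membership(x: float, flat_end: float, zero_from: float) -> float:
    """Descending trapezoidal membership: 1 up to `flat_end`,
    linear decay, 0 from `zero_from` onward."""
    if x <= flat_end:
        return 1.0
    if x >= zero_from:
        return 0.0
    return (zero_from - x) / (zero_from - flat_end)

# e.g. membership of the X-span change a2 in the "static" grade, with
# assumed (hypothetical) breakpoints at 10% and 30% of the span DX:
# mu_static = trapezoid_membership(a2, 0.1 * DX, 0.3 * DX)
```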
step 4-3, calculating the evaluation weight matrices corresponding to the second-level evaluation factors with the analytic hierarchy process. Taking the triaxial span change factor set as an example, a corresponding judgment matrix C_2 is constructed:

C_2 = ( c_ij ), i, j = 1, …, e: the pairwise importance-comparison judgment matrix of the triaxial span change factors   (10)

Based on the elements of C_2, the corresponding weights are calculated with the root method and the results are normalized:

d'_2i = ( ∏_{j=1}^{e} c_ij )^(1/e)   (11)

d_2i = d'_2i / Σ_{i=1}^{e} d'_2i   (12)

wherein e represents the number of evaluation factors in the current factor set. For the triaxial span change factor set, e = 3, and the corresponding weight matrix D_2 is constructed:

D_2 = [ d_21  d_22  d_23 ] = [ 0.4  0.4  0.2 ]   (13)

The weight matrix D_3 corresponding to the point number and point cloud density change factor set is calculated in the same way:

D_3 = [ d_31  d_32 ] = [ 0.67  0.33 ]   (14)

The weight matrix calculation results are checked with the AHP consistency test, and the qualified results are used in the higher-level fuzzy comprehensive evaluation;
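The root method of equations (11)-(12) is easy to verify against the published weights. The judgment matrix C2 below is our reconstruction (the perfectly consistent matrix implied by D2 = [0.4 0.4 0.2]), not quoted from the patent:

```python
import numpy as np

def ahp_root_weights(C: np.ndarray) -> np.ndarray:
    """Root (geometric-mean) method of the AHP: the weight of each factor
    is the e-th root of its row product, normalized to sum to 1."""
    e = C.shape[0]
    w = np.prod(C, axis=1) ** (1.0 / e)
    return w / w.sum()

# Hypothetical reconstruction of the triaxial span change judgment matrix:
C2 = np.array([[1.0, 1.0, 2.0],
               [1.0, 1.0, 2.0],
               [0.5, 0.5, 1.0]])
print(ahp_root_weights(C2))   # -> [0.4 0.4 0.2], matching equation (13)
```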
step 4-4, applying D_2 and D_3 with the product-summation operator to R_2 and R_3 respectively to obtain the last two rows of elements of R:

R = [ μ(u_1, v_1)  μ(u_1, v_2)  μ(u_1, v_3)
      D_2 · R_2
      D_3 · R_3 ]   (15)

The membership calculation functions corresponding to the first-row elements of R are constructed:

μ(u_1, v_k), k = 1, 2, 3: trapezoidal membership functions of the centroid offset a_1, with grade thresholds determined by v·Δt and VAL   (16)

wherein v is the minimum motion rate of an absolutely dynamic target that the model can detect and can be adjusted according to the detection requirements; Δt is the time interval between the acquisition of two adjacent frames of LiDAR point cloud data and is likewise adjustable; VAL is an adaptive model parameter:

VAL = λ_1 · DIS + λ_2 · MS   (17)

wherein DIS is the three-dimensional Euclidean distance from the centroid of the current point cloud cluster to the origin, MS is the maximum of the current cluster's spans in the X, Y, Z directions, and λ_1 and λ_2 take values determined by the extent of the environment and the complexity of its features: the larger the environment and the more complex the features, the larger the values;
step 4-5, calculating the weight matrix D corresponding to the first-level evaluation factors with the analytic hierarchy process and applying it to R to calculate the fuzzy comprehensive evaluation set B:

D = [ d_1  d_2  d_3 ] = [ 0.5  0.25  0.25 ]   (18)

B = D · R = [ b_1  b_2  b_3 ]   (19)

The elements of B are compared according to the maximum-membership rule: if b_1 is the largest, the evaluation result output by the model is static; if b_3 is the largest, the output result is dynamic; if b_2 is the largest, the output result is pending, and b_2 is output at the same time as the corresponding evaluation score.
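The synthesis of equations (18)-(19) and the maximum-membership rule of step 4-5, as a minimal sketch; the verdict strings are illustrative:

```python
import numpy as np

def evaluate(R: np.ndarray, D=np.array([0.5, 0.25, 0.25])):
    """Weighted product-sum synthesis B = D.R and the maximum-membership rule.

    R is the 3x3 fuzzy evaluation matrix whose rows correspond to u1
    (centroid offset), u2 (span change) and u3 (count/density change),
    and whose columns correspond to static / pending / dynamic."""
    B = D @ R                                         # equation (19)
    verdict = ["static", "pending", "dynamic"][int(np.argmax(B))]
    return B, verdict
```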
Step 5, judging the motion state of each point cloud cluster based on the motion-state evaluation results, and finally removing the dynamic point cloud clusters from the original point cloud data, thereby improving the data quality of the LiDAR point cloud;

the specific steps are as follows:

step 5-1, inputting the pairing combinations corresponding to the point cloud clusters in turn into the multi-level fuzzy comprehensive evaluation model of the point cloud motion state. Taking the n-th point cloud cluster C_i^n of the i-th frame point cloud P_i^W as an example: if a static evaluation result appears, C_i^n is marked as a static point cloud cluster and the remaining calculations related to C_i^n are stopped; if all evaluation results are dynamic, C_i^n is marked as a dynamic point cloud cluster; if there is no search result, C_i^n is marked as a missing point cloud cluster and regarded as a special dynamic point cloud cluster. Apart from these cases, if the evaluation results include pending, the maximum evaluation score is taken as the final motion-state evaluation score of C_i^n: if this score is greater than 0.5, C_i^n is marked as a static point cloud cluster, otherwise it is marked as a dynamic point cloud cluster;

step 5-2, enumerating the point cloud clusters of P_i^W and repeating the above steps until the motion states of all clusters have been judged. The point cloud clusters marked as static are converted back into the corresponding laser radar coordinate system and published as a new frame of point cloud data, so that the corresponding LiDAR point cloud data no longer contains points generated by dynamic target scanning. For the first frame of point cloud, the clusters that are paired with static clusters of the second frame and whose evaluation result is static are published.
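The per-cluster decision logic of step 5-1 can be condensed as below, assuming each pairing evaluation returns a (verdict, score) tuple where the score is the b2 value that accompanies a pending verdict:

```python
def cluster_state(verdicts: list) -> str:
    """Aggregate the per-pairing model outputs for one point cloud cluster.

    verdicts: list of (verdict, score) tuples from evaluate() above.
    The 0.5 threshold follows the text of step 5-1."""
    if not verdicts:
        return "dynamic"                         # no search result: treated as dynamic
    if any(v == "static" for v, _ in verdicts):
        return "static"                          # one static evaluation suffices
    if all(v == "dynamic" for v, _ in verdicts):
        return "dynamic"
    best = max(s for v, s in verdicts if v == "pending")
    return "static" if best > 0.5 else "dynamic"
```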
The invention has the advantages that:
1. Through IMU assistance, the invention removes LiDAR dynamic point cloud mainly by a multi-level fuzzy comprehensive evaluation method. Instance verification shows that 98.67% of the dynamic points in the original point cloud data are effectively identified and removed while 2.01% of the static points are removed by mistake; since the static point base is large, this partial error can be neglected.
2. IMU information is used to assist in removing the LiDAR point cloud motion distortion, which preliminarily improves the data quality of the original point cloud. At the same time, the IMU information provides the relative position and attitude change parameters between point cloud data frames, so that the coordinate systems of two frames of point cloud data can be unified with high precision even before the dynamic point cloud is removed.
3. The invention treats the motion-state judgment of the point cloud clusters as a multi-level fuzzy comprehensive evaluation problem: besides the centroid offset between paired point cloud clusters, the corresponding triaxial span change, point-number change and point cloud density change are also taken as judgment criteria, jointly forming the evaluation factor set of the model. This realizes robust identification and elimination of dynamic point cloud clusters and further improves the data quality of the LiDAR point cloud.
4. Compared with the LOAM algorithm output based on the original point cloud data, the positioning root mean square error is reduced by 66.06% and the maximum error by 72.78%; the positioning accuracy reaches the centimeter level, and the point cloud mapping effect is optimized.
Drawings
FIG. 1 is a flow chart of an IMU-assisted LiDAR dynamic point cloud removal method;
FIG. 2 is a specific flow chart of step 2 according to one embodiment of the present invention;
FIG. 3 is a specific flow chart of step 3 according to one embodiment of the present invention;
FIG. 4 is a flowchart showing step 4 according to one embodiment of the present invention;
FIG. 5 is a flowchart showing step 5 according to one embodiment of the present invention;
FIG. 6 is a summary flow chart of one embodiment of the present invention;
FIG. 7 is a time series chart of point cloud rejection rate in the dynamic point cloud rejection process using the present invention;
FIG. 8 is a comparison of the two-dimensional X-Y positioning tracks obtained by inputting, respectively, the point cloud data processed by the method and the original point cloud data into LOAM for positioning;
FIG. 9 is a quantitative comparison of the X-axis and Y-axis positioning errors obtained by inputting, respectively, the point cloud data processed by the method and the original point cloud data into LOAM for positioning;
FIG. 10 is a comparison of the point cloud map results obtained by inputting, respectively, the point cloud data processed by the method and the original point cloud data into LOAM for mapping.
Detailed Description
An embodiment of the present invention will be further described below with reference to the accompanying drawings.
In the embodiment of the invention, the IMU-assisted LiDAR dynamic point cloud eliminating method, as shown in FIG. 1, comprises the following steps:
step 1, acquiring LiDAR and IMU sensor data, calibrating six-degree-of-freedom rotation and translation parameters between two sensor coordinate systems, and simultaneously realizing time synchronization between the two sensor data by using a time synchronization module;
step 2, after each frame of LiDAR point cloud data is acquired, removing the LiDAR point cloud motion distortion with the aid of the IMU pre-integration information and the LiDAR/IMU extrinsic parameters obtained in step 1, and then further downsampling and noise-filtering the whole point cloud data. Based on the preliminarily processed LiDAR point cloud data, the point clouds generated by scanning different targets in the environment are segmented into corresponding independent point cloud clusters by a rapid ground point cloud extraction and point cloud cluster segmentation method;
step 2-1, establishing index numbers for each frame of point cloud according to the arrangement order and geometric relation of the corresponding scan lines, so that points on the same scan line share a uniform row number and points in the same column share a uniform column number;
step 2-2, dividing each frame of point cloud into several radial small regions according to the column numbers, based on the point cloud indices obtained in step 2-1, and, in each small region, sequentially calculating in ascending row-number order the included angle rad_(k,k+1) between the line connecting two adjacent laser points k and k+1 and the X-Y plane, as a reference for measuring the fluctuation of the point cloud:

rad_(k,k+1) = arctan( | z_(k+1) − z_k | / d_(k,k+1) ),   d_(k,k+1) = √( (x_(k+1) − x_k)² + (y_(k+1) − y_k)² )   (1)

wherein (x_k, y_k, z_k) represents the three-dimensional coordinates of a point in the Cartesian coordinate system, and d_(k,k+1) is the projected length of the line between the two points on the X-Y plane;
step 2-3, comparing each rad_(k,k+1) calculated in step 2-2 with 0.05: if rad_(k,k+1) is not greater than 0.05, the starting point of the line is marked as a ground point. The first time the angle exceeds 0.05 in a small region, the subsequent calculation of that region is stopped, the starting point of the line is marked as a ground point, and the ending point of the line is marked as a first non-ground point. The remaining points of the region that did not take part in the calculation are also marked as non-ground points. After every small region has finished this calculation, all laser points of the current frame marked as ground points are filtered out;
step 2-4, projecting the LiDAR point cloud data processed in the preceding steps onto the X-Y two-dimensional plane and performing a primary clustering of the whole point cloud with a region-growing algorithm based on the two-dimensional Euclidean distance, where the seed point set of the region-growing process consists of all laser points marked as first non-ground points;
step 2-5, the primary clustering of step 2-4 is fast, but two situations can occur: different targets with an obvious difference in the Z direction are divided into one point cloud cluster, or different targets are connected into one point cloud cluster by discrete points. Since the probability of these situations is low, it would be inefficient to re-cluster every primary clustering result, so a criterion is defined for deciding whether a point cloud cluster needs secondary clustering:
DX_n = x_n^max − x_n^min,   DY_n = y_n^max − y_n^min,   DZ_n = z_n^max − z_n^min   (2)

PD_n = PN_n / PV_n = PN_n / ( DX_n · DY_n · DZ_n )   (3)

wherein DX_n, DY_n, DZ_n are the spans of the n-th point cloud cluster of a frame in the X, Y, Z directions; x_n^max, y_n^max, z_n^max and x_n^min, y_n^min, z_n^min are the maximum and minimum X, Y, Z coordinates of the points of the current cluster; PN_n is the number of points in the current cluster; PV_n is the volume of its minimum bounding hexahedron; and PD_n is its point cloud density. If the triaxial span is too large while the number of points is too small, the point distribution of the cluster is not dense enough and an anomaly may exist, so point cloud clusters with a point cloud density smaller than 100 are judged to be abnormal clusters requiring secondary subdivision;
step 2-6, enumerating the abnormal point cloud clusters screened out in step 2-5 and calculating the three-dimensional Euclidean distance from each laser point in a cluster to the other laser points of the same cluster; laser points closer than 0.05 m are regarded as neighboring points. Laser points with fewer than 3 neighboring points are taken as the discrete points connecting different clusters and are removed; after all discrete points have been removed, the abnormal clusters are subjected to a secondary clustering segmentation with a region-growing algorithm based on the three-dimensional Euclidean distance;
step 2-7, after all abnormal point cloud clusters have been subdivided, counting the current number of laser points of each point cloud cluster, calculating the point cloud density, and calculating the centroid coordinates of each cluster:

( x_n, y_n, z_n ) = ( 1 / PN_n ) · Σ_{m=1}^{PN_n} ( x_n^m, y_n^m, z_n^m )   (4)

wherein (x_n, y_n, z_n) represents the three-dimensional centroid coordinates of the n-th point cloud cluster in the current frame of point cloud data, and (x_n^m, y_n^m, z_n^m) represents the three-dimensional coordinates of the m-th laser point in the n-th cluster.
Step 3, unifying coordinate systems of two adjacent frames of point clouds by using LiDAR/IMU external parameters and IMU information, establishing a pairing relation between the point clouds of the adjacent frames based on the point cloud cluster segmentation result obtained in the step 2, and calculating corresponding pairing point cloud cluster characteristic data;
step 3-1, taking the IMU coordinate system of the first frame as the global coordinate system, denoted the W system. Based on the rotation parameter R_L^I and translation parameter t_L^I from the LiDAR/IMU extrinsic calibration result, and combining the IMU pre-integration result (R_(I_i)^W, t_(I_i)^W), each frame of LiDAR point cloud is converted into the global coordinate system. Taking the i-th frame point cloud P_i^L as an example, the converted point cloud is denoted P_i^W:

P_i^W = R_(I_i)^W · ( R_L^I · P_i^L + t_L^I ) + t_(I_i)^W   (5)
step 3-2, under the W system, a search area is established in the adjacent frame P_(i−1)^W, centered on the centroid of the n-th point cloud cluster C_i^n of P_i^W with 3 m as radius. Taking a point cloud cluster C_(i−1)^m of P_(i−1)^W as an example of a search result, the feature data of the paired point cloud clusters is calculated:

a_1 = ‖ c_i^n − c_(i−1)^m ‖,
a_2 = | DX_i^n − DX_(i−1)^m |,   a_3 = | DY_i^n − DY_(i−1)^m |,   a_4 = | DZ_i^n − DZ_(i−1)^m |,
a_5 = | PN_i^n − PN_(i−1)^m |,   a_6 = | PD_i^n − PD_(i−1)^m |   (6)

wherein c_i^n and c_(i−1)^m are the centroid three-dimensional coordinates of the point cloud clusters C_i^n and C_(i−1)^m respectively, PN_i^n and PN_(i−1)^m are the numbers of laser points in the corresponding clusters, PD_i^n and PD_(i−1)^m are the point cloud densities of the corresponding clusters, and (DX_i^n, DY_i^n, DZ_i^n) and (DX_(i−1)^m, DY_(i−1)^m, DZ_(i−1)^m) are the triaxial spans of the corresponding clusters;

step 3-3, all search results in P_(i−1)^W that can establish a pairing relation with C_i^n form pairing combinations, which serve as the input of the multi-level fuzzy comprehensive evaluation model of the point cloud motion state.
Step 4, carrying out multi-level fuzzy comprehensive evaluation on the point cloud cluster motion state from three aspects of centroid offset, triaxial span change, point number and point cloud density change by constructing a multi-level fuzzy comprehensive evaluation model of the point cloud motion state based on the pairing relation between the point cloud clusters of the adjacent frames obtained in the step 3;
step 4-1, taking the centroid offset, triaxial span change, and point number and point cloud density change between paired point cloud clusters as the criteria for judging their motion state, jointly forming the evaluation factor set of the model, and further constructing the factor set U, the evaluation set V and the fuzzy evaluation matrix R:
U = { u_1, u_2, u_3 },  u_2 = { u_21, u_22, u_23 },  u_3 = { u_31, u_32 },
V = { v_1, v_2, v_3 },
R = ( μ(u, v) )   (7)

wherein u_1, u_2, u_3 in U are the first-level evaluation factors, corresponding in turn to the three factors of the evaluation factor set; u_21, u_22, u_23 correspond to the X, Y, Z triaxial span changes between the paired point cloud clusters, and u_31, u_32 correspond to the point-number change and the point cloud density change between the paired clusters; these double-subscript variables are the second-level evaluation factors. v_1, v_2, v_3 in V correspond to the three motion-state evaluation results: static, pending and dynamic. R represents the fuzzy relation between the factor set U and the evaluation set V, where μ(u, v) is the corresponding membership function;
step 4-2, constructing the evaluation matrices R_2 and R_3 corresponding to the second-level evaluation factors:

R_2 = ( μ(u_2j, v_k) ), j = 1, 2, 3;   R_3 = ( μ(u_3j, v_k) ), j = 1, 2;   k = 1, 2, 3   (8)

The corresponding feature data of the point cloud cluster whose motion state is to be determined is acquired: the number of points is denoted PN, the point cloud density PD, and the triaxial spans DX, DY and DZ in turn. The corresponding feature data of the paired point cloud clusters is then calculated: the centroid offset is denoted a_1, the point-number change a_5, the point cloud density change a_6, and the triaxial span changes a_2, a_3, a_4 in turn. According to the characteristics of the feature data, trapezoidal-distribution membership functions are adopted; taking the first row of R_2 as an example, the corresponding membership function model is constructed:

μ(u_21, v_k), k = 1, 2, 3: trapezoidal-distribution membership functions comparing the span change a_2 against the span DX   (9)

The other row elements of R_2 and R_3 are constructed with the same membership-function form, substituting the corresponding point cloud cluster feature data for a_2 and DX in the formula;
step 4-3, calculating the evaluation weight matrices corresponding to the second-level evaluation factors with the analytic hierarchy process. Taking the triaxial span change factor set as an example, a corresponding judgment matrix C_2 is constructed:

C_2 = ( c_ij ), i, j = 1, …, e: the pairwise importance-comparison judgment matrix of the triaxial span change factors   (10)

Based on the elements of C_2, the corresponding weights are calculated with the root method and the results are normalized:

d'_2i = ( ∏_{j=1}^{e} c_ij )^(1/e)   (11)

d_2i = d'_2i / Σ_{i=1}^{e} d'_2i   (12)

wherein e represents the number of evaluation factors in the current factor set. For the triaxial span change factor set, e = 3, and the corresponding weight matrix D_2 is constructed:

D_2 = [ d_21  d_22  d_23 ] = [ 0.4  0.4  0.2 ]   (13)

The weight matrix D_3 corresponding to the point number and point cloud density change factor set is calculated in the same way:

D_3 = [ d_31  d_32 ] = [ 0.67  0.33 ]   (14)

The weight matrix calculation results are checked with the AHP consistency test, and the qualified results are used in the higher-level fuzzy comprehensive evaluation.
step 4-4, applying D_2 and D_3 with the product-summation operator to R_2 and R_3 respectively to obtain the last two rows of elements of R:

R = [ μ(u_1, v_1)  μ(u_1, v_2)  μ(u_1, v_3)
      D_2 · R_2
      D_3 · R_3 ]   (15)

The membership calculation functions corresponding to the first-row elements of R are constructed:

μ(u_1, v_k), k = 1, 2, 3: trapezoidal membership functions of the centroid offset a_1, with grade thresholds determined by v·Δt and VAL   (16)

wherein v is the minimum motion rate of an absolutely dynamic target that the model can detect and can be adjusted according to the detection requirements; Δt is the time interval between the acquisition of two adjacent frames of LiDAR point cloud data and is likewise adjustable; VAL is an adaptive model parameter:

VAL = λ_1 · DIS + λ_2 · MS   (17)

wherein DIS is the three-dimensional Euclidean distance from the centroid of the current point cloud cluster to the origin, MS is the maximum of the current cluster's spans in the X, Y, Z directions, and λ_1 and λ_2 take values determined by the extent of the environment and the complexity of its features: the larger the environment and the more complex the features, the larger the values;
step 4-5, calculating the weight matrix D corresponding to the first-level evaluation factors with the analytic hierarchy process and applying it to R to calculate the fuzzy comprehensive evaluation set B:

D = [ d_1  d_2  d_3 ] = [ 0.5  0.25  0.25 ]   (18)

B = D · R = [ b_1  b_2  b_3 ]   (19)

The elements of B are compared according to the maximum-membership rule: if b_1 is the largest, the evaluation result output by the model is static; if b_3 is the largest, the output result is dynamic; if b_2 is the largest, the output result is pending, and b_2 is output at the same time as the corresponding evaluation score.
Step 5, based on the evaluation result of the step 4, judging the motion state of each point cloud cluster, and finally removing the dynamic point cloud cluster from the original point cloud data, thereby improving the data quality of the original LiDAR point cloud and further improving the positioning accuracy and the mapping effect of LiDAR SLAM;
step 5-1, inputting the pairing combinations corresponding to the point cloud clusters in turn into the multi-level fuzzy comprehensive evaluation model of the point cloud motion state. Taking the n-th point cloud cluster C_i^n of the i-th frame point cloud P_i^W as an example: if a static evaluation result appears, C_i^n is marked as a static point cloud cluster and the remaining calculations related to C_i^n are stopped; if all evaluation results are dynamic, C_i^n is marked as a dynamic point cloud cluster; if there is no search result, C_i^n is marked as a missing point cloud cluster and regarded as a special dynamic point cloud cluster. Apart from these cases, if the evaluation results include pending, the maximum evaluation score is taken as the final motion-state evaluation score of C_i^n: if this score is greater than 0.5, C_i^n is marked as a static point cloud cluster, otherwise it is marked as a dynamic point cloud cluster;

step 5-2, enumerating the point cloud clusters of P_i^W and repeating the above steps until the motion states of all clusters have been judged. The point cloud clusters marked as static are converted back into the corresponding laser radar coordinate system and published as a new frame of point cloud data, so that the corresponding LiDAR point cloud data no longer contains points generated by dynamic target scanning. For the first frame of point cloud, the clusters that are paired with static clusters of the second frame and whose evaluation result is static are published.
In the embodiment of the invention, LiDAR and IMU data were collected in a real environment with a combined-positioning mobile platform; the LiDAR was a Velodyne VLP-16 and the IMU an SBG Ellipse-N. A Leica TS50 total station automatically tracked a total-reflection prism on the mobile platform to acquire the reference track. The experimental field is about 16 m long and about 18 m wide, with an area of about 260 m²; the load-bearing columns, tables and chairs in the environment are regarded as static targets. Four persons carrying cardboard panels 0.8 m long and 1.2 m wide moved intermittently and irregularly through the environment to act as randomly appearing dynamic targets; during the intermittent stages they are regarded as static targets. The time periods during which the dynamic targets disappeared from and reappeared in the LiDAR field of view were recorded. A rectangular track about 9 m long and about 10 m wide was laid out at the center of the field, and the mobile platform moved one full circuit along it. The dynamic point cloud elimination effect was verified on the collected sensor data, and the processed point cloud data was input into LOAM (Lidar Odometry and Mapping) for positioning and mapping and compared with the LOAM results based on the original point cloud.
As shown in FIG. 7, the recorded periods without a dynamic target in the environment roughly coincide with the valleys of the point cloud rejection rate; outside the marked periods, the rejection rate is mainly distributed in the interval 0.05-0.2. The rejection rate equals the ratio of the number of laser points removed by the method to the number of laser points remaining after ground point filtering. The method therefore effectively removes the point cloud generated by scanning randomly appearing dynamic targets, and effectively retains the static point cloud when no dynamic target is present. In addition, because the LiDAR scanning blind areas differ as the platform moves, the rejection rate fluctuates reasonably over time, which verifies the feasibility and accuracy of the method.
As shown in FIG. 8, the LOAM positioning track based on the original point cloud data and the LOAM positioning track corresponding to the point cloud data processed by the method are drawn as broken lines, and the reference track as a solid line. The results show that after the platform starts moving, the positioning track based on the original point cloud drifts from the reference track, and the drift grows with time, indicating that the dynamic point cloud directly affects the positioning accuracy of the LOAM algorithm: the randomly appearing dynamic targets occlude part of the geometric features of the environment, and the dynamic points also take part in the inter-frame registration, reducing the positioning accuracy. Using the point cloud data processed by the method as the LOAM input yields a more accurate positioning result, showing that the method effectively weakens the influence of the dynamic point cloud on the inter-frame registration accuracy.
As shown in FIG. 9, comparing the positioning errors of the two cases on the X axis, the Y axis and the X-Y plane quantitatively: after processing with the method, the root mean square error in the X direction decreases from 0.039 m to 0.015 m and the maximum error from 0.176 m to 0.056 m; the root mean square error in the Y direction decreases from 0.102 m to 0.033 m and the maximum error from 0.354 m to 0.097 m. The X-Y plane root mean square error decreases from 0.109 m to 0.037 m, a 66.06% improvement, and the maximum error from 0.360 m to 0.098 m, a 72.78% improvement. The results show that the invention effectively improves the positioning accuracy of LiDAR SLAM and achieves centimeter-level positioning.
The same segment of data starting from the initial moment was used to build the point cloud maps; top views of the two mapping results are shown in FIG. 10. Besides the points from walls, roof beams and other static targets in the environment, the map built from the original point cloud contains a large number of points generated by dynamic target scanning; the area circled in black in sub-figure (a) is a clearly visible dynamic point cloud accumulation area, and the outer wall contour also shows ghosting. This illustrates that the dynamic point cloud affects both the positioning accuracy and the mapping effect of LiDAR SLAM. Compared with sub-figure (a), the point cloud map in sub-figure (b) has a tidy overall structure and a greatly reduced number of dynamic points participating in the mapping, and is easier to apply to robot autonomous navigation based on the point cloud map. The clusters similar to the area circled in black in sub-figure (b) are points generated by scanning the dynamic targets during their intermittent (stationary) stages, which further illustrates that the method effectively removes the dynamic point cloud while retaining the static point cloud.
While the invention has been described in terms of what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not limited to the disclosed embodiment. The protection scope of the present invention is therefore defined by the claims.

Claims (5)

1. An IMU-assisted LiDAR dynamic point cloud eliminating method is characterized by comprising the following steps of:
step 1, acquiring LiDAR and IMU sensor data, calibrating six-degree-of-freedom rotation and translation parameters between two sensor coordinate systems, and simultaneously realizing time synchronization between the two sensor data by using a time synchronization module;
step 2, after each frame of LiDAR point cloud data is acquired, removing LiDAR point cloud motion distortion with the assistance of IMU pre-integration information by using the LiDAR/IMU external parameters obtained in step 1, and then performing preliminary downsampling and noise point filtering on the whole point cloud data; based on the preliminarily processed LiDAR point cloud data, segmenting the point clouds generated by scanning different targets in the environment into corresponding independent point cloud clusters by a rapid ground point cloud extraction and point cloud cluster segmentation method;
step 3, unifying the coordinate systems of two adjacent frames of point clouds by using the LiDAR/IMU external parameters and IMU information, establishing pairing relations between the point cloud clusters of the adjacent frames based on the point cloud cluster segmentation result obtained in step 2, and calculating the corresponding paired point cloud cluster characteristic data;
step 4, constructing a multi-level fuzzy comprehensive evaluation model of the point cloud motion state and, based on the pairing relations between the point cloud clusters of the adjacent frames obtained in step 3, carrying out multi-level fuzzy comprehensive evaluation on the motion state of each point cloud cluster from three aspects: centroid offset, triaxial span change, and change in the number of points and point cloud density;
step 5, judging the motion state of each point cloud cluster based on the evaluation results of step 4, and finally removing the dynamic point cloud clusters from the original point cloud data, thereby improving the data quality of the original LiDAR point cloud and further improving the positioning accuracy and mapping effect of LiDAR SLAM.
2. The IMU-assisted LiDAR dynamic point cloud eliminating method of claim 1, wherein in step 2, the point clouds generated by scanning different targets in the environment are segmented into corresponding independent point cloud clusters by a rapid ground point cloud extraction and point cloud cluster segmentation method;
the method comprises the following specific steps:
step 2-1, establishing index numbers for each frame of point cloud according to the arrangement order and geometric relation of the corresponding scanning beams, so that point clouds on the same scan line share a uniform row number and point clouds in the same column share a uniform column number;
step 2-2, dividing each frame of point cloud into a plurality of radial small areas according to the column numbers, based on the point cloud indices acquired in step 2-1, and sequentially calculating, in each small area in ascending row-number order, the included angle rad_(k,k+1) between the line connecting two adjacent laser points k and k+1 and the X-Y plane, taking it as a measure of the undulation of the point cloud:

rad_(k,k+1) = arctan( |z_(k+1) − z_k| / d_(k,k+1) ),  d_(k,k+1) = sqrt( (x_(k+1) − x_k)^2 + (y_(k+1) − y_k)^2 )  (1)

wherein (x_k, y_k, z_k) represents the three-dimensional coordinates of a laser point in the Cartesian coordinate system, and d_(k,k+1) is the projection length of the line connecting the two points on the X-Y plane;
step 2-3, comparing each rad_(k,k+1) calculated in step 2-2 with 0.05: if rad_(k,k+1) does not exceed 0.05, marking both endpoints of the connecting line as ground points; when rad_(k,k+1) exceeds 0.05 for the first time, stopping the subsequent calculation for the current small area, keeping the starting point of that connecting line marked as a ground point and marking its end point as the first non-ground point; the remaining points of the area that did not participate in the calculation are also marked as non-ground points. After the calculation is completed for every small area, filtering out all laser points of the current frame marked as ground points;
step 2-4, projecting the LiDAR point cloud data processed in the preceding steps onto the X-Y two-dimensional plane, and performing primary clustering on the whole point cloud by a region growing algorithm based on the two-dimensional Euclidean distance, wherein the seed point set of the region growing process consists of all laser points marked as first non-ground points;
step 2-5, the primary clustering of step 2-4 is fast, but two failure cases exist: different targets with obvious differences in the Z direction may be divided into one point cloud cluster, or different targets may be connected into one point cloud cluster by discrete points. Since the probability of these cases is low, performing secondary clustering on every primary clustering result would be inefficient, so a criterion is defined to determine whether a point cloud cluster needs secondary clustering:
DX_n = x_n^max − x_n^min,  DY_n = y_n^max − y_n^min,  DZ_n = z_n^max − z_n^min  (2)

PD_n = PN_n / PV_n = PN_n / (DX_n · DY_n · DZ_n)  (3)

wherein DX_n, DY_n and DZ_n are the triaxial spans of the n-th point cloud cluster in the X, Y and Z directions within a frame of point cloud, x_n^max/x_n^min, y_n^max/y_n^min and z_n^max/z_n^min are the maximum and minimum values of the X, Y and Z coordinates of the points in the current point cloud cluster, PN_n is the number of points in the current point cloud cluster, PV_n is the volume of the minimum bounding hexahedron of the current point cloud cluster, and PD_n is the point cloud density of the current point cloud cluster. If the triaxial span is large but the number of points is small, the point distribution of the cluster is not dense enough and an anomaly may exist; therefore, point cloud clusters with a point cloud density smaller than 100 are judged to be abnormal point cloud clusters requiring secondary subdivision;
step 2-6, enumerating the abnormal point cloud clusters screened out in step 2-5, calculating the three-dimensional Euclidean distance from each laser point in the cluster to the other laser points in the same cluster, and regarding points at a distance of less than 0.05 m as neighboring points of that laser point. Laser points with fewer than 3 neighboring points are taken as discrete points connecting different point cloud clusters and are removed; after all discrete points are removed, performing secondary clustering segmentation on the abnormal point cloud clusters by a region growing algorithm based on the three-dimensional Euclidean distance;
step 2-7, after all abnormal point cloud clusters have been subdivided, counting the current number of laser points of each point cloud cluster, calculating the point cloud density, and calculating the centroid coordinates of each point cloud cluster:

(x_n, y_n, z_n) = (1/PN_n) · Σ_{m=1..PN_n} (x_n^m, y_n^m, z_n^m)  (4)

wherein (x_n, y_n, z_n) represents the three-dimensional centroid coordinates of the n-th point cloud cluster in the current frame of point cloud data, and (x_n^m, y_n^m, z_n^m) represents the three-dimensional coordinates of the m-th laser point in the n-th point cloud cluster.
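For illustration only (not part of the claims), the ground-angle test of steps 2-2 and 2-3, the density criterion of step 2-5 and the feature calculation of step 2-7 can be sketched in a few lines of NumPy; the array layout, function names and sample data below are assumptions:

```python
import numpy as np

RAD_THRESH = 0.05   # ground angle threshold in radians (step 2-3)
PD_THRESH = 100.0   # point cloud density threshold (step 2-5)

def ground_mask(column):
    """Mark ground points in one radial small area (steps 2-2 and 2-3).

    `column` is an (N, 3) array of the points of one column, sorted by
    ascending row number; returns a boolean mask of the ground points."""
    mask = np.zeros(len(column), dtype=bool)
    for k in range(len(column) - 1):
        dx, dy, dz = column[k + 1] - column[k]
        d = np.hypot(dx, dy)                 # projection length on the X-Y plane
        rad = np.arctan2(abs(dz), d)         # angle to the X-Y plane, eq. (1)
        if rad > RAD_THRESH:
            mask[k] = True                   # line start is the last ground point
            break                            # line end = first non-ground point
        mask[k] = mask[k + 1] = True
    return mask

def cluster_features(points):
    """Triaxial spans, density and centroid of one cluster (steps 2-5 and 2-7)."""
    span = points.max(axis=0) - points.min(axis=0)   # DX, DY, DZ, eq. (2)
    pn = len(points)
    pd = pn / np.prod(span)                          # PD = PN / (DX*DY*DZ), eq. (3)
    centroid = points.mean(axis=0)                   # eq. (4)
    return span, pn, pd, centroid

# Synthetic column: two near-flat points, then a sharp rise in Z.
col = np.array([[1.0, 0.00, 0.00],
                [1.5, 0.10, 0.01],
                [2.0, 0.05, 0.50],
                [2.5, 0.20, 0.60]])
print(ground_mask(col))        # [ True  True False False]
span, pn, pd, c = cluster_features(col)
print(pd < PD_THRESH)          # True: sparse cluster, secondary clustering needed
```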
3. The IMU-assisted LiDAR dynamic point cloud eliminating method of claim 1, wherein in step 3, the coordinate systems of two adjacent frames of point clouds are unified by using the LiDAR/IMU external parameters and IMU information, pairing relations between the point cloud clusters of the adjacent frames are established based on the point cloud cluster segmentation result obtained in step 2, and the corresponding paired point cloud cluster characteristic data are calculated;
the method comprises the following specific steps:
step 3-1, taking the first-frame IMU coordinate system as the global coordinate system, denoted the W system; based on the rotation parameter R_L^B and the translation parameter t_L^B in the LiDAR/IMU external parameter calibration result, and combining the IMU pre-integration result (R_Bi^W, t_Bi^W), each frame of LiDAR point cloud is converted into the global coordinate system; taking the i-th frame of point cloud P_i^L as an example, the converted point cloud is denoted P_i^W:

P_i^W = R_Bi^W · ( R_L^B · P_i^L + t_L^B ) + t_Bi^W  (5)
step 3-2, under the W system, taking the centroid of the n-th point cloud cluster C_i^n in P_i^W as the center and 3 m as the radius, a search area is established in P_(i+1)^W. Taking a point cloud cluster C_(i+1)^s in P_(i+1)^W as an example of a search result, the characteristic data of the paired point cloud clusters are calculated:

Δc = || c_(i+1)^s − c_i^n ||,  ΔPN = | PN_(i+1)^s − PN_i^n |,  ΔPD = | PD_(i+1)^s − PD_i^n |,  ΔDX = | DX_(i+1)^s − DX_i^n |,  ΔDY = | DY_(i+1)^s − DY_i^n |,  ΔDZ = | DZ_(i+1)^s − DZ_i^n |  (6)

wherein c_i^n and c_(i+1)^s are the centroid three-dimensional coordinates of the point cloud clusters C_i^n and C_(i+1)^s respectively, PN_i^n and PN_(i+1)^s are the numbers of laser points in the corresponding point cloud clusters, PD_i^n and PD_(i+1)^s are the point cloud densities of the corresponding point cloud clusters, and (DX_i^n, DY_i^n, DZ_i^n) and (DX_(i+1)^s, DY_(i+1)^s, DZ_(i+1)^s) are the triaxial spans of the corresponding point cloud clusters;
step 3-3, forming pairing combinations of C_i^n in P_i^W with all search results in P_(i+1)^W that can establish a pairing relation with it, these pairing combinations serving as the input of the multi-level fuzzy comprehensive evaluation model of the point cloud motion state.
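For illustration only, a minimal sketch of the coordinate unification and pairing search of claim 3 follows; the Cluster container and function names are assumptions, while the 3 m search radius and the feature differences follow equations (5) and (6):

```python
import numpy as np
from dataclasses import dataclass

SEARCH_RADIUS = 3.0  # metres, search radius of step 3-2

@dataclass
class Cluster:
    """Per-cluster feature data from step 2-7 (field layout is an assumption)."""
    centroid: np.ndarray  # (3,) centroid coordinates in the W system
    pn: int               # number of laser points PN
    pd: float             # point cloud density PD
    span: np.ndarray      # (3,) triaxial spans DX, DY, DZ

def to_world(points_L, R_LB, t_LB, R_BW, t_BW):
    """Eq. (5): apply the LiDAR/IMU extrinsics, then the IMU pre-integration pose."""
    return (points_L @ R_LB.T + t_LB) @ R_BW.T + t_BW

def pair_features(c_i, c_next):
    """Eq. (6): centroid offset and feature changes of one pairing combination."""
    return {
        "a1": float(np.linalg.norm(c_next.centroid - c_i.centroid)),  # centroid offset
        "a2-a4": np.abs(c_next.span - c_i.span),                      # triaxial span changes
        "a5": abs(c_next.pn - c_i.pn),                                # point count change
        "a6": abs(c_next.pd - c_i.pd),                                # density change
    }

def search_pairs(clusters_i, clusters_next):
    """Step 3-3: all pairing combinations inside the 3 m search area."""
    pairs = {}
    for n, ci in enumerate(clusters_i):
        pairs[n] = [pair_features(ci, cj) for cj in clusters_next
                    if np.linalg.norm(cj.centroid - ci.centroid) < SEARCH_RADIUS]
        # an empty list marks a vanished cluster (handled in step 5-1)
    return pairs

# Usage: pairs = search_pairs(clusters_i, clusters_next)
```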
4. The IMU-assisted LiDAR dynamic point cloud eliminating method of claim 1, wherein in step 4, a multi-level fuzzy comprehensive evaluation model of the point cloud motion state is constructed, and the motion state of each point cloud cluster is subjected to multi-level fuzzy comprehensive evaluation from three aspects: centroid offset, triaxial span change, and change in the number of points and point cloud density;
the method comprises the following specific steps:
step 4-1, taking the centroid offset, the triaxial span change, and the change in the number of points and point cloud density between paired point cloud clusters as the criteria for judging the motion state of the paired point cloud clusters, which jointly form the evaluation factor set of the model; the factor set U, the evaluation set V and the fuzzy evaluation matrix R are constructed:

U = {u_1, u_2, u_3}, with u_2 = {u_21, u_22, u_23} and u_3 = {u_31, u_32};  V = {v_1, v_2, v_3};  R = ( μ(u, v) )  (7)

wherein u_1, u_2 and u_3 in U are first-level evaluation factors corresponding in turn to the three factors of the evaluation factor set; u_21, u_22 and u_23 correspond to the X, Y and Z triaxial span changes between paired point cloud clusters, u_31 and u_32 correspond to the change in the number of points and in the point cloud density between paired point cloud clusters, and all double-subscript variables are second-level evaluation factors; v_1, v_2 and v_3 in V correspond to the three evaluation results of the point cloud cluster motion state: static, undetermined and dynamic; R represents the fuzzy relation between the factor set U and the evaluation set V, where μ(u, v) is the corresponding membership function;
step 4-2, constructing the evaluation matrices R_2 and R_3 corresponding to the second-level evaluation factors:

R_2 = ( μ(u_2i, v_j) ), i = 1, 2, 3;  R_3 = ( μ(u_3i, v_j) ), i = 1, 2; with j = 1, 2, 3  (8)

The corresponding characteristic data of the point cloud cluster whose motion state is to be determined are acquired, the number of points being denoted PN, the point cloud density PD, and the triaxial spans DX, DY and DZ in turn. The corresponding characteristic data of the paired point cloud clusters are calculated as in equation (6): the centroid offset is denoted a_1, the change in the number of points a_5, the change in the point cloud density a_6, and the triaxial span changes a_2, a_3 and a_4 in turn. In accordance with the characteristics of these data, trapezoidal-distribution membership functions are adopted; taking the first-row elements of R_2 as an example, the corresponding membership function model is constructed [equation (9), an image in the original publication, is not reproduced here: a trapezoidal membership function expressed in terms of a_2 and DX]. The other row elements of R_2 and R_3 are constructed in the same membership function form, substituting the corresponding point cloud cluster characteristic data for a_2 and DX;
step 4-3, calculating the evaluation weight matrices corresponding to the second-level evaluation factors by the analytic hierarchy process; taking the triaxial span change factor set as an example, the corresponding judgment matrix C_2 is constructed:

C_2 = [ 1 1 2 ; 1 1 2 ; 1/2 1/2 1 ]  (10)

The weights corresponding to the elements of C_2 are calculated by the root (geometric mean) method, and the results are normalized:

w_i = ( ∏_{j=1..e} c_ij )^(1/e)  (11)

d_i = w_i / ( Σ_{k=1..e} w_k )  (12)

wherein e represents the number of evaluation factors in the current factor set; the triaxial span change factor set corresponds to e = 3, and the corresponding weight matrix D_2 is constructed:

D_2 = [d_21 d_22 d_23] = [0.4 0.4 0.2]  (13)

The weight matrix D_3 corresponding to the factor set of the change in the number of points and the point cloud density is calculated by the same method:

D_3 = [d_31 d_32] = [0.67 0.33]  (14)

The weight matrix calculation results are checked by the consistency check method of the analytic hierarchy process, and the qualified results are used for the higher-level fuzzy comprehensive evaluation;
step 4-4, applying D_2 and D_3 to R_2 and R_3 respectively by the product-sum method to obtain the last two rows of elements of R:

R = [ r_1 ; D_2·R_2 ; D_3·R_3 ]  (15)

wherein r_1 = ( μ(u_1, v_1), μ(u_1, v_2), μ(u_1, v_3) ) is the first row of R. The membership calculation functions corresponding to the first-row elements of R are constructed [equation (16), an image in the original publication, is not reproduced here: membership functions of the centroid offset a_1, parameterized by v·Δt and the adaptive parameter VAL], wherein v is the minimum motion rate of an absolutely dynamic target that the model can detect, adjustable according to the detection requirement, and Δt is the time interval at which two adjacent frames of LiDAR point cloud data are collected; both are adjustable quantities. VAL is an adaptive model parameter:

VAL = λ_1·DIS + λ_2·MS  (17)

wherein DIS is the three-dimensional Euclidean distance from the centroid of the current point cloud cluster to the origin, MS is the maximum span of the current point cloud cluster in the X, Y and Z directions, and the values of λ_1 and λ_2 are determined by the extent of the environment and the complexity of the ground features in it: the larger the environment and the more complex the ground features, the larger their values;
step 4-5, calculating the weight matrix D corresponding to the first-level evaluation factors by the analytic hierarchy process and applying it to R to calculate the fuzzy comprehensive evaluation set B:

D = [d_1 d_2 d_3] = [0.5 0.25 0.25]  (18)

B = D·R = [b_1 b_2 b_3]  (19)

The elements of B are compared according to the maximum membership rule: if b_1 is the largest, the evaluation result output by the model is static; if b_3 is the largest, the output evaluation result is dynamic; if b_2 is the largest, the output evaluation result is undetermined, and b_2 is simultaneously output as the corresponding evaluation score.
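For illustration only, the following sketch reproduces the root-method weight calculation of equations (11)-(12) and the product-sum evaluation of equations (15), (18) and (19); the judgment matrices are chosen to match the published weights, but the membership rows R2, R3 and r1 are invented placeholders, since the actual trapezoidal membership functions of equations (9) and (16) are not reproduced in this text:

```python
import numpy as np

def ahp_weights(C):
    """Root (geometric mean) method of the analytic hierarchy process, eqs. (11)-(12)."""
    w = np.prod(C, axis=1) ** (1.0 / C.shape[0])   # row geometric means
    return w / w.sum()                             # normalised weight vector

# Judgment matrices assumed consistent with the published weights.
C2 = np.array([[1.0, 1.0, 2.0], [1.0, 1.0, 2.0], [0.5, 0.5, 1.0]])
C3 = np.array([[1.0, 2.0], [0.5, 1.0]])
D2 = ahp_weights(C2)   # [0.4, 0.4, 0.2], eq. (13)
D3 = ahp_weights(C3)   # [0.667, 0.333], rounded to [0.67, 0.33] in eq. (14)

# Hypothetical membership rows over V = (static, undetermined, dynamic);
# the real rows come from the trapezoidal membership functions of step 4-2.
R2 = np.array([[0.8, 0.2, 0.0],    # X span change
               [0.7, 0.3, 0.0],    # Y span change
               [0.9, 0.1, 0.0]])   # Z span change
R3 = np.array([[0.6, 0.3, 0.1],    # point count change
               [0.5, 0.4, 0.1]])   # density change
r1 = np.array([0.7, 0.2, 0.1])     # centroid offset row, from eq. (16)

# Step 4-4: product-sum composition builds the first-level matrix R, eq. (15).
R = np.vstack([r1, D2 @ R2, D3 @ R3])

# Step 4-5: first-level weights and the fuzzy comprehensive evaluation set B.
D = np.array([0.5, 0.25, 0.25])    # eq. (18)
B = D @ R                          # eq. (19)
verdict = ["static", "undetermined", "dynamic"][int(np.argmax(B))]
print(B, verdict)                  # maximum membership rule
```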
5. The IMU-assisted LiDAR dynamic point cloud eliminating method of claim 1, wherein in step 5, the motion state of each point cloud cluster is judged based on the evaluation results of the point cloud cluster motion state, and the dynamic point cloud clusters are finally removed from the original point cloud data, thereby improving the data quality of the LiDAR point cloud;
the method comprises the following specific steps:
step 5-1, sequentially inputting the pairing combination results corresponding to each point cloud cluster into the multi-level fuzzy comprehensive evaluation model of the point cloud motion state; taking the n-th point cloud cluster C_i^n of the i-th frame of point cloud P_i^W as an example: if a static evaluation result appears, C_i^n is marked as a static point cloud cluster and the subsequent calculations related to C_i^n are stopped; if all evaluation results are dynamic, C_i^n is marked as a dynamic point cloud cluster; if there is no search result, C_i^n is marked as a vanished point cloud cluster and regarded as a special dynamic point cloud cluster. Apart from the above cases, if an undetermined evaluation result exists, the maximum evaluation score is taken as the final motion state evaluation score of C_i^n; if this score is greater than 0.5, C_i^n is marked as a static point cloud cluster, otherwise it is marked as a dynamic point cloud cluster;

step 5-2, enumerating the point cloud clusters of P_i^W and repeating the above step until the motion states of all point cloud clusters have been judged; the point cloud clusters marked as static are converted back into the corresponding laser radar coordinate system and published as a new frame of point cloud data, so that the corresponding LiDAR point cloud data no longer contain point clouds generated by dynamic target scanning. For the first frame of point cloud, the point cloud clusters that are paired with static point cloud clusters of the second frame and whose evaluation results are static are published.
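For illustration only, the decision rules of step 5-1 condense to a short function; the verdict encoding and input format are assumptions, while the early exit on a static result, the treatment of vanished clusters and the 0.5 score threshold follow the claim:

```python
def judge_cluster(evaluations):
    """Decide one cluster's motion state from its pairing evaluations (step 5-1).

    `evaluations` is a list of (verdict, score) tuples, one per pairing
    combination, where verdict is "static", "dynamic" or "undetermined"
    and score is the b2 evaluation score for undetermined verdicts."""
    if not evaluations:
        return "dynamic"   # vanished cluster: treated as a special dynamic cluster
    verdicts = [v for v, _ in evaluations]
    if "static" in verdicts:
        return "static"    # any static evaluation ends the check early
    if all(v == "dynamic" for v in verdicts):
        return "dynamic"
    # Otherwise at least one result is undetermined: use the maximum score.
    best = max(s for v, s in evaluations if v == "undetermined")
    return "static" if best > 0.5 else "dynamic"

# Usage: one static hit wins; otherwise the best undetermined score decides.
print(judge_cluster([("dynamic", None), ("undetermined", 0.62)]))  # static
print(judge_cluster([("undetermined", 0.31)]))                     # dynamic
print(judge_cluster([]))                                           # dynamic (vanished)
```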
CN202310297076.4A 2023-03-24 2023-03-24 IMU-assisted LiDAR dynamic point cloud eliminating method Pending CN116385292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310297076.4A CN116385292A (en) 2023-03-24 2023-03-24 IMU-assisted LiDAR dynamic point cloud eliminating method

Publications (1)

Publication Number Publication Date
CN116385292A true CN116385292A (en) 2023-07-04

Family

ID=86972385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310297076.4A Pending CN116385292A (en) 2023-03-24 2023-03-24 IMU-assisted LiDAR dynamic point cloud eliminating method

Country Status (1)

Country Link
CN (1) CN116385292A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437162A (en) * 2023-12-22 2024-01-23 吉林大学 Dynamic point cloud data enhancement method and device based on instance-level sequence mixing
CN117437162B (en) * 2023-12-22 2024-03-08 吉林大学 Dynamic point cloud data enhancement method and device based on instance-level sequence mixing

Similar Documents

Publication Publication Date Title
CN109887028B (en) Unmanned vehicle auxiliary positioning method based on point cloud data registration
US20240036207A1 (en) Multiple Resolution, Simultaneous Localization And Mapping Based On 3-D Lidar Measurements
CN110866887A (en) Target situation fusion sensing method and system based on multiple sensors
CN112101278A (en) Hotel point cloud classification method based on k nearest neighbor feature extraction and deep learning
CN112883820B (en) Road target 3D detection method and system based on laser radar point cloud
CN101794437B (en) Method for detecting abnormal target in hyperspectral remotely sensed image
CN115482195B (en) Train part deformation detection method based on three-dimensional point cloud
CN114488194A (en) Method for detecting and identifying targets under structured road of intelligent driving vehicle
Zhang et al. Three-dimensional cooperative mapping for connected and automated vehicles
CN112528781B (en) Obstacle detection method, device, equipment and computer readable storage medium
CN102494663A (en) Measuring system of swing angle of swing nozzle and measuring method of swing angle
CN116385292A (en) IMU-assisted LiDAR dynamic point cloud eliminating method
CN111797836A (en) Extraterrestrial celestial body patrolling device obstacle segmentation method based on deep learning
CN114879217B (en) Target pose judgment method and system
CN102981160B (en) Method and device for ascertaining aerial target track
CN114137562B (en) Multi-target tracking method based on improved global nearest neighbor
CN114066773B (en) Dynamic object removal based on point cloud characteristics and Monte Carlo expansion method
CN113295142B (en) Terrain scanning analysis method and device based on FARO scanner and point cloud
CN114170188A (en) Target counting method and system for overlook image and storage medium
CN115453570A (en) Multi-feature fusion mining area dust filtering method
CN114488026A (en) Underground parking garage passable space detection method based on 4D millimeter wave radar
Xu et al. A robust close-range photogrammetric system for industrial metrology
CN112069445A (en) 2D SLAM algorithm evaluation and quantification method
CN113673105A (en) Design method of true value comparison strategy
Huang et al. Ground filtering algorithm for mobile LIDAR using order and neighborhood point information

Legal Events

Date Code Title Description
PB01 Publication