CN117419719A - IMU-fused three-dimensional laser radar positioning and mapping method - Google Patents

IMU-fused three-dimensional laser radar positioning and mapping method

Info

Publication number: CN117419719A
Application number: CN202311229196.7A
Authority: CN (China)
Prior art keywords: point cloud, IMU, laser radar, segmentation, laser
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 罗元, 程宁勃
Current assignee: Chongqing University of Posts and Telecommunications (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Chongqing University of Posts and Telecommunications
Application filed 2023-09-21 by Chongqing University of Posts and Telecommunications; priority to CN202311229196.7A
Publication of CN117419719A: 2024-01-19

Classifications

    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/1652 Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G01C21/3804 Electronic maps specially adapted for navigation; creation or updating of map data
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S7/497 Means for monitoring or calibrating
    • G06T17/05 Three-dimensional [3D] modelling; geographic models
    • G06T7/136 Segmentation; edge detection involving thresholding
    • G06T2207/10028 Range image; depth image; 3D point clouds
    • G06T2207/10032 Satellite or aerial image; remote sensing
    • G06T2207/10044 Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a three-dimensional laser radar positioning and mapping method that fuses an IMU. First, the laser point cloud is linearly interpolated using the high-frequency pose estimated by the IMU, correcting the motion distortion of the point cloud. Second, a height threshold is introduced in the ground segmentation stage to segment the feature point cloud quickly and accurately, and a dynamic clustering threshold is introduced in the point cloud clustering stage to judge and classify clusters. Third, a Scan-Context descriptor is introduced for loop detection. Finally, the local map formed by key frames is registered against the current frame to obtain the inter-frame matching error, and a joint optimization function combining the IMU pre-integration error and the loop detection constraint is constructed to solve the laser odometry pose. The trajectory and pose accuracy of the improved algorithm and the mainstream LeGO-LOAM scheme are evaluated with the KITTI data set. The measured results show that, while meeting the real-time requirement of the odometry, the improved laser odometry is more accurate than the LeGO-LOAM scheme.

Description

IMU-fused three-dimensional laser radar positioning and mapping method
Technical Field
The invention belongs to the technical field of robot perception and positioning, and particularly relates to a three-dimensional laser radar positioning and mapping method integrating an IMU.
Background
The statements in this section merely provide background information related to the present disclosure and may constitute prior art. In carrying out the present invention, the inventors have found that at least the following problems exist in the prior art.
Simultaneous Localization and Mapping (SLAM) is an important technology in the field of mobile robots. It addresses the mapping and localization problems of a robot in an unknown scene: the robot's pose estimation and the construction of an environment map are completed with the sensors carried by the robot, enabling autonomous operation in unknown environments. The working environment of mobile robots is shifting from single experimental test scenarios to an increasingly complex real world, such as autonomous driving, service robots, and smart agriculture. The uncertainty factors of such environments pose a great challenge to the autonomous positioning and environmental awareness of mobile robots.
Currently, the three main SLAM approaches are camera-based visual SLAM, lidar-based laser SLAM, and SLAM based on multi-sensor fusion. Camera-based visual SLAM is susceptible to lighting interference, which produces erroneous information. Compared with a vision camera, a laser radar can accurately acquire target information, has strong anti-interference capability and a wide detection range, and operates in nearly all weather conditions. Laser SLAM includes two-dimensional and three-dimensional variants; both actively emit multiple laser beams for environment detection and accurate ranging, enabling real-time pose estimation, construction of a three-dimensional map of the surrounding environment, and scene recognition, with a wide range of applications. With the rapid development of laser SLAM technology, three-dimensional laser SLAM is currently one of the most advanced mobile-robot SLAM technologies.
However, three-dimensional laser SLAM degrades in open areas and in structurally similar corridors. This degradation stems from the working principle of the single-sensor laser radar and from the characteristics of such environments: in open and repetitive corridor environments, lidar is susceptible to multipath interference, reflective-surface limitations, environmental noise, and signal power loss, resulting in degraded data and performance.
At present, multiple sensors are mostly adopted to solve these problems. The patent application No. 202211616723.5, "Robot positioning method, device, equipment and storage medium", corrects the robot pose by fusing multiple sensors and adding semantic information to a known map, alleviating the tendency of a single sensor to lose its position and mismatch in similar or feature-poor environments.
Few works, however, have addressed the above issues with a single-sensor lidar.
Disclosure of Invention
In view of the above, it is an object of the present invention to solve some of the problems of the prior art, or at least to alleviate them.
A three-dimensional laser radar positioning and mapping method integrating IMU comprises the following steps:
acquiring an original point cloud acquired by a laser radar and a high-frequency signal acquired by an Inertial Measurement Unit (IMU);
preprocessing the original point cloud data, and performing point cloud segmentation; the point cloud segmentation comprises ground point cloud segmentation and point cloud clustering segmentation, so that an original point cloud is converted into a target point cloud only comprising obstacles;
performing point cloud distortion correction on the target point cloud to obtain a characteristic point cloud for completing motion distortion compensation;
extracting line-surface features by calculating the curvature of the feature point cloud, adding the line-surface features into a local feature map, and registering feature points at continuous moments;
introducing a Scan-Context descriptor to perform loop detection;
and carrying out distortion elimination on the point cloud through state prediction of the IMU, taking the processed radar pose as an inter-frame registration initial value, constructing a jointly optimized error function to accurately solve the radar pose, and generating a final track and a global consistency map.
After acquiring the original point cloud collected by the laser radar and the high-frequency signal collected by the inertial measurement unit IMU, the method further comprises introducing a pre-integration calculation on the high-frequency signal, computed as follows:

$$\Delta p_{ij}=\sum_{k=i}^{j-1}\Big[\Delta v_{ik}\,\Delta t+\tfrac{1}{2}\,\Delta R_{ik}\big(\tilde a_k-b_{ak}-\eta_{ak}\big)\Delta t^{2}\Big]$$

$$\Delta v_{ij}=\sum_{k=i}^{j-1}\Delta R_{ik}\big(\tilde a_k-b_{ak}-\eta_{ak}\big)\Delta t$$

$$q_{ij}=q_i^{-1}\otimes q_j=\prod_{k=i}^{j-1}\begin{bmatrix}1\\ \tfrac{1}{2}\big(\tilde\omega_k-b_{gk}-\eta_{gk}\big)\Delta t\end{bmatrix}$$

where $\Delta p_{ij}$, $\Delta v_{ij}$, $q_{ij}$ are the position, velocity, and rotation changes from the previous instant $i$ to the current instant $j$; $k\in[i,j)$; $\otimes$ denotes quaternion multiplication; $q_i$, $q_j$ are the rotations at the previous instant $i$ and the current instant $j$; $\tilde\omega_k$ is the angular-velocity measurement; $b_{gk}$, $b_{ak}$ are the bias errors of the gyroscope and accelerometer, and $\eta_{gk}$, $\eta_{ak}$ their measurement noises; $\Delta R_{ik}$ is the ideal pre-integrated rotation up to instant $k$, which differs from the measured value by the bias error and noise; $\Delta v_{ik}$ is the ideal pre-integrated velocity, which likewise differs from the measured value; $\Delta t$ is the time interval; $\tilde a_k$ is the accelerometer measurement, and $a^{w}$ is the acceleration of the IMU in the world coordinate system.
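For illustration, the recursion above can be accumulated sample by sample. The following Python sketch is a minimal illustration of that accumulation, assuming bias-corrected measurements and omitting the noise terms $\eta$; the function names are hypothetical and are not taken from the patent's implementation:

```python
import numpy as np

def exp_so3(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    angle = np.linalg.norm(phi)
    if angle < 1e-10:
        return np.eye(3)
    k = phi / angle
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def preintegrate(gyro, acc, dt, b_g, b_a):
    """Accumulate the increments (Delta p_ij, Delta v_ij, Delta R_ij)
    over the IMU samples taken between instants i and j."""
    dp, dv, dR = np.zeros(3), np.zeros(3), np.eye(3)
    for w, a in zip(gyro, acc):
        a_k = a - b_a                        # bias-corrected specific force
        dp = dp + dv * dt + 0.5 * (dR @ a_k) * dt ** 2
        dv = dv + (dR @ a_k) * dt
        dR = dR @ exp_so3((w - b_g) * dt)    # corresponds to Delta R_ik
    return dp, dv, dR
```

Because the increments depend only on the raw measurements and bias estimates, they can be computed once per interval instead of re-propagating the state of instant j at every optimization iteration, which is the computational saving the pre-integration is introduced for.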
Further, the point cloud segmentation includes the following steps:

Ground point cloud segmentation: the original point cloud image obtained after point cloud preprocessing is rapidly divided into a ground point cloud and a target point cloud by the ground segmentation algorithm, and the target point cloud is extracted.

Point cloud clustering segmentation: the target point cloud extracted by the ground point cloud segmentation is cluster-segmented with a clustering segmentation algorithm.

The steps of segmenting the ground point cloud and the target point cloud are as follows:

Determine the angle θ with the ground segmentation algorithm: OK and OM are two adjacent laser beams emitted by the laser radar, with corresponding depths $R_r$ and $R_{r-1}$; θ is the angle between KM and MN, and $\beta_i$, $\beta_j$ are the angles between the laser radar beams and the x-axis. The angle θ is computed as

$$\theta=\arctan\frac{\big|R_r\sin\beta_i-R_{r-1}\sin\beta_j\big|}{\big|R_r\cos\beta_i-R_{r-1}\cos\beta_j\big|}$$

Introduce a height threshold $h_{pc}$ that varies with the lidar mounting height, to filter small obstacles close to the ground:

$$h_{pc}=0.15\,h_L$$

where $h_L$ is the mounting height of the radar.

Distinguish and segment the target point cloud from the ground point cloud: when the angle θ is smaller than 15° and the heights of the two points M and K are both greater than the height threshold $h_{pc}$, or when the angle θ is larger than 15°, the points are judged to belong to the target point cloud; otherwise they belong to the ground point cloud.
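The decision rule can be written compactly as follows. This Python sketch is a hedged illustration of the θ test and the height threshold; the variable names (R_r, beta_i, z_K, …) mirror the text above and are not taken from the patent's implementation:

```python
import math

def is_target(R_r, R_r1, beta_i, beta_j, z_K, z_M, h_L):
    """Ground/target test for two vertically adjacent beam returns K, M
    (depths R_r, R_r1 at beam angles beta_i, beta_j, in radians)."""
    h_pc = 0.15 * h_L                       # height threshold from mounting height
    dz = abs(R_r * math.sin(beta_i) - R_r1 * math.sin(beta_j))
    dx = abs(R_r * math.cos(beta_i) - R_r1 * math.cos(beta_j))
    theta = math.degrees(math.atan2(dz, dx))
    if theta > 15.0:
        return True                         # sloped surface: target point cloud
    return z_K > h_pc and z_M > h_pc        # flat but above threshold: target
```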
Further, a dynamic clustering threshold is introduced into the point cloud clustering segmentation:

$$(N_p,\,N_l)>\begin{cases}(30,\,3), & d>50\,\text{m}\\ (50,\,5), & d\le 50\,\text{m}\end{cases}$$

where $N_p$ is the number of points in a cluster, $N_l$ is the number of scanning laser beams the cluster occupies in the vertical direction, and $d$ is the horizontal distance between the radar and the laser scanning points.
Performing point cloud distortion correction on the target point cloud by utilizing a high-frequency signal generated by the IMU to complete motion distortion compensation; the method comprises the following steps:
simultaneously receiving laser radar data and high-frequency signals of the IMU, and storing the laser radar data and the high-frequency signals;
according to the time stamp of the current point cloud and the sampling frequency of the laser radar, calculating the time of each sampling point;
searching adjacent IMU gesture data in a storage queue according to the current timestamp, and estimating the current laser point gesture through spherical linear interpolation;
transforming all the point cloud poses from the scanning start time to the scanning end time.

Linear interpolation is performed on the pre-integrated poses of the IMU at the start and end instants of the point cloud scan to compute the pose at each corresponding instant within a frame of point cloud. Let the translation vectors at instants $L_{k-1}$ and $L_{k+1}$ be $T_{k-1}$, $T_{k+1}$ and the attitude vectors be $P_{k-1}$, $P_{k+1}$; let the translation vectors at instants $L_{j-1}$ and $L_{j+1}$ be $T_{j-1}$, $T_{j+1}$ and the attitude vectors be $P_{j-1}$, $P_{j+1}$.

The position interpolation formula is

$$T_k=(1-s)\,T_{k-1}+s\,T_{k+1},\qquad s=\frac{t_k-t_{k-1}}{t_{k+1}-t_{k-1}}$$

The spherical linear interpolation formula is

$$P_k=a(t)\,P_{k-1}+b(t)\,P_{k+1}$$

where $P_k$ is the required attitude vector, $P_{k-1}$ and $P_{k+1}$ are the attitudes of the IMU at instants $L_{j-1}$ and $L_{j+1}$, and $a(t)$, $b(t)$ are two coefficient variables that change with the time $t$ (for spherical linear interpolation, $a(t)=\sin((1-s)\Omega)/\sin\Omega$ and $b(t)=\sin(s\Omega)/\sin\Omega$, where $\Omega$ is the angle between the two attitudes).
After the point cloud distortion correction of the target point cloud, non-feature point clouds are filtered out of the motion-compensated feature point cloud and planar features are distinguished from edge features, reducing the amount of data to process and improving computational efficiency. The specific steps are:

rejecting non-feature point clouds: feature points where the laser beam is nearly parallel to the object surface, and feature points of occluded parts, are regarded as unreliable point clouds and are not included in the feature sequence;

distinguishing planar features from edge features: a threshold on the point cloud roughness $c^l_k$ separates planar features from edge features, so that different types of structural information can be extracted from the point cloud data, facilitating better point cloud matching and calculation.

The point cloud roughness $c^l_k$ is computed as

$$c^l_k=\frac{1}{\big|S^l_k\big|\cdot\big\|X^L_{k,i}\big\|}\,\Big\|\sum_{j\in S^l_k,\,j\ne i}\big(X^L_{k,i}-X^L_{k,j}\big)\Big\|$$

where $S^l_k$ is the set of adjacent points on the same scan line at instant $k$; $X^L_{k,i}$ is the current-frame point cloud coordinate in the laser radar reference frame, and $X^L_{k,j}$ are the point clouds adjacent to $X^L_{k,i}$ on the same scan line. When the roughness $c^l_k$ is greater than the threshold, the point is classified as an edge feature; when it is smaller than the threshold, it is classified as a planar feature.
Introducing a Scan-Context descriptor to perform loop detection comprises the following steps:

introducing a Scan-Context descriptor for loop detection: in each region, the average of the 3 largest neighboring height values is selected and used as the entry of the feature matrix;

if the distance of a scan-context pair meets the threshold condition, the matching loop frame is next searched for by Euclidean distance for verification;

if the Euclidean-distance loop matching between the query point cloud and the candidate point cloud succeeds, the loop constraint is added to the nonlinear optimization function so as to minimize the error.
Constructing the jointly optimized error function to accurately solve the radar pose comprises the following steps:

optimizing the inter-frame matching error: a two-frame point cloud registration error is constructed, and to improve the real-time performance of the algorithm, the features of key frames are included in the optimization function;

optimizing the IMU error: an IMU error function is constructed to improve positioning accuracy;

optimizing the loop error: a loop error is constructed to improve positioning accuracy;

constructing a nonlinear cost function from the laser radar constraint, the IMU measurement constraint, and the loop constraint, and solving for the optimal state that minimizes the error function with the Gauss-Newton nonlinear optimization algorithm.
A computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the IMU-fused three-dimensional laser radar positioning and mapping method.
The invention has the following beneficial effects:
1. To address the degradation that a single-sensor laser radar suffers in open areas and similar corridors, the invention uses a single-sensor laser radar and applies a laser odometry method that loosely couples laser and IMU data to SLAM. Motion constraints are provided by the IMU; the IMU pre-integration and the loop detection act together to reduce the inter-frame matching error as far as possible, and a joint optimization function combining the IMU pre-integration error and the loop detection constraint is then constructed to solve the laser odometry pose, ultimately improving positioning accuracy. Simulation results show that the accuracy of this laser odometry is 42% higher than that of the LeGO-LOAM scheme and closer to the real trajectory.
2. For positioning accuracy, the invention makes two specific improvements. First, the laser point cloud is linearly interpolated with the high-frequency pose estimated by the IMU to correct the motion distortion of the point cloud. Second, in the loop detection part, the distance information generated by the Scan-Context descriptor loop detection module is used; if it meets the judgment criterion, a matching loop frame is searched for by Euclidean distance, and after successful verification the loop constraint is added, effectively avoiding the influence of accumulated errors.
3. To reduce the computational load, the invention introduces a pre-integration calculation on the high-frequency IMU signal, a height threshold in the ground segmentation part, and a dynamic clustering threshold for classification in the point cloud clustering segmentation, so that the original point cloud is converted into a target point cloud containing only obstacles for the subsequent feature extraction and matching. Non-feature point clouds are also filtered out of the motion-compensated feature point cloud, which greatly reduces the amount of data to process and improves computational efficiency.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of a ground segmentation algorithm;
FIG. 3 is a schematic diagram of IMU linear interpolation;
FIG. 4 is a graph comparing the trajectory of the present invention with that of LeGO-LOAM on the KITTI 05 sequence.
Detailed Description
The present invention will be further described with reference to the accompanying drawings. The embodiments are given only to illustrate the invention, not to limit it; any substitutions and modifications made by a person of ordinary skill in the art without departing from the technical spirit of the present invention are intended to fall within the scope of the invention.
Aiming at the degradation a three-dimensional laser radar exhibits in open areas and similar corridors, the invention applies a laser odometry method that loosely couples laser and IMU data to SLAM, and provides a single-sensor, IMU-fused three-dimensional laser radar positioning and mapping method.
As shown in fig. 1, a three-dimensional laser radar positioning and mapping method integrating an IMU includes the following steps:
acquiring an original point cloud acquired by a laser radar and a high-frequency signal acquired by an Inertial Measurement Unit (IMU);
preprocessing the original point cloud data, and performing point cloud segmentation; the point cloud segmentation comprises ground point cloud segmentation and point cloud clustering segmentation, so that an original point cloud is converted into a target point cloud only comprising obstacles;
performing point cloud distortion correction on the target point cloud to obtain a characteristic point cloud for completing motion distortion compensation;
extracting line-surface features by calculating the curvature of the feature point cloud, adding the line-surface features into a local feature map, and registering feature points at continuous moments;
introducing a Scan-Context descriptor to perform loop detection;
and carrying out distortion elimination on the point cloud through state prediction of the IMU, taking the processed radar pose as an inter-frame registration initial value, constructing a jointly optimized error function to accurately solve the radar pose, and generating a final track and a global consistency map.
The high-frequency signals collected by the IMU comprise acceleration data, angular velocity data and the like.
Preprocessing the original point cloud data includes projecting the original point cloud to obtain a point cloud image with depth information and removing invalid points from the original point cloud, laying the groundwork for compensating the motion distortion of the point cloud and generating high-quality point cloud frames.
Invalid points may be produced by measurement effects on the laser point cloud, with coordinate values such as NaN or Inf. Because such points seriously interfere with some geometric characteristics, an invalid-point detection and removal step is added to avoid affecting the subsequent point cloud processing. This process is known in the art.
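A minimal sketch of this invalid-point filter, assuming the point cloud is held as an N×3 NumPy array (the function name is illustrative):

```python
import numpy as np

def remove_invalid_points(points: np.ndarray) -> np.ndarray:
    """Drop points with NaN/Inf coordinates before any geometric processing."""
    return points[np.isfinite(points).all(axis=1)]
```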
To avoid re-solving the position, velocity, and rotation changes at instant j during the iterative calculation, and thereby reduce computation, after acquiring the original point cloud collected by the laser radar and the high-frequency signal collected by the inertial measurement unit IMU, a pre-integration calculation is introduced on the high-frequency signal:

$$\Delta p_{ij}=\sum_{k=i}^{j-1}\Big[\Delta v_{ik}\,\Delta t+\tfrac{1}{2}\,\Delta R_{ik}\big(\tilde a_k-b_{ak}-\eta_{ak}\big)\Delta t^{2}\Big]$$

$$\Delta v_{ij}=\sum_{k=i}^{j-1}\Delta R_{ik}\big(\tilde a_k-b_{ak}-\eta_{ak}\big)\Delta t$$

$$q_{ij}=q_i^{-1}\otimes q_j=\prod_{k=i}^{j-1}\begin{bmatrix}1\\ \tfrac{1}{2}\big(\tilde\omega_k-b_{gk}-\eta_{gk}\big)\Delta t\end{bmatrix}$$

where $\Delta p_{ij}$, $\Delta v_{ij}$, $q_{ij}$ are the position, velocity, and rotation changes from the previous instant $i$ to the current instant $j$; $k\in[i,j)$; $\otimes$ denotes quaternion multiplication; $q_i$, $q_j$ are the rotations at the previous instant $i$ and the current instant $j$; $\tilde\omega_k$ is the angular-velocity measurement; $b_{gk}$, $b_{ak}$ are the bias errors of the gyroscope and accelerometer, and $\eta_{gk}$, $\eta_{ak}$ their measurement noises; $\Delta R_{ik}$ is the ideal pre-integrated rotation up to instant $k$, which differs from the measured value by the bias error and noise; $\Delta v_{ik}$ is the ideal pre-integrated velocity, which likewise differs from the measured value; $\Delta t$ is the time interval; $\tilde a_k$ is the accelerometer measurement, and $a^{w}$ is the acceleration of the IMU in the world coordinate system.
The point cloud segmentation comprises the following steps:

Ground point cloud segmentation: the original point cloud image obtained after point cloud preprocessing is rapidly divided into a ground point cloud and a target point cloud by the ground segmentation algorithm, and the target point cloud is extracted. Feature extraction and matching are performed only on the target point cloud, which greatly reduces computation and improves the operating efficiency and real-time performance of the algorithm.

Point cloud clustering segmentation: the target point cloud extracted by the ground point cloud segmentation is cluster-segmented with a clustering segmentation algorithm.
Through ground point cloud segmentation and point cloud clustering, the original point cloud can be converted into a target point cloud only containing obstacles, and the system only performs feature extraction and matching on the target point cloud, so that the calculated amount can be greatly reduced.
As shown in fig. 2, the steps of segmenting the ground point cloud and the target point cloud are as follows:

Determine the angle θ with the ground segmentation algorithm: OK and OM are two adjacent laser beams emitted by the laser radar, with corresponding depths $R_r$ and $R_{r-1}$; θ is the angle between KM and MN, and $\beta_i$, $\beta_j$ are the angles between the laser radar beams and the x-axis. The angle θ is computed as

$$\theta=\arctan\frac{\big|R_r\sin\beta_i-R_{r-1}\sin\beta_j\big|}{\big|R_r\cos\beta_i-R_{r-1}\cos\beta_j\big|}$$

Introduce a height threshold $h_{pc}$ that varies with the lidar mounting height, to filter small obstacles close to the ground:

$$h_{pc}=0.15\,h_L$$

where $h_L$ is the mounting height of the radar.

Distinguish and segment the target point cloud from the ground point cloud: when the angle θ is smaller than 15° and the heights of the two points M and K are both greater than the height threshold $h_{pc}$, or when the angle θ is larger than 15°, the points are judged to belong to the target point cloud; otherwise they belong to the ground point cloud.
Studies have shown that the numbers of laser points and laser beams returned differ on either side of about 50 meters from the laser radar: the farther an object is from the lidar, the fewer laser points and beams it receives. To accurately segment target point clouds at different distances in a large scene, a dynamic clustering threshold is introduced into the point cloud clustering segmentation:

$$(N_p,\,N_l)>\begin{cases}(30,\,3), & d>50\,\text{m}\\ (50,\,5), & d\le 50\,\text{m}\end{cases}$$

where $N_p$ is the number of points in a cluster, $N_l$ is the number of scanning laser beams the cluster occupies in the vertical direction, and $d$ is the horizontal distance between the radar and the laser scanning points. When the distance $d$ is greater than 50 meters, a cluster is accepted as a target point cloud if it contains more than 30 points and spans more than 3 scanning beams in the vertical direction; when $d$ is smaller than 50 meters, the condition is more than 50 points and more than 5 vertical scanning beams. This facilitates the subsequent feature extraction and registration on the target point cloud and reduces the computational load.
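The distance-dependent acceptance rule can be sketched as a small predicate; the thresholds are the ones stated above, and the function name is illustrative:

```python
def is_valid_cluster(n_points: int, n_beams: int, d: float) -> bool:
    """Dynamic clustering threshold: far clusters (d > 50 m) return fewer
    points and beams, so their acceptance thresholds are relaxed."""
    if d > 50.0:
        return n_points > 30 and n_beams > 3
    return n_points > 50 and n_beams > 5
```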
The invention improves the point cloud registration accuracy by having the IMU provide motion constraints.
Performing point cloud distortion correction on the target point cloud by utilizing a high-frequency signal generated by the IMU to complete motion distortion compensation; the method comprises the following steps:
simultaneously receiving laser radar data and high-frequency signals of the IMU, and storing the laser radar data and the high-frequency signals;
according to the time stamp of the current point cloud and the sampling frequency of the laser radar, calculating the time of each sampling point;
searching adjacent IMU gesture data in a storage queue according to the current timestamp, and estimating the current laser point gesture through spherical linear interpolation;
transforming all the point cloud poses from the scanning start time to the scanning end time.

Specifically, linear interpolation is performed on the pre-integrated poses of the IMU at the start and end instants of the point cloud scan to compute the pose at each corresponding instant within a frame of point cloud. Let the translation vectors at instants $L_{k-1}$ and $L_{k+1}$ be $T_{k-1}$, $T_{k+1}$ and the attitude vectors be $P_{k-1}$, $P_{k+1}$; let the translation vectors at instants $L_{j-1}$ and $L_{j+1}$ be $T_{j-1}$, $T_{j+1}$ and the attitude vectors be $P_{j-1}$, $P_{j+1}$.

The position interpolation formula is

$$T_k=(1-s)\,T_{k-1}+s\,T_{k+1},\qquad s=\frac{t_k-t_{k-1}}{t_{k+1}-t_{k-1}}$$

The spherical linear interpolation formula is

$$P_k=a(t)\,P_{k-1}+b(t)\,P_{k+1}$$

where $P_k$ is the required attitude vector, $P_{k-1}$ and $P_{k+1}$ are the attitudes of the IMU at instants $L_{j-1}$ and $L_{j+1}$, and $a(t)$, $b(t)$ are two coefficient variables that change with the time $t$ (for spherical linear interpolation, $a(t)=\sin((1-s)\Omega)/\sin\Omega$ and $b(t)=\sin(s\Omega)/\sin\Omega$, where $\Omega$ is the angle between the two attitudes).
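A sketch of the per-point pose interpolation, assuming unit quaternions stored as 4-vectors; the linear position blend and the slerp coefficients $a(t)$, $b(t)$ follow the formulas above, and the names are illustrative:

```python
import numpy as np

def interpolate_pose(t, t0, t1, T0, T1, q0, q1):
    """Interpolate the pose at sample time t between the two bracketing
    IMU poses (T0, q0) at t0 and (T1, q1) at t1."""
    s = (t - t0) / (t1 - t0)
    T = (1.0 - s) * T0 + s * T1                    # position interpolation
    cos_om = float(np.dot(q0, q1))
    if cos_om < 0.0:                               # take the shorter arc
        q1, cos_om = -q1, -cos_om
    om = np.arccos(np.clip(cos_om, -1.0, 1.0))
    if om < 1e-6:                                  # nearly identical attitudes
        return T, q0
    a = np.sin((1.0 - s) * om) / np.sin(om)        # a(t)
    b = np.sin(s * om) / np.sin(om)                # b(t)
    q = a * q0 + b * q1                            # P_k = a(t)P_{k-1} + b(t)P_{k+1}
    return T, q / np.linalg.norm(q)
```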
The step of receiving the laser radar data and the high-frequency signals of the IMU can be completed in the step of acquiring the initial point cloud acquired by the laser radar and the high-frequency signals acquired by the inertial measurement unit IMU, wherein the high-frequency signals of the IMU comprise triaxial acceleration data, triaxial angular velocity data and the like.
To perform point cloud registration quickly and accurately, after the point cloud distortion correction of the target point cloud, non-feature point clouds are filtered out of the motion-compensated feature point cloud and planar features are distinguished from edge features, reducing the amount of data to process and improving computational efficiency. The specific steps are:

rejecting non-feature point clouds: feature points where the laser beam is nearly parallel to the object surface, and feature points of occluded parts, are regarded as unreliable point clouds and are not included in the feature sequence;

distinguishing planar features from edge features: a threshold on the point cloud roughness $c^l_k$ separates planar features from edge features, so that different types of structural information can be extracted from the point cloud data, facilitating better point cloud matching and calculation.

The point cloud roughness $c^l_k$ is computed as

$$c^l_k=\frac{1}{\big|S^l_k\big|\cdot\big\|X^L_{k,i}\big\|}\,\Big\|\sum_{j\in S^l_k,\,j\ne i}\big(X^L_{k,i}-X^L_{k,j}\big)\Big\|$$

where $S^l_k$ is the set of adjacent points on the same scan line at instant $k$; $X^L_{k,i}$ is the current-frame point cloud coordinate in the laser radar reference frame, and $X^L_{k,j}$ are the point clouds adjacent to $X^L_{k,i}$ on the same scan line. When the roughness $c^l_k$ is greater than the threshold, the point is classified as an edge feature; when it is smaller than the threshold, it is classified as a planar feature.
The purpose of classifying the information in a frame of point cloud into planar features and edge features is to extract different types of structural information, such as an edge feature point set and a planar feature point set, from the point cloud data, so that the previous frame's point cloud can be matched against the current frame's, the point cloud matching calculation is performed better, and the subsequent system performance is greatly improved.
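The roughness computation and the edge/planar split can be sketched as follows, assuming one scan line stored as an N×3 array and an interior point index (boundary handling omitted); the threshold value is a placeholder, not a value from the patent:

```python
import numpy as np

def roughness(scan_line: np.ndarray, i: int, half_window: int = 5) -> float:
    """Roughness c of point i against its neighbors on the same scan line,
    per the formula above; assumes half_window <= i < len(scan_line) - half_window."""
    nbrs = np.vstack((scan_line[i - half_window:i],
                      scan_line[i + 1:i + half_window + 1]))
    diff = np.sum(scan_line[i] - nbrs, axis=0)      # sum of (X_i - X_j)
    return np.linalg.norm(diff) / (len(nbrs) * np.linalg.norm(scan_line[i]))

def classify(scan_line: np.ndarray, i: int, thresh: float = 0.1) -> str:
    """Edge if roughness exceeds the threshold, planar otherwise."""
    return "edge" if roughness(scan_line, i) > thresh else "planar"
```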
Introducing a Scan-Context descriptor to perform loop detection comprises the following steps:

introducing a Scan-Context descriptor for loop detection: in each region, the average of the 3 largest neighboring height values is selected and used as the entry of the feature matrix;

if the distance of a scan-context pair meets the threshold condition, the matching loop frame is next searched for by Euclidean distance for verification;

if the Euclidean-distance loop matching between the query point cloud and the candidate point cloud succeeds, the loop constraint is added to the nonlinear optimization function so as to minimize the error.
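A sketch of building the descriptor matrix (ring × sector bins over the x-y plane, each bin holding the mean of its 3 largest point heights, as described above); the resolution and range parameters are assumptions chosen for illustration:

```python
import numpy as np

def scan_context(points: np.ndarray, n_rings: int = 20,
                 n_sectors: int = 60, max_range: float = 80.0) -> np.ndarray:
    """Build a Scan-Context-style feature matrix from an N x 3 point cloud."""
    desc = np.zeros((n_rings, n_sectors))
    r = np.hypot(points[:, 0], points[:, 1])
    ang = np.arctan2(points[:, 1], points[:, 0]) + np.pi   # angle in [0, 2*pi)
    ring = np.minimum((r / max_range * n_rings).astype(int), n_rings - 1)
    sect = np.minimum((ang / (2 * np.pi) * n_sectors).astype(int), n_sectors - 1)
    for i in range(n_rings):
        for j in range(n_sectors):
            z = points[(ring == i) & (sect == j), 2]
            if z.size:
                desc[i, j] = np.sort(z)[-3:].mean()        # mean of top-3 heights
    return desc
```

Candidate loop frames whose descriptor distance passes the threshold would then be verified by the Euclidean-distance check described above before the loop constraint is added.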
Accurate state estimation is obtained by iteratively optimizing a nonlinear error function, yielding accurate trajectory estimation and map updates. Constructing the jointly optimized error function to accurately solve the radar pose comprises the following steps:

Optimizing the inter-frame matching error: a two-frame point cloud registration error is constructed, and to improve the real-time performance of the algorithm, the features of key frames are included in the optimization function:

$$E_{(L_k,N)}=d_{(l,v)}+d_{(p,u)}$$

where $d_{(l,v)}$ is the point-to-line distance and $d_{(p,u)}$ is the point-to-plane distance; these two distances are taken as the observation values of the keyframe registration error, and $N$ is the total number of feature points extracted from the point cloud of a given keyframe.

Optimizing the IMU error: an IMU error function is constructed to improve positioning accuracy. The IMU errors are the pose error from pre-integration and the bias errors of the gyroscope and accelerometer. In this error function, the IMU pose error from time k-1 to time k is expressed in the radar coordinate system; the terms involved are the pose of the IMU at time k, the bias errors $b_{gk}$ and $b_{ak}$ of the gyroscope and accelerometer, and their measurement noises $\eta_{gk}$ and $\eta_{ak}$.

Optimizing the loop error: a loop error is constructed to improve positioning accuracy. The loop error is caused by inaccurate matching of feature extraction and descriptors, and by feature-point occlusion or position changes due to environmental change.

To improve positioning accuracy, a nonlinear cost function is constructed from the laser radar constraint, the IMU measurement constraint, and the loop constraint based on the above errors, and the optimal state that minimizes the error function is solved with the Gauss-Newton nonlinear optimization algorithm.
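The joint solve can be illustrated with a generic Gauss-Newton loop; here `residual_fn` is assumed to stack the inter-frame matching, IMU pre-integration, and loop residuals into one vector, and `jacobian_fn` to return its Jacobian (both are placeholders for this sketch, not the patent's implementation):

```python
import numpy as np

def gauss_newton(x0, residual_fn, jacobian_fn, iters: int = 10, tol: float = 1e-8):
    """Minimize ||r(x)||^2 for the stacked residual r by Gauss-Newton."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        r = residual_fn(x)                        # stacked residual vector
        J = jacobian_fn(x)                        # Jacobian of r at x
        dx = np.linalg.solve(J.T @ J, -J.T @ r)   # normal equations
        x += dx
        if np.linalg.norm(dx) < tol:              # converged
            break
    return x
```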
In order to better understand the above technical solutions, the following detailed description will refer to the accompanying drawings and specific embodiments.
FIG. 4 compares the trajectory of the present invention with that of LeGO-LOAM on the KITTI 05 sequence; the trajectory and pose accuracy of the improved algorithm and the mainstream LeGO-LOAM scheme are evaluated with the KITTI data set. LeGO-LOAM is a lightweight, ground-optimized lidar SLAM algorithm; the KITTI data set, created jointly by the Karlsruhe Institute of Technology in Germany and the Toyota Technological Institute at Chicago, is currently the largest international computer-vision benchmark data set for autonomous driving scenarios. Simulation results show that, while meeting the real-time requirement of the odometry, the accuracy of the improved laser odometry is 42% higher than that of the LeGO-LOAM scheme, demonstrating that the invention approximates the real trajectory more closely than the LeGO-LOAM algorithm.
A computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the IMU-fused three-dimensional laser radar positioning and mapping method. The corresponding system comprises a front-end data processing module and a back-end optimization module.

The front-end data processing flow comprises:
(1) Data preprocessing module. After receiving the original point cloud generated by the laser radar and the high-frequency signals output by the IMU, including angular velocity and acceleration information, the system projects the original point cloud to generate an image with depth information.
(2) Point cloud segmentation module. It comprises three parts: point cloud preprocessing, ground point cloud segmentation, and point cloud clustering segmentation. The module rapidly extracts non-ground point clouds and removes invalid points in the target point cloud using a clustering segmentation algorithm. To filter small obstacles close to the ground, the invention introduces a height threshold that varies with the radar mounting height, as shown in fig. 2.
(3) Point cloud distortion correction module. Point cloud distortion is removed with the high-frequency signals collected by the IMU to complete the motion distortion compensation. The motion distortion correction steps fusing the IMU and the laser radar are as follows. First, the laser radar data and the high-frequency attitude data of the IMU are received simultaneously and stored. Second, the time of each sampling point is obtained from the time stamp of the current point cloud and the sampling frequency of the laser radar. Finally, adjacent IMU attitude data are searched in the storage queue according to the current timestamp, the current laser point attitude is estimated by spherical linear interpolation, and all point cloud attitudes are transformed from the scan start time to the scan end time. The interpolation method is shown in fig. 3.
(4) Feature point extraction module. Line and plane features are extracted by computing the curvature of the point cloud and added to a local feature map; feature points at consecutive instants are registered to estimate the relative motion of the LiDAR and provide information for the subsequent calibration of feature points. Before features are extracted from the de-distorted LiDAR point cloud, feature points where the laser beam is nearly parallel to the object surface and feature points of occluded parts are regarded as unreliable point clouds and excluded from the feature sequence.
The back-end optimization module comprises:
(1) Loop detection module. Computing the feature similarity of a given depth-image frame is divided into two steps: first the Euclidean-distance score is computed, then the spatial-distance score of the Scan-Context feature descriptors; combining the two scores with their time interval, a threshold decides whether a loop has occurred. If the Euclidean-distance loop matching between the query point cloud and the candidate point cloud succeeds, the loop constraint is added to the nonlinear optimization function so as to minimize the error.
(2) Back-end optimization module. The system receives the current feature frame, the pose estimate, and related information, performs globally consistent optimization, outputs the optimized pose, and generates the final trajectory and a globally consistent map. Three error functions require joint optimization: the inter-frame matching error, the IMU error, and the loop error. The inter-frame matching error is generated when the feature points of consecutive keyframes are matched; the IMU error consists of the pose error from pre-integration and the bias errors of the gyroscope and accelerometer; the loop error is caused by inaccurate matching of feature extraction and descriptors, and by feature-point occlusion or position changes due to environmental change. Because of the high accuracy of the IMU data, the point cloud registration errors can be compensated. Therefore, an IMU sensor is introduced, the IMU-derived radar pose is used as the inter-frame registration initial value, and a joint optimization error function is constructed to accurately solve the radar pose.
In summary, the invention discloses a three-dimensional laser radar positioning and mapping method that fuses an IMU. First, the laser point cloud is linearly interpolated using the high-frequency pose estimated by the IMU, correcting the motion distortion of the point cloud. Second, a height threshold is introduced in the ground segmentation part to segment the feature point cloud quickly and accurately, and a dynamic clustering threshold is introduced in the point cloud clustering to judge and classify clusters. Third, a Scan-Context descriptor is introduced for loop detection. Finally, the local map formed by key frames is registered against the current frame to obtain the inter-frame matching error, and a joint optimization function combining the IMU pre-integration error and the loop detection constraint is constructed to solve the laser odometry pose. The trajectory and pose accuracy of the improved algorithm and the mainstream LeGO-LOAM scheme are evaluated with the KITTI data set. The measured results show that, while meeting the real-time requirement of the odometry, the improved laser odometry is 42% more accurate than the LeGO-LOAM scheme and closer to the real trajectory.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above examples should be understood as illustrative only and not limiting the scope of the invention. Various changes and modifications to the present invention may be made by one skilled in the art after reading the teachings herein, and such equivalent changes and modifications are intended to fall within the scope of the invention as defined in the appended claims.

Claims (10)

1. The method for positioning and mapping the three-dimensional laser radar fused with the IMU is characterized by comprising the following steps of:
acquiring an original point cloud acquired by a laser radar and a high-frequency signal acquired by an Inertial Measurement Unit (IMU);
preprocessing the original point cloud data, and performing point cloud segmentation; the point cloud segmentation comprises ground point cloud segmentation and point cloud clustering segmentation, so that an original point cloud is converted into a target point cloud only comprising obstacles;
performing point cloud distortion correction on the target point cloud to obtain a characteristic point cloud for completing motion distortion compensation;
extracting line-surface features by calculating the curvature of the feature point cloud, adding the line-surface features into a local feature map, and registering feature points at continuous moments;
introducing a Scan-Context descriptor to perform loop detection;
and carrying out distortion elimination on the point cloud through state prediction of the IMU, taking the processed radar pose as an inter-frame registration initial value, constructing a jointly optimized error function to accurately solve the radar pose, and generating a final track and a global consistency map.
2. The IMU-fused three-dimensional laser radar positioning and mapping method according to claim 1, wherein after acquiring the original point cloud collected by the laser radar and the high-frequency signal collected by the inertial measurement unit IMU, the method further comprises introducing a pre-integration calculation on the high-frequency signal, computed as follows:

$$\Delta p_{ij}=\sum_{k=i}^{j-1}\Big[\Delta v_{ik}\,\Delta t+\tfrac{1}{2}\,\Delta R_{ik}\big(\tilde a_k-b_{ak}-\eta_{ak}\big)\Delta t^{2}\Big]$$

$$\Delta v_{ij}=\sum_{k=i}^{j-1}\Delta R_{ik}\big(\tilde a_k-b_{ak}-\eta_{ak}\big)\Delta t$$

$$q_{ij}=q_i^{-1}\otimes q_j=\prod_{k=i}^{j-1}\begin{bmatrix}1\\ \tfrac{1}{2}\big(\tilde\omega_k-b_{gk}-\eta_{gk}\big)\Delta t\end{bmatrix}$$

where $\Delta p_{ij}$, $\Delta v_{ij}$, $q_{ij}$ are the position, velocity, and rotation changes from the previous instant $i$ to the current instant $j$; $k\in[i,j)$; $\otimes$ denotes quaternion multiplication; $q_i$, $q_j$ are the rotations at the previous instant $i$ and the current instant $j$; $\tilde\omega_k$ is the angular-velocity measurement; $b_{gk}$, $b_{ak}$ are the bias errors of the gyroscope and accelerometer, and $\eta_{gk}$, $\eta_{ak}$ their measurement noises; $\Delta R_{ik}$ is the ideal pre-integrated rotation up to instant $k$, which differs from the measured value by the bias error and noise; $\Delta v_{ik}$ is the ideal pre-integrated velocity, which likewise differs from the measured value; $\Delta t$ is the time interval; $\tilde a_k$ is the accelerometer measurement, and $a^{w}$ is the acceleration of the IMU in the world coordinate system.
3. The IMU-fused three-dimensional laser radar positioning and mapping method of claim 1, wherein the point cloud segmentation comprises the steps of:

ground point cloud segmentation: rapidly dividing the original point cloud image obtained after point cloud preprocessing into a ground point cloud and a target point cloud with a ground segmentation algorithm, and extracting the target point cloud;

point cloud clustering segmentation: cluster-segmenting the target point cloud extracted by the ground point cloud segmentation with a clustering segmentation algorithm.
4. The IMU-fused three-dimensional laser radar positioning and mapping method of claim 3, wherein the step of segmenting the ground point cloud and the target point cloud is as follows:

determining the angle θ with the ground segmentation algorithm: OK and OM are two adjacent laser beams emitted by the laser radar, with corresponding depths $R_r$ and $R_{r-1}$; θ is the angle between KM and MN, and $\beta_i$, $\beta_j$ are the angles between the laser radar beams and the x-axis; the angle θ is computed as

$$\theta=\arctan\frac{\big|R_r\sin\beta_i-R_{r-1}\sin\beta_j\big|}{\big|R_r\cos\beta_i-R_{r-1}\cos\beta_j\big|}$$

introducing a height threshold $h_{pc}$ that varies with the lidar mounting height, to filter small obstacles close to the ground:

$$h_{pc}=0.15\,h_L$$

where $h_L$ is the mounting height of the radar;

distinguishing and segmenting the target point cloud and the ground point cloud: when the angle θ is smaller than 15° and the heights of the two points M and K are both greater than the height threshold $h_{pc}$, or when the angle θ is larger than 15°, the points are judged to belong to the target point cloud; otherwise they belong to the ground point cloud.
5. The IMU-fused three-dimensional laser radar positioning and mapping method of claim 3, wherein a dynamic clustering threshold is introduced into the point cloud clustering segmentation:

$$(N_p,\,N_l)>\begin{cases}(30,\,3), & d>50\,\text{m}\\ (50,\,5), & d\le 50\,\text{m}\end{cases}$$

where $N_p$ is the number of points in a cluster, $N_l$ is the number of scanning laser beams the cluster occupies in the vertical direction, and $d$ is the horizontal distance between the radar and the laser scanning points.
6. The method for positioning and mapping the three-dimensional laser radar fused with the IMU according to claim 2, wherein the high-frequency signal generated by the IMU is used for carrying out point cloud distortion correction on the target point cloud so as to complete motion distortion compensation; the method comprises the following steps:
simultaneously receiving laser radar data and high-frequency signals of the IMU, and storing the laser radar data and the high-frequency signals;
according to the time stamp of the current point cloud and the sampling frequency of the laser radar, calculating the time of each sampling point;
searching adjacent IMU gesture data in a storage queue according to the current timestamp, and estimating the current laser point gesture through spherical linear interpolation;
transforming all the point cloud poses from the scanning start time to the scanning end time;

wherein linear interpolation is performed on the pre-integrated poses of the IMU at the start and end instants of the point cloud scan to compute the pose at each corresponding instant within a frame of point cloud; letting the translation vectors at instants $L_{k-1}$ and $L_{k+1}$ be $T_{k-1}$, $T_{k+1}$ and the attitude vectors be $P_{k-1}$, $P_{k+1}$, and the translation vectors at instants $L_{j-1}$ and $L_{j+1}$ be $T_{j-1}$, $T_{j+1}$ and the attitude vectors be $P_{j-1}$, $P_{j+1}$;

the position interpolation formula is

$$T_k=(1-s)\,T_{k-1}+s\,T_{k+1},\qquad s=\frac{t_k-t_{k-1}}{t_{k+1}-t_{k-1}}$$

the spherical linear interpolation formula is

$$P_k=a(t)\,P_{k-1}+b(t)\,P_{k+1}$$

where $P_k$ is the required attitude vector, $P_{k-1}$ and $P_{k+1}$ are the attitudes of the IMU at instants $L_{j-1}$ and $L_{j+1}$, and $a(t)$, $b(t)$ are two coefficient variables that change with the time $t$.
7. The IMU-fused three-dimensional laser radar positioning and mapping method according to claim 1, wherein after the point cloud distortion correction of the target point cloud, the method further comprises filtering non-feature point clouds out of the motion-compensated feature point cloud and distinguishing planar features from edge features, reducing the amount of data to process and improving computational efficiency; the specific steps are:

rejecting non-feature point clouds: feature points where the laser beam is nearly parallel to the object surface, and feature points of occluded parts, are regarded as unreliable point clouds and are not included in the feature sequence;

distinguishing planar features from edge features: a threshold on the point cloud roughness $c^l_k$ separates planar features from edge features, so that different types of structural information can be extracted from the point cloud data, facilitating better point cloud matching and calculation;

the point cloud roughness $c^l_k$ is computed as

$$c^l_k=\frac{1}{\big|S^l_k\big|\cdot\big\|X^L_{k,i}\big\|}\,\Big\|\sum_{j\in S^l_k,\,j\ne i}\big(X^L_{k,i}-X^L_{k,j}\big)\Big\|$$

where $S^l_k$ is the set of adjacent points on the same scan line at instant $k$; $X^L_{k,i}$ is the current-frame point cloud coordinate in the laser radar reference frame, and $X^L_{k,j}$ are the point clouds adjacent to $X^L_{k,i}$ on the same scan line; when the roughness $c^l_k$ is greater than the threshold, the point is classified as an edge feature; when it is smaller than the threshold, it is classified as a planar feature.
8. The IMU-fused three-dimensional laser radar positioning and mapping method according to claim 1, wherein the Scan-Context descriptor is introduced to perform loop detection, comprising the following steps:

introducing a Scan-Context descriptor for loop detection: in each region, the average of the 3 largest neighboring height values is selected and used as the entry of the feature matrix;

if the distance of a scan-context pair meets the threshold condition, the matching loop frame is next searched for by Euclidean distance for verification;

if the Euclidean-distance loop matching between the query point cloud and the candidate point cloud succeeds, the loop constraint is added to the nonlinear optimization function so as to minimize the error.
9. The IMU-fused three-dimensional laser radar positioning and mapping method of claim 1, wherein constructing the jointly optimized error function to accurately solve the radar pose comprises the steps of:

optimizing the inter-frame matching error: constructing a two-frame point cloud registration error, and including the features of key frames in the optimization function to improve the real-time performance of the algorithm;

optimizing the IMU error: constructing an IMU error function to improve positioning accuracy;

optimizing the loop error: constructing a loop error to improve positioning accuracy;

and constructing a nonlinear cost function from the laser radar constraint, the IMU measurement constraint, and the loop constraint, and solving for the optimal state that minimizes the error function with the Gauss-Newton nonlinear optimization algorithm.
10. A computer readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, performs the steps of the IMU-fused three-dimensional laser radar positioning and mapping method of any of claims 1 to 9.
CN202311229196.7A (priority date: 2023-09-21; filing date: 2023-09-21) IMU-fused three-dimensional laser radar positioning and mapping method · Pending · CN117419719A (en)

Priority Applications (1)

Application Number: CN202311229196.7A · Priority Date: 2023-09-21 · Filing Date: 2023-09-21 · Title: IMU-fused three-dimensional laser radar positioning and mapping method

Applications Claiming Priority (1)

Application Number: CN202311229196.7A · Priority Date: 2023-09-21 · Filing Date: 2023-09-21 · Title: IMU-fused three-dimensional laser radar positioning and mapping method

Publications (1)

Publication Number: CN117419719A · Publication Date: 2024-01-19

Family ID: 89529178

Family Applications (1)

Application Number: CN202311229196.7A · Priority Date: 2023-09-21 · Filing Date: 2023-09-21 · Title: IMU-fused three-dimensional laser radar positioning and mapping method

Country Status (1)

Country: CN · Link: CN117419719A (en)


Legal Events

Code: PB01 · Title: Publication
Code: SE01 · Title: Entry into force of request for substantive examination