CN115407357A - Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene

Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene

Info

Publication number: CN115407357A
Application number: CN202210784265.XA
Authority: CN (China)
Prior art keywords: point, IMU, point cloud, points, RTK
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 高旺 (Gao Wang), 刘宏 (Liu Hong), 潘树国 (Pan Shuguo), 乔龙雷 (Qiao Longlei)
Current Assignee: Southeast University
Original Assignee: Southeast University
Priority/filing date: 2022-07-05
Publication date: 2022-11-29
Application filed by Southeast University

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The invention discloses a large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm comprising the following steps: in the initialization stage, RTK position information is used to complete IMU alignment quickly and accurately, and the correct initial pose of the lidar-IMU-RTK system in the global coordinate frame is solved; point cloud features of the surrounding environment are extracted, and corner points and surface points are distinguished by curvature; a local point cloud map is constructed in real time, and the current lidar pose is calculated by dynamically matching the current frame's point cloud against the local map; and a lidar odometry factor, an IMU pre-integration factor and an RTK absolute position factor are constructed in a graph optimization model, the positioning result is solved by optimization, and a real-time point cloud map is stitched. Through the graph optimization model, the invention solves the positioning and mapping problems of low-beam lidar in large, feature-sparse scenes while guaranteeing real-time acquisition.

Description

Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene
Technical Field
The invention belongs to the field of lidar SLAM (Simultaneous Localization and Mapping), relates to multi-sensor fusion technology, and particularly relates to a large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm.
Background
Accurate and reliable positioning is a basic requirement of mobile robots and autonomous driving. In recent years, simultaneous localization and mapping with vision and laser has developed greatly. Visual positioning can achieve six-degree-of-freedom state estimation with only a camera, but it is constrained in many environments and easily affected by factors such as system initialization, illumination and low-texture features. A laser sensor directly obtains depth information, has higher resolution, can work at night, and enables more accurate pose estimation.
Early point cloud matching schemes relied on the iterative closest point (ICP) algorithm, which easily falls into local minima and requires long registration times. LOAM, published in 2014, first adopted point-to-line and point-to-plane feature matching for pose calculation; during mapping, the frame-to-frame (scan-to-scan) odometry result serves as an initial value, and frame-to-map (scan-to-map) matching updates and maintains the point cloud map. LIO-SAM, released in 2020, fuses the information of three sensors, lidar, IMU and GNSS: the IMU odometry is computed at high frequency, the lidar odometry performs feature matching, and GNSS and loop-closure information suppress drift of the positioning result during mapping, yielding a point cloud map with good global consistency.
As the above shows, lidar SLAM mainly adopts a front end that extracts corner and surface features, with predicted values provided by an IMU odometry or by scan-to-scan matching. In the optimization stage, a more accurate positioning result is obtained by matching the current frame against a local map, and the back end generally fuses multi-source information within a factor graph framework. Existing algorithms, however, mainly target small-scene motion. In complex scenes the carrier's environment changes greatly, accumulated errors arise easily, and robustness is poor. A reliable laser SLAM technique for large-scene urban environments that guarantees positioning and mapping accuracy therefore still needs to be explored.
Disclosure of Invention
To solve these problems, the invention discloses a large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm which, according to the complexity of current urban-environment positioning, exploits the complementary information of the low-beam lidar, the IMU and RTK, and acquires a reliable positioning and mapping result in real time.
In order to achieve the purpose, the technical scheme of the invention is as follows:
The large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm comprises the following steps:
step 1, during initialization, completing IMU alignment quickly and accurately using RTK position information, and solving the correct initial pose of the lidar-IMU-RTK system in the global coordinate frame;
step 2, extracting point cloud features of the surrounding environment, distinguishing corner points from surface points by the curvature of the scanned points, and removing point cloud distortion using IMU data;
step 3, constructing a local point cloud map in real time, applying voxel filtering to the map and the current frame to improve subsequent computational efficiency, and iteratively calculating the current lidar pose by dynamically matching the current frame's point cloud against the local map;
and step 4, constructing a lidar odometry factor, an IMU pre-integration factor and an RTK absolute position factor in a graph optimization model, solving the positioning result by optimization, and stitching the real-time point cloud map to obtain a globally consistent map.
The method comprises the following specific steps:
Step 1. Fast and accurate IMU initial alignment
$$C_n^b = \begin{bmatrix} a^b & b^b & c^b \end{bmatrix} \begin{bmatrix} a^n & b^n & c^n \end{bmatrix}^{-1} \tag{1}$$

where $C_n^b$ represents the transformation matrix from the navigation coordinate system to the carrier coordinate system; $a^b$, $b^b$, $c^b$ represent position coordinates in the carrier coordinate system; and $a^n$, $b^n$, $c^n$ are the coordinate values of the corresponding points in the navigation coordinate system. Initial alignment determines the attitude of the inertial device in the navigation coordinate system and provides the initial attitude for lidar-IMU-RTK fused positioning in the global frame.
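By way of illustration only (this sketch is not part of the original disclosure), Eq. (1) can be evaluated directly once the same three non-coplanar vectors are known in both frames; the function and variable names below are assumed for the example, and the SVD re-orthonormalization is an added numerical safeguard:

```python
import numpy as np

def initial_alignment(vecs_b, vecs_n):
    """Solve Eq. (1): [a^b b^b c^b] = C_n^b [a^n b^n c^n].

    vecs_b, vecs_n: 3x3 arrays whose columns are the same three
    non-coplanar vectors expressed in the carrier (b) frame and the
    navigation (n) frame respectively.
    """
    C_nb = vecs_b @ np.linalg.inv(vecs_n)
    # Measured vectors are noisy, so project the result back onto SO(3).
    U, _, Vt = np.linalg.svd(C_nb)
    return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
```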
Step 2, extracting surrounding point cloud environment features, and distinguishing angular points and surface points according to curvature of scanning points
Environmental point cloud features are extracted, and corner points and surface points are distinguished by the curvature of each scanned point, calculated as

$$c = \frac{1}{|S| \cdot \|r_i\|} \left\| \sum_{j \in S,\, j \neq i} \left( r_j - r_i \right) \right\| \tag{2}$$

where $S$ is the set of neighboring points of the scanned point on the same beam and $|S|$ its cardinality; $r_i$ is the depth value of the point, and $r_j$ the depth value of a neighboring point $j \in S$. The curvature of every point scanned in each lidar frame is computed in turn by this formula, and parallel points and occluded points are removed using each point's depth information. To improve traversal efficiency, the coordinates of each frame's point cloud are projected into a two-dimensional image that is divided into six sub-images, and corner points and surface points are extracted from each sub-image separately. Classification uses a threshold $c_{th}$: points with curvature $c > c_{th}$ are taken as corner points and points with $c < c_{th}$ as surface points. Because the carrier moves during each frame's scan, the point cloud is distorted; motion compensation is achieved by short-time integration of the high-frequency IMU data.
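A minimal sketch of the curvature test of Eq. (2) on a single beam follows (illustrative only; the window size and threshold value are assumptions, not values from the disclosure):

```python
import numpy as np

def label_beam(r, half_window=5, c_th=0.1):
    """Label each point of one scan line as corner or surface via Eq. (2).

    r: 1-D array of depth values r_i along a single beam; border points
    that lack a full neighborhood keep the default "surface" label.
    """
    labels = np.full(len(r), "surface", dtype=object)
    for i in range(half_window, len(r) - half_window):
        S = np.r_[r[i - half_window:i], r[i + 1:i + half_window + 1]]
        c = abs(np.sum(S - r[i])) / (len(S) * abs(r[i]))   # Eq. (2)
        labels[i] = "corner" if c > c_th else "surface"
    return labels
```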
Step 3, constructing a local point cloud map, and optimizing the pose of point cloud registration
Keyframe information within a certain range is merged in a sliding-window manner. To improve the point cloud matching rate, the point cloud of the local map is stored in a kd-tree data structure for fast subsequent searching. The motion estimation module solves the motion state by matching the current frame against the local map.
For each corner point, the five nearest points in the local map are found and their mean and covariance matrix are computed. When a line feature satisfying the conditions exists, the distance from the current corner point to the line is calculated as

$$p_n = \frac{u_{\max}}{\|u_{\max}\|} \tag{3}$$

$$f(p_\varepsilon) = \big\| (p_\varepsilon - \bar{p}_\varepsilon) \times p_n \big\| \tag{4}$$

where $\cdot$ and $\times$ denote the dot-product and cross-product operations, $p_n$ is the unit direction vector of the line, $p_\varepsilon$ is the current corner point, and $u_{\max}$ is the eigenvector corresponding to the largest eigenvalue of the covariance matrix. In the same way, for a surface point $p_s$, five points forming a plane are sought in the local map; the eigenvector $u_{\min}$ corresponding to the smallest eigenvalue of the five points' covariance matrix is the normal vector of that plane. The point-to-plane distance is calculated as

$$f(p_s) = \big| (p_s - \bar{p}_s) \cdot u_{\min} \big| \tag{5}$$

In the formulas, $\bar{p}_\varepsilon$ and $\bar{p}_s$ denote the geometric centers of the corner points and surface points extracted from the local map. Jointly minimizing the point-to-line residual $f(p_\varepsilon)$ and the point-to-plane residual $f(p_s)$ yields the optimal pose estimate $T_k$ of the current frame on the local map:

$$T_k = \arg\min \Big\{ \sum f(p_\varepsilon) + \sum f(p_s) \Big\} \tag{6}$$
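The residuals of Eqs. (3)-(5) can be sketched against a kd-tree of map features as follows (illustrative only: the scipy cKDTree usage and the eigenvalue-ratio gate that accepts a line feature are assumptions, not part of the disclosure):

```python
import numpy as np
from scipy.spatial import cKDTree

def corner_residual(p_eps, corners, tree):
    """Point-to-line distance f(p_eps) of Eqs. (3)-(4)."""
    _, idx = tree.query(p_eps, k=5)
    pts = corners[idx]
    center = pts.mean(axis=0)                      # geometric center
    w, v = np.linalg.eigh(np.cov((pts - center).T))
    if w[2] < 3.0 * w[1]:                          # no dominant line direction
        return None
    p_n = v[:, 2]                                  # Eq. (3): unit line direction
    return np.linalg.norm(np.cross(p_eps - center, p_n))   # Eq. (4)

def surface_residual(p_s, surfaces, tree):
    """Point-to-plane distance f(p_s) of Eq. (5)."""
    _, idx = tree.query(p_s, k=5)
    pts = surfaces[idx]
    center = pts.mean(axis=0)
    w, v = np.linalg.eigh(np.cov((pts - center).T))
    n = v[:, 0]                                    # normal: smallest eigenvalue
    return abs(np.dot(p_s - center, n))

# The trees are rebuilt from the map's corner and surface points as the
# sliding window updates the local map, e.g. tree = cKDTree(corners).
```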
Step 4. Solve the positioning result with the graph optimization model and stitch the point cloud map in real time
For the $n$ keyframes within the window, the state quantities $x = [x_1, x_2, \ldots, x_n]$ are optimized by the minimization

$$\min_x \Big\{ \big\| R_p(x) \big\|^2 + \sum \big\| R_L(x) \big\|^2 + \sum \big\| R_I(x) \big\|^2 + \sum \big\| R_g(x) \big\|^2 \Big\} \tag{7}$$

where $R_p(x)$ is the marginalization residual; marginalization reduces the scale of the optimization problem, and the keyframe information shifted out of the sliding window is retained as prior information for the current optimization. $R_L(x)$ represents the inter-frame constraint of the lidar odometry within the sliding window, with residual

$$R_L(x) = \Delta T_{ij}^{-1} \big( T_i^{-1} T_j \big) \tag{8}$$

where $i$ and $j$ denote the previous and the current frame, $T$ is the Euclidean (homogeneous) matrix representation of the pose, and $\Delta T$ is the pose increment. $R_I(x)$ represents the IMU pre-integration constraint within the sliding window, with residual

$$R_I(x) = \begin{bmatrix} \delta\alpha \\ \delta\beta \\ \delta\theta \\ \delta b_a \\ \delta b_w \end{bmatrix} = \begin{bmatrix} R_W^{b_i}\big(p_j^W - p_i^W - v_i^W \Delta t + \tfrac{1}{2} g^W \Delta t^2\big) - \Delta\alpha_{ij} \\ R_W^{b_i}\big(v_j^W - v_i^W + g^W \Delta t\big) - \Delta\beta_{ij} \\ 2\big[(q_i^W)^{-1} \otimes q_j^W \otimes (\Delta q_{ij})^{-1}\big]_{xyz} \\ b_{a,j} - b_{a,i} \\ b_{w,j} - b_{w,i} \end{bmatrix} \tag{9}$$

where $R_W^{b_i}$ is the attitude represented as a rotation matrix, $W$ is the world coordinate system, and $b_i$, $b_j$ denote the carrier frames of the $i$-th and $j$-th frames. $\delta\alpha$, $\delta\beta$ and $\delta\theta$ represent the variations of position, velocity and angle, and $\delta b_a$ and $\delta b_w$ represent the bias changes of the accelerometer and the gyroscope. $\Delta t$ is the time interval between the two frames, $g^W$ is the gravity value in the world coordinate system, and $p^W$, $v^W$ and $q^W$ are the position, the velocity and the quaternion-represented attitude in the world frame. $[\,\cdot\,]_{xyz}$ takes the imaginary part of a quaternion, and $\Delta\alpha_{ij}$, $\Delta\beta_{ij}$ and $\Delta q_{ij}$ are the position, velocity and angle increments between two adjacent frames computed by IMU pre-integration. The estimated accelerometer and gyroscope biases $b_a$ and $b_w$ are solved simultaneously for each frame.

$R_g(x)$ represents the RTK absolute position residual constraint:

$$R_g(x) = \hat{p}_k^{RTK} - p_k^W \tag{10}$$

where $\hat{p}_k^{RTK}$ is the position result of the RTK measurement. The weight of the RTK factor in the optimization problem varies with its covariance.

According to Eq. (7), the state quantity of every frame in the sliding window is solved with the Ceres optimization library, the pose of the acquired point cloud is updated in real time, the lidar points scanned while the carrier is in motion are transformed into the world coordinate system, and a globally consistent point cloud map is stitched together frame by frame.
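To make the factor-graph structure of Eq. (7) concrete, the toy sketch below optimizes position-only states over a small window with scipy's least_squares standing in for Ceres; it keeps only R_L-style relative factors and covariance-weighted R_g-style absolute factors, omitting the marginalization, IMU and attitude terms of the full problem. All names and values are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def make_residual(n, odom, rtk, sigma):
    """Stack relative lidar-odometry factors and absolute RTK factors."""
    def residual(x):
        p = x.reshape(n, 3)
        r = [(p[i + 1] - p[i]) - odom[i] for i in range(n - 1)]   # ~R_L
        r += [(p[k] - z) / sigma[k] for k, z in rtk.items()]      # ~R_g
        return np.concatenate(r)
    return residual

# Toy window: 4 keyframes moving ~1 m per frame along x, with RTK fixes
# (small sigma, i.e. large weight) on the first and last keyframe.
n = 4
odom = [np.array([1.0, 0.0, 0.0])] * (n - 1)
rtk = {0: np.zeros(3), 3: np.array([3.05, 0.0, 0.0])}
sigma = {0: 0.02, 3: 0.02}
sol = least_squares(make_residual(n, odom, rtk, sigma), np.zeros(n * 3))
print(sol.x.reshape(n, 3))   # odometry drift is pulled onto the RTK fixes
```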
The beneficial effects of the invention include:
the large-scene-based low-beam laser radar-IMU-RTK positioning mapping algorithm provided by the invention utilizes a mapping optimization model and reasonably utilizes data information of three sensors, overcomes the defects that the low-beam laser radar is easy to position and diverge in a large scene, RTK is shielded in an urban environment signal, and IMU is accumulated and drifts for a long time, and realizes a SLAM system for high-precision positioning and mapping. The method is used for vehicle-mounted test in urban environment, positioning accuracy and robustness can be obviously improved, and the method has important reference significance for industries such as three-dimensional live-action surveying and mapping, automatic driving and the like.
Drawings
FIG. 1 is a flow chart of the large-scene low-beam lidar-IMU-RTK positioning and mapping algorithm of the invention;
FIG. 2 is a schematic diagram of the back-end graph optimization model used to solve the positioning result;
FIG. 3 shows the lidar-IMU-RTK experimental equipment for data acquisition and real-time positioning and mapping;
FIG. 4 is a schematic diagram of large-scene positioning and mapping by the experimental system in the vehicle-mounted test; high-precision results are output in real time and the system runs stably.
Detailed Description
The present invention will be further illustrated with reference to the accompanying drawings and specific embodiments, which are to be understood as merely illustrative of the invention and not as limiting the scope of the invention.
As shown in FIG. 1, this embodiment discloses a low-beam lidar-IMU-RTK positioning and mapping algorithm comprising the following specific steps:
Step 1. Fast and accurate IMU initial alignment
$$C_n^b = \begin{bmatrix} a^b & b^b & c^b \end{bmatrix} \begin{bmatrix} a^n & b^n & c^n \end{bmatrix}^{-1} \tag{1}$$

where $C_n^b$ represents the transformation matrix from the navigation coordinate system to the carrier coordinate system; $a^b$, $b^b$, $c^b$ represent position coordinates in the carrier coordinate system; and $a^n$, $b^n$, $c^n$ are the coordinate values of the corresponding points in the navigation coordinate system. Initial alignment determines the attitude of the inertial device in the navigation coordinate system and provides the initial attitude for lidar-IMU-RTK fused positioning in the global frame.
Step 2, extracting the environmental characteristics of the surrounding point cloud, and distinguishing an angular point and a surface point according to the curvature of the scanning point
Environmental point cloud features are extracted, and corner points and surface points are distinguished by the curvature of each scanned point, calculated as

$$c = \frac{1}{|S| \cdot \|r_i\|} \left\| \sum_{j \in S,\, j \neq i} \left( r_j - r_i \right) \right\| \tag{2}$$

where $S$ is the set of neighboring points of the scanned point on the same beam and $|S|$ its cardinality; $r_i$ is the depth value of the point, and $r_j$ the depth value of a neighboring point $j \in S$. The curvature of every point scanned in each lidar frame is computed in turn by this formula, and parallel points and occluded points are removed using each point's depth information. To improve traversal efficiency, the coordinates of each frame's point cloud are projected into a two-dimensional image that is divided into six sub-images, and corner points and surface points are extracted from each sub-image separately. Classification uses a threshold $c_{th}$: points with curvature $c > c_{th}$ are taken as corner points and points with $c < c_{th}$ as surface points. Because the carrier moves during each frame's scan, the point cloud is distorted; motion compensation is achieved by short-time integration of the high-frequency IMU data.
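The short-time IMU integration mentioned above amounts to a per-point de-skew of the frame. A minimal rotation-only sketch follows (translation compensation omitted; the scipy Rotation/Slerp usage and per-point timestamps are assumptions for the example):

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew(points, stamps, rot0, rot1, t0, t1):
    """Rotate every point of one frame back into the scan-start pose.

    rot0, rot1: attitudes at scan start/end (t0 < t1), obtained by
    integrating the high-frequency IMU; stamps: per-point capture times.
    """
    slerp = Slerp([t0, t1], Rotation.concatenate([rot0, rot1]))
    rel = rot0.inv() * slerp(np.clip(stamps, t0, t1))
    return np.einsum('nij,nj->ni', rel.as_matrix(), points)
```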
Step 3, constructing a local point cloud map, and carrying out point cloud registration to optimize pose
Keyframe information within a certain range is merged in a sliding-window manner. To improve the point cloud matching rate, the point cloud of the local map is stored in a kd-tree data structure for fast subsequent searching. The motion estimation module solves the motion state by matching the current frame against the local map.
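Before matching, step 3 also downsamples the local map and the current frame by voxel filtering; a minimal centroid voxel filter follows (the leaf size is an illustrative value, not from the disclosure):

```python
import numpy as np

def voxel_filter(points, leaf=0.4):
    """Replace all points falling in one voxel by their centroid."""
    keys = np.floor(points / leaf).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.reshape(-1)
    counts = np.bincount(inv).astype(float)
    return np.stack(
        [np.bincount(inv, weights=points[:, d]) / counts for d in range(3)],
        axis=1)
```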
For each corner point, the five nearest points in the local map are found and their mean and covariance matrix are computed. When a line feature satisfying the conditions exists, the distance from the current corner point to the line is calculated as

$$p_n = \frac{u_{\max}}{\|u_{\max}\|} \tag{3}$$

$$f(p_\varepsilon) = \big\| (p_\varepsilon - \bar{p}_\varepsilon) \times p_n \big\| \tag{4}$$

where $\cdot$ and $\times$ denote the dot-product and cross-product operations, $p_n$ is the unit direction vector of the line, $p_\varepsilon$ is the current corner point, and $u_{\max}$ is the eigenvector corresponding to the largest eigenvalue of the covariance matrix. In the same way, for a surface point $p_s$, five points forming a plane are sought in the local map; the eigenvector $u_{\min}$ corresponding to the smallest eigenvalue of the five points' covariance matrix is the normal vector of that plane. The point-to-plane distance is calculated as

$$f(p_s) = \big| (p_s - \bar{p}_s) \cdot u_{\min} \big| \tag{5}$$

In the formulas, $\bar{p}_\varepsilon$ and $\bar{p}_s$ denote the geometric centers of the corner points and surface points extracted from the local map. Jointly minimizing the point-to-line residual $f(p_\varepsilon)$ and the point-to-plane residual $f(p_s)$ yields the optimal pose estimate $T_k$ of the current frame on the local map:

$$T_k = \arg\min \Big\{ \sum f(p_\varepsilon) + \sum f(p_s) \Big\} \tag{6}$$
Step 4. Solve the positioning result with the graph optimization model and stitch the point cloud map in real time
For the $n$ keyframes within the window, the state quantities $x = [x_1, x_2, \ldots, x_n]$ are optimized by the minimization

$$\min_x \Big\{ \big\| R_p(x) \big\|^2 + \sum \big\| R_L(x) \big\|^2 + \sum \big\| R_I(x) \big\|^2 + \sum \big\| R_g(x) \big\|^2 \Big\} \tag{7}$$

where $R_p(x)$ is the marginalization residual; marginalization reduces the scale of the optimization problem, and the keyframe information shifted out of the sliding window is retained as prior information for the current optimization. $R_L(x)$ represents the inter-frame constraint of the lidar odometry within the sliding window, with residual

$$R_L(x) = \Delta T_{ij}^{-1} \big( T_i^{-1} T_j \big) \tag{8}$$

where $i$ and $j$ denote the previous and the current frame, $T$ is the Euclidean (homogeneous) matrix representation of the pose, and $\Delta T$ is the pose increment. $R_I(x)$ represents the IMU pre-integration constraint within the sliding window, with residual

$$R_I(x) = \begin{bmatrix} \delta\alpha \\ \delta\beta \\ \delta\theta \\ \delta b_a \\ \delta b_w \end{bmatrix} = \begin{bmatrix} R_W^{b_i}\big(p_j^W - p_i^W - v_i^W \Delta t + \tfrac{1}{2} g^W \Delta t^2\big) - \Delta\alpha_{ij} \\ R_W^{b_i}\big(v_j^W - v_i^W + g^W \Delta t\big) - \Delta\beta_{ij} \\ 2\big[(q_i^W)^{-1} \otimes q_j^W \otimes (\Delta q_{ij})^{-1}\big]_{xyz} \\ b_{a,j} - b_{a,i} \\ b_{w,j} - b_{w,i} \end{bmatrix} \tag{9}$$

where $R_W^{b_i}$ is the attitude represented as a rotation matrix, $W$ is the world coordinate system, and $b_i$, $b_j$ denote the carrier frames of the $i$-th and $j$-th frames. $\delta\alpha$, $\delta\beta$ and $\delta\theta$ represent the variations of position, velocity and angle, and $\delta b_a$ and $\delta b_w$ represent the bias changes of the accelerometer and the gyroscope. $\Delta t$ is the time interval between the two frames, $g^W$ is the gravity value in the world coordinate system, and $p^W$, $v^W$ and $q^W$ are the position, the velocity and the quaternion-represented attitude in the world frame. $[\,\cdot\,]_{xyz}$ takes the imaginary part of a quaternion, and $\Delta\alpha_{ij}$, $\Delta\beta_{ij}$ and $\Delta q_{ij}$ are the position, velocity and angle increments between two adjacent frames computed by IMU pre-integration. The estimated accelerometer and gyroscope biases $b_a$ and $b_w$ are solved simultaneously for each frame.
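For illustration, the position and velocity rows of Eq. (9) translate directly into code; the sketch below uses assumed names and omits the quaternion and bias rows:

```python
import numpy as np

def preintegration_residual(R_w_bi, p_i, p_j, v_i, v_j, g_w, dt,
                            alpha_ij, beta_ij):
    """delta_alpha and delta_beta rows of Eq. (9).

    R_w_bi: 3x3 rotation taking world-frame vectors into body frame i;
    alpha_ij, beta_ij: IMU pre-integrated position/velocity increments.
    """
    d_alpha = R_w_bi @ (p_j - p_i - v_i * dt + 0.5 * g_w * dt ** 2) - alpha_ij
    d_beta = R_w_bi @ (v_j - v_i + g_w * dt) - beta_ij
    return np.concatenate([d_alpha, d_beta])
```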
$R_g(x)$ represents the RTK absolute position residual constraint:

$$R_g(x) = \hat{p}_k^{RTK} - p_k^W \tag{10}$$

where $\hat{p}_k^{RTK}$ is the position result of the RTK measurement. The weight of the RTK factor in the optimization problem varies with its covariance.

According to Eq. (7), the state quantity of every frame in the sliding window is solved with the Ceres optimization library, the pose of the acquired point cloud is updated in real time, the lidar points scanned while the carrier is in motion are transformed into the world coordinate system, and a globally consistent point cloud map is stitched together frame by frame.
FIG. 3 shows the experimental equipment for positioning and mapping in the urban environment: a Velodyne VLP-16 lidar, an ADIS16488 inertial navigation unit and a Trimble receiver. The installation parameters between the sensors are known, and time synchronization is achieved in hardware. FIG. 4 shows the positioning and mapping results of the vehicle-mounted platform in a large scene. In the vehicle-mounted test over 23.6 km of urban environment (including RTK-occluded overpass and tree-shaded scenes and large-scene environments with sparse, degraded point clouds), the whole-course positioning accuracy (root mean square error, RMSE) is 0.24 m. Meanwhile, the system reconstructs the surrounding scene from the point cloud in real time, enabling mapping and perception for autonomous driving.
It should be noted that the above content only illustrates the technical idea of the present invention and does not thereby limit its scope of protection; those skilled in the art may make several improvements and refinements without departing from the principle of the invention, and such improvements and refinements fall within the protection scope of the claims of the present invention.

Claims (5)

1. A large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm, characterized by comprising the following specific steps:
step 1, completing IMU alignment during initialization using RTK position information, and solving the correct initial pose of the lidar-IMU-RTK system in the global coordinate frame;
step 2, extracting point cloud features of the surrounding environment, distinguishing corner points from surface points by the curvature of the scanned points, and removing point cloud distortion using IMU data;
step 3, constructing a local point cloud map in real time, applying voxel filtering to the map and the current frame to improve subsequent computational efficiency, and iteratively calculating the current lidar pose by dynamically matching the current frame's point cloud against the local map;
and step 4, constructing a lidar odometry factor, an IMU pre-integration factor and an RTK absolute position factor in a graph optimization model, solving the positioning result by optimization, and stitching the real-time point cloud map to obtain a globally consistent map.
2. The large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm of claim 1, wherein the IMU initial alignment of step 1 is solved by

$$C_n^b = \begin{bmatrix} a^b & b^b & c^b \end{bmatrix} \begin{bmatrix} a^n & b^n & c^n \end{bmatrix}^{-1} \tag{1}$$

where $C_n^b$ represents the transformation matrix from the navigation coordinate system to the carrier coordinate system; $a^b$, $b^b$, $c^b$ represent position coordinates in the carrier coordinate system; and $a^n$, $b^n$, $c^n$ are the coordinate values of the corresponding points in the navigation coordinate system; initial alignment determines the attitude of the inertial device in the navigation coordinate system and yields the initial attitude of the lidar-IMU-RTK fused positioning in the global frame.
3. The large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm of claim 1, wherein step 2 extracts environmental point cloud features and distinguishes corner points from surface points by the curvature of the scanned points, the curvature $c$ being calculated as

$$c = \frac{1}{|S| \cdot \|r_i\|} \left\| \sum_{j \in S,\, j \neq i} \left( r_j - r_i \right) \right\| \tag{2}$$

where $S$ is the set of neighboring points of the scanned point on the same beam and $|S|$ its cardinality, $r_i$ is the depth value of the point, and $r_j$ the depth value of a neighboring point on the same beam; the curvature of every point scanned in each lidar frame is computed in turn by this formula, parallel points and occluded points are removed using each point's depth information, the coordinates of each frame's point cloud are projected into a two-dimensional image divided into six sub-images to improve traversal efficiency, corner points and surface points are extracted from each sub-image separately, and classification uses a threshold $c_{th}$: points with $c > c_{th}$ are corner points and points with $c < c_{th}$ are surface points; point cloud distortion occurs within each frame's scan due to carrier motion, and motion compensation is achieved by short-time integration of the high-frequency IMU data.
4. The large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm of claim 1, wherein step 3 constructs a local point cloud map in real time, applies voxel filtering to the map and the current frame to improve subsequent computational efficiency, and iteratively calculates the current lidar pose by dynamically matching the current frame's point cloud against the local map; keyframe information within a certain range is merged in a sliding-window manner, and the point cloud of the local map is stored in a kd-tree data structure to improve the point cloud matching rate and facilitate subsequent searching;
the motion estimation module solves the motion state by matching the current frame against the local map; for each corner point, the five nearest points in the local map are found and their mean and covariance matrix are computed, and when a line feature satisfying the conditions exists, the distance from the current corner point to the line is calculated as

$$p_n = \frac{u_{\max}}{\|u_{\max}\|} \tag{3}$$

$$f(p_\varepsilon) = \big\| (p_\varepsilon - \bar{p}_\varepsilon) \times p_n \big\| \tag{4}$$

where $\cdot$ and $\times$ denote the dot-product and cross-product operations, $p_n$ is the unit direction vector of the line, $p_\varepsilon$ is the current corner point, and $u_{\max}$ is the eigenvector corresponding to the largest eigenvalue of the covariance matrix; in the same way, for a surface point $p_s$, five points forming a plane are sought in the local map, the eigenvector $u_{\min}$ corresponding to the smallest eigenvalue of the five points' covariance matrix being the normal vector of that plane, and the point-to-plane distance is calculated according to formula (5):

$$f(p_s) = \big| (p_s - \bar{p}_s) \cdot u_{\min} \big| \tag{5}$$

where $\bar{p}_\varepsilon$ and $\bar{p}_s$ denote the geometric centers of the corner points and surface points extracted from the local map; jointly minimizing the point-to-line residual $f(p_\varepsilon)$ and the point-to-plane residual $f(p_s)$ yields the optimal pose estimate $T_k$ of the current frame on the local map, solved as

$$T_k = \arg\min \Big\{ \sum f(p_\varepsilon) + \sum f(p_s) \Big\} \tag{6}$$
5. The large-scene-based low-beam lidar-IMU-RTK positioning and mapping algorithm of claim 1, wherein the graph optimization model of step 4 constructs a lidar odometry factor, an IMU pre-integration factor and an RTK absolute position factor, solves the positioning result by optimization and stitches the point cloud map in real time; for the $n$ keyframes within the window, the state quantities $x = [x_1, x_2, \ldots, x_n]$ are optimized by the minimization

$$\min_x \Big\{ \big\| R_p(x) \big\|^2 + \sum \big\| R_L(x) \big\|^2 + \sum \big\| R_I(x) \big\|^2 + \sum \big\| R_g(x) \big\|^2 \Big\} \tag{7}$$

where $R_p(x)$ is the marginalization residual, marginalization reducing the scale of the optimization problem while the keyframe information shifted out of the sliding window is retained as prior information for the current optimization;

$R_L(x)$ represents the inter-frame constraint of the lidar odometry within the sliding window, with residual

$$R_L(x) = \Delta T_{ij}^{-1} \big( T_i^{-1} T_j \big) \tag{8}$$

where $i$ and $j$ denote the previous and the current frame, $T$ is the Euclidean (homogeneous) matrix representation of the pose, and $\Delta T$ is the pose increment;

$R_I(x)$ represents the IMU pre-integration constraint within the sliding window, with residual

$$R_I(x) = \begin{bmatrix} \delta\alpha \\ \delta\beta \\ \delta\theta \\ \delta b_a \\ \delta b_w \end{bmatrix} = \begin{bmatrix} R_W^{b_i}\big(p_j^W - p_i^W - v_i^W \Delta t + \tfrac{1}{2} g^W \Delta t^2\big) - \Delta\alpha_{ij} \\ R_W^{b_i}\big(v_j^W - v_i^W + g^W \Delta t\big) - \Delta\beta_{ij} \\ 2\big[(q_i^W)^{-1} \otimes q_j^W \otimes (\Delta q_{ij})^{-1}\big]_{xyz} \\ b_{a,j} - b_{a,i} \\ b_{w,j} - b_{w,i} \end{bmatrix} \tag{9}$$

where $R_W^{b_i}$ is the attitude represented as a rotation matrix, $W$ is the world coordinate system, $b_i$ and $b_j$ denote the carrier frames of the $i$-th and $j$-th frames, $\delta\alpha$, $\delta\beta$ and $\delta\theta$ represent the variations of position, velocity and angle, $\delta b_a$ and $\delta b_w$ represent the bias changes of the accelerometer and the gyroscope, $\Delta t$ is the time interval between the two frames, $g^W$ is the gravity value in the world coordinate system, $p^W$, $v^W$ and $q^W$ are the position, velocity and quaternion-represented attitude in the world frame, $[\,\cdot\,]_{xyz}$ takes the imaginary part of a quaternion, $\Delta\alpha_{ij}$, $\Delta\beta_{ij}$ and $\Delta q_{ij}$ are the position, velocity and angle increments between two adjacent frames computed by IMU pre-integration, and the estimated accelerometer and gyroscope biases $b_a$ and $b_w$ are solved simultaneously;

$R_g(x)$ represents the RTK absolute position residual constraint:

$$R_g(x) = \hat{p}_k^{RTK} - p_k^W \tag{10}$$

where $\hat{p}_k^{RTK}$ is the position result of the RTK measurement, the weight of the RTK factor in the optimization problem varying with its covariance;

according to Eq. (7), the state quantity of every frame in the sliding window is solved with the Ceres optimization library, the pose of the acquired point cloud is updated in real time, the lidar points scanned while the carrier is in motion are transformed into the world coordinate system, and a globally consistent point cloud map is stitched together frame by frame.
CN202210784265.XA 2022-07-05 2022-07-05 Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene Pending CN115407357A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210784265.XA CN115407357A (en) 2022-07-05 2022-07-05 Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210784265.XA CN115407357A (en) 2022-07-05 2022-07-05 Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene

Publications (1)

Publication Number Publication Date
CN115407357A (en) 2022-11-29

Family

ID=84157404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210784265.XA Pending CN115407357A (en) 2022-07-05 2022-07-05 Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene

Country Status (1)

Country Link
CN (1) CN115407357A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115561731A (en) * 2022-12-05 2023-01-03 安徽蔚来智驾科技有限公司 Pose optimization method, point cloud map establishment method, computer device and medium
CN115561731B (en) * 2022-12-05 2023-03-10 安徽蔚来智驾科技有限公司 Pose optimization method, point cloud map establishment method, computer device and medium
CN115797425A (en) * 2023-01-19 2023-03-14 中国科学技术大学 Laser global positioning method based on point cloud aerial view and rough-to-fine strategy
CN116592888A (en) * 2023-05-08 2023-08-15 五八智能科技(杭州)有限公司 Global positioning method, system, device and medium for patrol robot
CN117250623A (en) * 2023-11-20 2023-12-19 山东新坐标智能装备有限公司 Positioning method, system and mobile robot for fusion of laser radar and complementary positioning
CN117250623B (en) * 2023-11-20 2024-02-27 苏州新坐标智能装备有限公司 Positioning method, system and mobile robot for fusion of laser radar and complementary positioning
CN117433511A (en) * 2023-12-20 2024-01-23 绘见科技(深圳)有限公司 Multi-sensor fusion positioning method
CN117433511B (en) * 2023-12-20 2024-03-12 绘见科技(深圳)有限公司 Multi-sensor fusion positioning method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination