CN112859051A - Method for correcting laser radar point cloud motion distortion

Method for correcting laser radar point cloud motion distortion

Info

Publication number
CN112859051A
Authority
CN
China
Prior art keywords
point cloud
frame
pose
laser radar
imu
Prior art date
Legal status
Granted
Application number
CN202110030119.3A
Other languages
Chinese (zh)
Other versions
CN112859051B (en)
Inventor
刘飞
周志全
邹钰杰
屈婧婧
柴文静
杨起鸣
严景琳
Current Assignee
Guilin University of Electronic Technology
Original Assignee
Guilin University of Electronic Technology
Priority date
Filing date
Publication date
Application filed by Guilin University of Electronic Technology filed Critical Guilin University of Electronic Technology
Priority to CN202110030119.3A
Publication of CN112859051A
Application granted
Publication of CN112859051B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass; initial alignment, calibration or starting-up of inertial devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manufacturing & Machinery (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A method for correcting laser radar point cloud motion distortion belongs to the technical field of unmanned vehicle automatic driving. It addresses the problem that the external sensors used for laser radar point cloud motion distortion correction accumulate large displacement errors. The method comprises the following steps: an RTK is used as a pulse generator to synchronously trigger the laser radar and the IMU; a global coordinate system is established with the first frame of the laser radar point cloud as its origin; a local coordinate system is established for each non-first frame point cloud, the corresponding IMU coordinate system is converted into that local coordinate system, and the IMU poses are obtained synchronously; the IMU pose corresponding to each data point in the current frame point cloud is obtained by interpolation, and the motion distortion of the current laser radar frame is corrected according to these poses; the local coordinate system is converted into the global coordinate system and the IMU pose is calibrated with the RTK pose; the next frame is then corrected, until the motion distortion of the last laser radar frame has been corrected. The invention enhances the robustness and stability of laser radar point cloud correction.

Description

Method for correcting laser radar point cloud motion distortion
Technical Field
The invention relates to a method for correcting laser radar point cloud motion distortion, and belongs to the technical field of unmanned vehicle automatic driving.
Background
With the development of artificial intelligence, the laser radar has attracted wide attention for its advantages of high resolution, long measurement range, immunity to illumination and strong anti-interference capability.
In the field of unmanned vehicle automatic driving, the laser radar can build map information of an unknown environment, providing a good foundation for subsequent positioning, navigation, path planning and the like. When a laser radar based on mechanical rotary scanning moves with the unmanned vehicle, its coordinate system changes continuously, so the points within one frame of point cloud are not expressed in the same coordinate system and motion distortion arises; the acquired point cloud data therefore need motion distortion correction.
At present, to handle the motion distortion of a multi-line laser radar, a single inertial measurement unit (IMU), or an IMU combined with a wheel-speed odometer, is typically used as the external sensor, and the motion of the laser radar is deduced under the assumption that the unmanned vehicle changes only its heading angle, so that the motion distortion of the laser radar point cloud can be corrected. However, both the IMU and the wheel-speed odometer obtain displacement through integration, and therefore suffer from a large accumulated displacement error during long-term operation.
Disclosure of Invention
The invention provides a method for correcting laser radar point cloud motion distortion, aimed at the problem that the external sensor adopted in the existing correction methods accumulates a large displacement error.
The invention relates to a method for correcting laser radar point cloud motion distortion, which comprises the following steps,
the method comprises the following steps: an RTK is used as a pulse generator, and a laser radar and an IMU are synchronously triggered; establishing a global coordinate system for the collected first frame point cloud of the laser radar by taking the first frame point cloud as an origin;
step two: collecting current frame point clouds of a laser radar of a non-first frame, and establishing a local coordinate system by taking the current frame point clouds as an origin; simultaneously converting an IMU coordinate system corresponding to the current frame point cloud acquisition moment into the local coordinate system; synchronously obtaining IMU poses;
step three: obtaining an IMU (inertial measurement unit) pose corresponding to each data point in the current frame point cloud by adopting an interpolation method according to the local coordinate system, and correcting the motion distortion of the current frame point cloud of the laser radar according to the IMU pose;
step four: converting the local coordinate system into the global coordinate system, and calibrating the pose of the IMU by using the RTK pose; and then returning to the step two until the correction of the point cloud motion distortion of the laser radar of the last frame is finished.
According to the method for correcting the laser radar point cloud motion distortion of the invention,
in the second step, for the collected laser radar point cloud, the coordinate of the i-th point of the k-th frame in the local coordinate system is

$p_i^k = (x_{ki}, y_{ki}, z_{ki})$,

where $x_{ki}$, $y_{ki}$ and $z_{ki}$ are respectively the X-, Y- and Z-axis coordinates of the i-th point of the k-th frame in the local coordinate system, $k \in (1,2,3,\dots,m)$, $i \in (1,2,3,\dots,n)$, $m$ is the total number of point cloud frames, and $n$ is the number of data points contained in one frame of data;

the laser radar point cloud is expressed in matrix form as:

$p_i^k = [x_{ki} \;\; y_{ki} \;\; z_{ki} \;\; 1]$
according to the method for correcting the laser radar point cloud motion distortion of the invention,
in the second step, the IMU pose is expressed in matrix form as:

$$P_i^k = \begin{bmatrix} R_i^k & T_i^k \\ 0 & 1 \end{bmatrix}$$

where $P_i^k$ is the pose of the i-th point of the k-th frame of the laser radar point cloud, $R_i^k$ is the rotation matrix from the i-th point of the k-th frame of the laser radar point cloud to the corresponding local coordinate system, and $T_i^k$ is the translation matrix from the i-th point of the k-th frame of the laser radar point cloud to the corresponding local coordinate system;

the rotation matrix $R_i^k$ is determined by the angles $\gamma$, $\beta$ and $\alpha$, where $\gamma$ is the angle by which the i-th point of the k-th frame of the laser radar point cloud rotates around the Z axis of the local coordinate system, $\beta$ is the angle by which it rotates around the Y axis of the local coordinate system, and $\alpha$ is the angle by which it rotates around the X axis of the local coordinate system;

the translation matrix is $T_i^k = [s_x \;\; s_y \;\; s_z]^T$, where $s_x$ is the displacement of the i-th point of the k-th frame of the laser radar point cloud along the X axis of the local coordinate system, $s_y$ is its displacement along the Y axis of the local coordinate system, and $s_z$ is its displacement along the Z axis of the local coordinate system;

when $i = 0$,

$$P_0^k = \begin{bmatrix} R_0^k & T_0^k \\ 0 & 1 \end{bmatrix}$$

where $P_0^k$ is the starting pose of the current k-th frame of the laser radar point cloud, $R_0^k$ is the rotation matrix from the current k-th frame of laser radar point cloud data to the corresponding local coordinate system, and $T_0^k$ is the translation matrix from the current k-th frame of laser radar point cloud data to the corresponding local coordinate system.
According to the method for correcting the laser radar point cloud motion distortion of the invention,
correcting the laser radar point cloud motion distortion according to the IMU pose in the third step comprises the following steps:
recording the frame-head timestamp of the k-th frame of the laser radar point cloud as $t_d$, the mid-frame timestamp as $t_e$, and the frame-tail timestamp as $t_f$; the IMU poses of the frames immediately before and after the frame head of the k-th frame, the IMU poses of the frames immediately before and after the middle of the k-th frame, and the IMU poses of the frames immediately before and after the frame tail of the k-th frame are taken, and these IMU poses are interpolated to obtain the poses $P_d^k$, $P_e^k$ and $P_f^k$ at the head, middle and tail of the k-th frame;

quadratic curve fitting is then carried out to obtain the point cloud pose curve fitting equation:

$$P_t^k = At^2 + Bt + C,$$

where $P_t^k$ is the pose corresponding to each data point in the k-th frame point cloud, $t$ is time, $A$ is the quadratic coefficient, $B$ is the linear coefficient, and $C$ is a constant;

the motion distortion of the current laser radar frame is corrected according to the point cloud pose curve fitting equation, giving:

$$\tilde{p}_i^k = (P_0^k)^{-1} \, P_i^k \, p_i^k,$$

where $\tilde{p}_i^k$ represents the corrected point cloud coordinate of the i-th point of the k-th frame.
According to the method for correcting the laser radar point cloud motion distortion of the invention,
the step four of utilizing the RTK pose to calibrate the IMU pose comprises the following steps:
judging whether the timestamp of the IMU pose is smaller than the timestamp of the RTK pose according to the RTK pose and the IMU pose in the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, taking the RTK pose as an IMU pose observation value; if the time stamp of the IMU pose is larger than or equal to the time stamp of the RTK pose, searching two frames of RTK poses before and after the current frame IMU pose time stamp, and using the pose obtained by RTK interpolation as an IMU pose observation value;
and performing Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain a calibrated IMU pose.
The invention has the beneficial effects that: the method utilizes IMU pose interpolation to obtain the poses of all points of each frame in a local coordinate system, further converts each frame of point cloud after distortion removal into the local coordinate system of the current frame, converts each frame of local coordinate system into the global coordinate system, and finally completes the correction of the laser radar point cloud motion distortion.
The method uses the real-time kinematic carrier phase differential technique (RTK), whose plane and elevation accuracy can reach the centimeter level without accumulated error, and which can serve as a clock source to unify the time of every sensor in hardware. After the IMU pose is corrected in real time with the RTK, the displacement, heading angle, pitch angle and roll angle of the unmanned vehicle can be calculated from the IMU and used to correct the laser radar point cloud motion distortion. When the unmanned vehicle drives into a severely occluded area, the operation of the RTK is affected to a certain extent, but the RTK-corrected IMU can still provide a fairly accurate pose, which enhances the robustness and stability of the system.
Drawings
FIG. 1 is an overall flow chart of the correction method for the laser radar point cloud motion distortion;
FIG. 2 is a flowchart of an embodiment of a method for correcting the point cloud motion distortion of a laser radar;
FIG. 3 is a flow chart of correcting the laser radar point cloud distortion by interpolating IMU poses;
FIG. 4 is a flow chart for calibrating IMU pose using RTK pose.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The invention is further described with reference to the following drawings and specific examples, which are not intended to be limiting.
First embodiment, as shown in fig. 1 and fig. 2, the present invention provides a method for correcting laser radar point cloud motion distortion, including,
the method comprises the following steps: an RTK is used as a pulse generator, and the laser radar and the IMU are triggered synchronously; a global coordinate system is established for the collected first frame point cloud of the laser radar, taking the first frame point cloud as the origin; after each RTK trigger the clock is corrected, so that hardware time synchronization to a single clock source is completed;
step two: collecting current frame point clouds of a laser radar of a non-first frame, and establishing a local coordinate system by taking the current frame point clouds as an origin; simultaneously converting an IMU coordinate system corresponding to the current frame point cloud acquisition moment into the local coordinate system; synchronously obtaining IMU poses;
step three: obtaining an IMU (inertial measurement unit) pose corresponding to each data point in the current frame point cloud by adopting an interpolation method according to the local coordinate system, and correcting the motion distortion of the current frame point cloud of the laser radar according to the IMU pose;
step four: converting the local coordinate system into the global coordinate system, and calibrating the pose of the IMU by using the RTK pose; and then returning to the step two until the correction of the point cloud motion distortion of the laser radar of the last frame is finished.
The method corrects the laser radar point cloud distortion based on the inertial measurement unit (IMU) and the real-time kinematic carrier phase differential technique (RTK). The RTK plays two roles: it provides time synchronization, and, because the IMU pose drifts over time and becomes inaccurate during long operation, it is also used to correct the IMU pose. As a result, even when the RTK is severely occluded and cannot deliver data, the previously corrected IMU continues to provide trustworthy data. In this way the three sensors, namely the laser radar, the RTK and the IMU, operate complementarily.
In this embodiment, when the laser radar point cloud is collected, it is first judged whether the current point cloud is the first frame; the global coordinate system is established from the first frame point cloud, and a local coordinate system is established for every non-first frame point cloud. After each correction is finished, it is judged whether the current frame point cloud is the last frame; if not, the correction process is repeated, and if it is, the procedure ends. After the correction of each frame of point cloud is completed, the IMU pose is calibrated once, until all correction tasks are finished.
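Purely as an illustration of this per-frame flow (not part of the patent disclosure), the following Python sketch shows how the four steps can be organized; the callables passed in (get_imu_poses_local, undistort_frame, to_global, calibrate_imu_with_rtk) are hypothetical placeholders for the implementations of steps two to four.

```python
def process_frames(lidar_frames, get_imu_poses_local, undistort_frame,
                   to_global, calibrate_imu_with_rtk):
    """Skeleton of the per-frame flow; the callables are supplied by the
    implementations of steps two to four (hypothetical names)."""
    corrected_frames = []
    for k, frame in enumerate(lidar_frames):
        if k == 0:
            # Step one: the first frame defines the global coordinate system.
            corrected_frames.append(frame)
            continue
        # Step two: express the IMU poses in the local coordinate system of this frame.
        imu_poses = get_imu_poses_local(frame)
        # Step three: interpolate per-point IMU poses and remove the motion distortion.
        corrected = undistort_frame(frame, imu_poses)
        # Step four: convert the local coordinate system into the global one
        # and calibrate the IMU pose with the RTK pose before the next frame.
        corrected_frames.append(to_global(corrected))
        calibrate_imu_with_rtk(imu_poses)
    return corrected_frames
```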
Further, in the second step, for the collected laser radar point cloud, the coordinate of the i-th point of the k-th frame in the local coordinate system is

$p_i^k = (x_{ki}, y_{ki}, z_{ki})$,

where $x_{ki}$, $y_{ki}$ and $z_{ki}$ are respectively the X-, Y- and Z-axis coordinates of the i-th point of the k-th frame in the local coordinate system, $k \in (1,2,3,\dots,m)$, $i \in (1,2,3,\dots,n)$, $m$ is the total number of point cloud frames, and $n$ is the number of data points contained in one frame of data;

the laser radar point cloud is expressed in matrix form as:

$p_i^k = [x_{ki} \;\; y_{ki} \;\; z_{ki} \;\; 1]$

The final element 1 appended in the matrix form makes the point a one-row, four-column homogeneous coordinate, which facilitates the subsequent calculations.
Still further, in step two, the IMU pose is expressed in matrix form as:

$$P_i^k = \begin{bmatrix} R_i^k & T_i^k \\ 0 & 1 \end{bmatrix}$$

where $P_i^k$ is the pose of the i-th point of the k-th frame of the laser radar point cloud, $R_i^k$ is the rotation matrix from the i-th point of the k-th frame of the laser radar point cloud to the corresponding local coordinate system, and $T_i^k$ is the translation matrix from the i-th point of the k-th frame of the laser radar point cloud to the corresponding local coordinate system;

the rotation matrix $R_i^k$ is determined by the angles $\gamma$, $\beta$ and $\alpha$, where $\gamma$ is the angle by which the i-th point of the k-th frame of the laser radar point cloud rotates around the Z axis of the local coordinate system, $\beta$ is the angle by which it rotates around the Y axis of the local coordinate system, and $\alpha$ is the angle by which it rotates around the X axis of the local coordinate system;

the translation matrix is $T_i^k = [s_x \;\; s_y \;\; s_z]^T$, where $s_x$ is the displacement of the i-th point of the k-th frame of the laser radar point cloud along the X axis of the local coordinate system, $s_y$ is its displacement along the Y axis of the local coordinate system, and $s_z$ is its displacement along the Z axis of the local coordinate system;

when $i = 0$,

$$P_0^k = \begin{bmatrix} R_0^k & T_0^k \\ 0 & 1 \end{bmatrix}$$

where $P_0^k$ is the starting pose of the current k-th frame of the laser radar point cloud, $R_0^k$ is the rotation matrix from the current k-th frame of laser radar point cloud data to the corresponding local coordinate system, and $T_0^k$ is the translation matrix from the current k-th frame of laser radar point cloud data to the corresponding local coordinate system.
Each frame of data of the lidar point cloud has its own local coordinate system.
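For illustration, the pose matrix above can be assembled from the three rotation angles and the three displacements as in the sketch below; the Z-Y-X composition order of the elementary rotations is an assumption, since the description only states which axis each angle refers to.

```python
import numpy as np

def pose_matrix(gamma, beta, alpha, sx, sy, sz):
    """4x4 pose P_i^k built from rotations gamma (about Z), beta (about Y),
    alpha (about X) and displacements sx, sy, sz in the local coordinate
    system. The Z-Y-X composition order is an assumption."""
    Rz = np.array([[np.cos(gamma), -np.sin(gamma), 0.0],
                   [np.sin(gamma),  np.cos(gamma), 0.0],
                   [0.0,            0.0,           1.0]])
    Ry = np.array([[ np.cos(beta), 0.0, np.sin(beta)],
                   [ 0.0,          1.0, 0.0         ],
                   [-np.sin(beta), 0.0, np.cos(beta)]])
    Rx = np.array([[1.0, 0.0,            0.0           ],
                   [0.0, np.cos(alpha), -np.sin(alpha)],
                   [0.0, np.sin(alpha),  np.cos(alpha)]])
    P = np.eye(4)
    P[:3, :3] = Rz @ Ry @ Rx        # rotation matrix R_i^k
    P[:3, 3] = [sx, sy, sz]         # translation T_i^k
    return P

# A lidar point in the homogeneous one-row, four-column form used in the text:
p = np.array([1.0, 2.0, 0.5, 1.0])
```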
Still further, with reference to fig. 3, the correcting the laser radar point cloud motion distortion according to the IMU pose in step three includes:
the frame-head timestamp of the k-th frame of the laser radar point cloud is recorded as $t_d$, the mid-frame timestamp as $t_e$, and the frame-tail timestamp as $t_f$; the IMU poses of the frames immediately before and after the frame head of the k-th frame, the IMU poses of the frames immediately before and after the middle of the k-th frame, and the IMU poses of the frames immediately before and after the frame tail of the k-th frame are taken, and these IMU poses are interpolated to obtain the poses $P_d^k$, $P_e^k$ and $P_f^k$ at the head, middle and tail of the k-th frame;

quadratic curve fitting is then carried out to obtain the point cloud pose curve fitting equation:

$$P_t^k = At^2 + Bt + C,$$

where $P_t^k$ is the pose corresponding to each data point in the k-th frame point cloud, $t$ is time, $A$ is the quadratic coefficient, $B$ is the linear coefficient, and $C$ is a constant;

the motion distortion of the current laser radar frame is corrected according to the point cloud pose curve fitting equation, giving:

$$\tilde{p}_i^k = (P_0^k)^{-1} \, P_i^k \, p_i^k,$$

where $\tilde{p}_i^k$ represents the corrected point cloud coordinate of the i-th point of the k-th frame, $(P_0^k)^{-1}$ represents the inverse of the starting pose matrix of the current k-th frame, $P_i^k$ represents the pose of the i-th point of the current k-th frame, and $p_i^k$ represents the point cloud coordinate of the i-th point of the current k-th frame before correction. Thus, the correction of the motion distortion of the current k-th frame point cloud is completed.
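A minimal numerical sketch of this interpolation and correction is given below; it assumes, for fitting purposes only, that a pose is represented as a 6-vector [x, y, z, roll, pitch, yaw], and that the fitted pose at a point's timestamp is converted back into the 4x4 matrix P_i^k (for example with the construction sketched earlier) before the correction is applied. The function names are illustrative.

```python
import numpy as np

def fit_pose_curve(t_d, t_e, t_f, pose_d, pose_e, pose_f):
    """Solve for A, B, C in P(t) = A*t^2 + B*t + C, componentwise, so the
    curve passes through the head, middle and tail poses (each pose given
    here as a 6-vector [x, y, z, roll, pitch, yaw])."""
    T = np.array([[t_d**2, t_d, 1.0],
                  [t_e**2, t_e, 1.0],
                  [t_f**2, t_f, 1.0]])
    coeffs = np.linalg.solve(T, np.vstack([pose_d, pose_e, pose_f]))
    return coeffs  # rows are A, B, C (each a 6-vector)

def pose_at(coeffs, t):
    """Evaluate the fitted pose curve at a point's timestamp t."""
    A, B, C = coeffs
    return A * t**2 + B * t + C

def undistort_point(P0_k, Pi_k, p_h):
    """Map the i-th point into the frame-start coordinate system:
    corrected = inv(P_0^k) @ P_i^k @ p, with p a homogeneous 4-vector."""
    return np.linalg.inv(P0_k) @ Pi_k @ p_h
```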
Still further, as shown in fig. 4, calibrating the pose of the IMU by using the RTK pose in step four includes:
judging whether the timestamp of the IMU pose is smaller than the timestamp of the RTK pose according to the RTK pose and the IMU pose in the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, taking the RTK pose as an IMU pose observation value; if the time stamp of the IMU pose is larger than or equal to the time stamp of the RTK pose, searching two frames of RTK poses before and after the current frame IMU pose time stamp, and using the pose obtained by RTK interpolation as an IMU pose observation value;
and performing Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain a calibrated IMU pose.
In this embodiment, the IMU pose in the original local coordinate system is used as the predicted value; Kalman fusion of the IMU pose prediction and the IMU pose observation is then carried out, and the calibrated IMU pose is finally obtained.
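As a rough illustration of this calibration step, the sketch below selects the RTK-derived observation according to the timestamp rule above and blends it with the IMU prediction in one Kalman-style update; the actual filter design (state vector, covariances, gain) is not specified here, so the simple per-component gain is an assumption and the function names are hypothetical.

```python
import numpy as np

def select_observation(t_imu, t_rtk, pose_rtk, rtk_buffer):
    """Observation rule of step four.
    t_rtk / pose_rtk : timestamp and pose of the current RTK measurement.
    rtk_buffer       : time-sorted (timestamp, pose) pairs already received,
                       which may include an RTK pose later than t_imu.
    If the IMU pose is older than the current RTK pose, the RTK pose itself
    is the observation; otherwise the two RTK poses bracketing t_imu are
    searched and the observation is obtained by linear interpolation."""
    if t_imu < t_rtk:
        return pose_rtk
    for (t0, p0), (t1, p1) in zip(rtk_buffer[:-1], rtk_buffer[1:]):
        if t0 <= t_imu <= t1:
            w = (t_imu - t0) / (t1 - t0)
            return (1.0 - w) * p0 + w * p1
    return None  # bracketing RTK poses not yet available

def kalman_fuse(pred, obs, P_pred, R_obs):
    """One Kalman-style update: pred = IMU pose (prediction), obs = RTK-derived
    observation, P_pred / R_obs = per-component variances (kept diagonal here
    purely for simplicity)."""
    K = P_pred / (P_pred + R_obs)          # gain
    fused = pred + K * (obs - pred)        # calibrated IMU pose
    P_new = (1.0 - K) * P_pred             # updated uncertainty
    return fused, P_new
```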
In the present invention, as an example, the update frequency of the laser radar may be 10 Hz and the update frequency of the IMU may be 300 Hz, so that roughly thirty IMU pose samples fall within each laser radar frame and are available for interpolation.
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. It should be understood that features described in different dependent claims and herein may be combined in ways different from those described in the original claims. It is also to be understood that features described in connection with individual embodiments may be used in other described embodiments.

Claims (5)

1. A method for correcting the point cloud motion distortion of laser radar comprises,
the method comprises the following steps: an RTK is used as a pulse generator, and a laser radar and an IMU are synchronously triggered; establishing a global coordinate system for the collected first frame point cloud of the laser radar by taking the first frame point cloud as an origin;
step two: collecting current frame point clouds of a laser radar of a non-first frame, and establishing a local coordinate system by taking the current frame point clouds as an origin; simultaneously converting an IMU coordinate system corresponding to the current frame point cloud acquisition moment into the local coordinate system; synchronously obtaining IMU poses;
step three: obtaining an IMU (inertial measurement unit) pose corresponding to each data point in the current frame point cloud by adopting an interpolation method according to the local coordinate system, and correcting the motion distortion of the current frame point cloud of the laser radar according to the IMU pose;
step four: converting the local coordinate system into the global coordinate system, and calibrating the pose of the IMU by using the RTK pose; and then returning to the step two until the correction of the point cloud motion distortion of the laser radar of the last frame is finished.
2. The method for correcting laser radar point cloud motion distortion according to claim 1,
in the second step, for the collected laser radar point cloud, the coordinate of the i-th point of the k-th frame in the local coordinate system is

$p_i^k = (x_{ki}, y_{ki}, z_{ki})$,

where $x_{ki}$, $y_{ki}$ and $z_{ki}$ are respectively the X-, Y- and Z-axis coordinates of the i-th point of the k-th frame in the local coordinate system, $k \in (1,2,3,\dots,m)$, $i \in (1,2,3,\dots,n)$, $m$ is the total number of point cloud frames, and $n$ is the number of data points contained in one frame of data;

the laser radar point cloud is expressed in matrix form as:

$p_i^k = [x_{ki} \;\; y_{ki} \;\; z_{ki} \;\; 1]$
3. the method for correcting laser radar point cloud motion distortion according to claim 2,
in the second step, the IMU poses are expressed in a matrix form as follows:
$$P_i^k = \begin{bmatrix} R_i^k & T_i^k \\ 0 & 1 \end{bmatrix}$$

where $P_i^k$ is the pose of the i-th point of the k-th frame of the laser radar point cloud, $R_i^k$ is the rotation matrix from the i-th point of the k-th frame of the laser radar point cloud to the corresponding local coordinate system, and $T_i^k$ is the translation matrix from the i-th point of the k-th frame of the laser radar point cloud to the corresponding local coordinate system;

the rotation matrix $R_i^k$ is determined by the angles $\gamma$, $\beta$ and $\alpha$, where $\gamma$ is the angle by which the i-th point of the k-th frame of the laser radar point cloud rotates around the Z axis of the local coordinate system, $\beta$ is the angle by which it rotates around the Y axis of the local coordinate system, and $\alpha$ is the angle by which it rotates around the X axis of the local coordinate system;

the translation matrix is $T_i^k = [s_x \;\; s_y \;\; s_z]^T$, where $s_x$ is the displacement of the i-th point of the k-th frame of the laser radar point cloud along the X axis of the local coordinate system, $s_y$ is its displacement along the Y axis of the local coordinate system, and $s_z$ is its displacement along the Z axis of the local coordinate system;

when $i = 0$,

$$P_0^k = \begin{bmatrix} R_0^k & T_0^k \\ 0 & 1 \end{bmatrix}$$

where $P_0^k$ is the starting pose of the current k-th frame of the laser radar point cloud, $R_0^k$ is the rotation matrix from the current k-th frame of laser radar point cloud data to the corresponding local coordinate system, and $T_0^k$ is the translation matrix from the current k-th frame of laser radar point cloud data to the corresponding local coordinate system.
4. The method for correcting laser radar point cloud motion distortion according to claim 3,
correcting the laser radar point cloud motion distortion according to the IMU pose in the third step comprises the following steps:
recording the frame-head timestamp of the k-th frame of the laser radar point cloud as $t_d$, the mid-frame timestamp as $t_e$, and the frame-tail timestamp as $t_f$; the IMU poses of the frames immediately before and after the frame head of the k-th frame, the IMU poses of the frames immediately before and after the middle of the k-th frame, and the IMU poses of the frames immediately before and after the frame tail of the k-th frame are taken, and these IMU poses are interpolated to obtain the poses $P_d^k$, $P_e^k$ and $P_f^k$ at the head, middle and tail of the k-th frame;

quadratic curve fitting is then carried out to obtain the point cloud pose curve fitting equation:

$$P_t^k = At^2 + Bt + C,$$

where $P_t^k$ is the pose corresponding to each data point in the k-th frame point cloud, $t$ is time, $A$ is the quadratic coefficient, $B$ is the linear coefficient, and $C$ is a constant;

the motion distortion of the current laser radar frame is corrected according to the point cloud pose curve fitting equation, giving:

$$\tilde{p}_i^k = (P_0^k)^{-1} \, P_i^k \, p_i^k,$$

where $\tilde{p}_i^k$ represents the corrected point cloud coordinate of the i-th point of the k-th frame.
5. The method for correcting laser radar point cloud motion distortion according to claim 4,
the step four of utilizing the RTK pose to calibrate the IMU pose comprises the following steps:
judging whether the timestamp of the IMU pose is smaller than the timestamp of the RTK pose according to the RTK pose and the IMU pose in the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, taking the RTK pose as an IMU pose observation value; if the time stamp of the IMU pose is larger than or equal to the time stamp of the RTK pose, searching two frames of RTK poses before and after the current frame IMU pose time stamp, and using the pose obtained by RTK interpolation as an IMU pose observation value;
and performing Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain a calibrated IMU pose.
CN202110030119.3A 2021-01-11 2021-01-11 Laser radar point cloud motion distortion correction method Active CN112859051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110030119.3A CN112859051B (en) 2021-01-11 2021-01-11 Laser radar point cloud motion distortion correction method

Publications (2)

Publication Number Publication Date
CN112859051A true CN112859051A (en) 2021-05-28
CN112859051B CN112859051B (en) 2024-04-09

Family

ID=76002243

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110030119.3A Active CN112859051B (en) 2021-01-11 2021-01-11 Laser radar point cloud motion distortion correction method

Country Status (1)

Country Link
CN (1) CN112859051B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180080828A (en) * 2017-01-05 2018-07-13 서울대학교산학협력단 Method for recognizing lane-level vehicle positioning information based on lidar map matching, recording medium and device for performing the method
CN109975792A (en) * 2019-04-24 2019-07-05 福州大学 Method based on Multi-sensor Fusion correction multi-line laser radar point cloud motion distortion
CN110703229A (en) * 2019-09-25 2020-01-17 禾多科技(北京)有限公司 Point cloud distortion removal method and external reference calibration method for vehicle-mounted laser radar reaching IMU
CN111045017A (en) * 2019-12-20 2020-04-21 成都理工大学 Method for constructing transformer substation map of inspection robot by fusing laser and vision
CN111578957A (en) * 2020-05-07 2020-08-25 泉州装备制造研究所 Intelligent pure vehicle tracking and tracking method based on three-dimensional point cloud map positioning
CN112082545A (en) * 2020-07-29 2020-12-15 武汉威图传视科技有限公司 Map generation method, device and system based on IMU and laser radar

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
吴昱晗: "一种基于点云匹配的激光雷达/IMU联合标定方法", 《电子技术应用》, vol. 45, no. 12, pages 78 - 82 *
李旭: "基于多线激光雷达建图的里程计优化及回环检测", 《中国优秀硕士学位论文全文数据库 信息科技辑》, no. 2, pages 136 - 1991 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113376638A (en) * 2021-06-08 2021-09-10 武汉理工大学 Unmanned logistics trolley environment sensing method and system
CN113495281B (en) * 2021-06-21 2023-08-22 杭州飞步科技有限公司 Real-time positioning method and device for movable platform
CN113495281A (en) * 2021-06-21 2021-10-12 杭州飞步科技有限公司 Real-time positioning method and device for movable platform
CN113790738A (en) * 2021-08-13 2021-12-14 上海智能网联汽车技术中心有限公司 Data compensation method based on intelligent cradle head IMU
CN113935904A (en) * 2021-08-27 2022-01-14 清华大学 Laser odometer error correction method, system, storage medium and computing equipment
CN113838143A (en) * 2021-09-13 2021-12-24 三一专用汽车有限责任公司 Method and device for determining calibration external parameter, engineering vehicle and readable storage medium
CN114372914A (en) * 2022-01-12 2022-04-19 吉林大学 Mechanical laser radar point cloud preprocessing method applied to mining electric shovel
CN114569011A (en) * 2022-03-25 2022-06-03 微思机器人(深圳)有限公司 Wall-following walking method and device, floor sweeping robot and storage medium
CN114569011B (en) * 2022-03-25 2023-09-05 微思机器人(深圳)有限公司 Wall-following walking method and device, sweeping robot and storage medium
CN114862932A (en) * 2022-06-20 2022-08-05 安徽建筑大学 BIM global positioning-based pose correction method and motion distortion correction method
CN114862932B (en) * 2022-06-20 2022-12-30 安徽建筑大学 BIM global positioning-based pose correction method and motion distortion correction method
CN114820392B (en) * 2022-06-28 2022-10-18 新石器慧通(北京)科技有限公司 Laser radar detection moving target distortion compensation method, device and storage medium
CN114820392A (en) * 2022-06-28 2022-07-29 新石器慧通(北京)科技有限公司 Laser radar detection moving target distortion compensation method, device and storage medium
CN115840234A (en) * 2022-10-28 2023-03-24 苏州知至科技有限公司 Radar data acquisition method and device and storage medium
CN115840234B (en) * 2022-10-28 2024-04-19 苏州知至科技有限公司 Radar data acquisition method, device and storage medium
CN116359938A (en) * 2023-05-31 2023-06-30 未来机器人(深圳)有限公司 Object detection method, device and carrying device
CN116359938B (en) * 2023-05-31 2023-08-25 未来机器人(深圳)有限公司 Object detection method, device and carrying device

Also Published As

Publication number Publication date
CN112859051B (en) 2024-04-09

Similar Documents

Publication Publication Date Title
CN112859051A (en) Method for correcting laser radar point cloud motion distortion
CN101241011B (en) High precision positioning and posture-fixing device on laser radar platform and method
WO2023131123A1 (en) External parameter calibration method and apparatus for combined navigation device and laser radar
WO2022127532A1 (en) Method and apparatus for calibrating external parameter of laser radar and imu, and device
CN109934920A (en) High-precision three-dimensional point cloud map constructing method based on low-cost equipment
CN113358112B (en) Map construction method and laser inertia odometer
CN109471146B (en) Self-adaptive fault-tolerant GPS/INS integrated navigation method based on LS-SVM
CN110187375A (en) A kind of method and device improving positioning accuracy based on SLAM positioning result
CN113933818A (en) Method, device, storage medium and program product for calibrating laser radar external parameter
CN112731358B (en) Multi-laser-radar external parameter online calibration method
CN110570449A (en) positioning and mapping method based on millimeter wave radar and visual SLAM
CN114526745A (en) Drawing establishing method and system for tightly-coupled laser radar and inertial odometer
CN109507706B (en) GPS signal loss prediction positioning method
CN111880207A (en) Visual inertial satellite tight coupling positioning method based on wavelet neural network
CN114612348B (en) Laser point cloud motion distortion correction method and device, electronic equipment and storage medium
CN112946681B (en) Laser radar positioning method fusing combined navigation information
CN114413887A (en) Method, equipment and medium for calibrating external parameters of sensor
CN114485654A (en) Multi-sensor fusion positioning method and device based on high-precision map
CN115728753A (en) External parameter calibration method and device for laser radar and integrated navigation and intelligent vehicle
CN114915913A (en) UWB-IMU combined indoor positioning method based on sliding window factor graph
CN112883134A (en) Data fusion graph building method and device, electronic equipment and storage medium
CN111521996A (en) Laser radar installation calibration method
CN116481543A (en) Multi-sensor fusion double-layer filtering positioning method for mobile robot
CN116753948A (en) Positioning method based on visual inertial GNSS PPP coupling
CN116380057B (en) Unmanned aerial vehicle autonomous landing positioning method under GNSS refusing environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant