CN112859051B - Laser radar point cloud motion distortion correction method - Google Patents
- Publication number: CN112859051B (application CN202110030119.3A)
- Authority: CN (China)
- Legal status: Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
Abstract
A laser radar point cloud motion distortion correction method belongs to the technical field of unmanned vehicle automatic driving. The method addresses the problem that the external sensors used in laser radar point cloud motion distortion correction accumulate large displacement errors. It comprises the following steps: adopting RTK as a pulse generator to trigger the laser radar and the IMU synchronously; establishing a global coordinate system for the first frame of the laser radar point cloud; establishing a local coordinate system for each non-first frame and converting the corresponding IMU coordinate system into that local coordinate system while synchronously obtaining the IMU pose; obtaining by interpolation the IMU pose corresponding to each data point in the current frame point cloud and correcting the motion distortion of the current laser radar frame according to those poses; converting the local coordinate system into the global coordinate system and calibrating the IMU pose with the RTK pose; and correcting the next frame, until the motion distortion correction of the last frame of the laser radar point cloud is completed. The method enhances the robustness and stability of laser radar point cloud correction.
Description
Technical Field
The invention relates to a method for correcting laser radar point cloud motion distortion, and belongs to the technical field of unmanned vehicle automatic driving.
Background
With the development of artificial intelligence, the laser radar has gained wide attention by virtue of its high resolution, long range, insensitivity to illumination, strong anti-interference capability and other advantages.
In the field of unmanned vehicle automatic driving, the laser radar can establish map information in an unknown environment, and provides a good basis for subsequent positioning navigation, path planning and the like. When the laser radar based on mechanical rotation scanning moves along with the unmanned vehicle, the coordinate system of the laser radar is changed continuously, so that points in the same frame of point cloud are not in the same coordinate system and generate motion distortion, and therefore, the acquired point cloud data need to be subjected to motion distortion correction.
At present, for the motion distortion of a multi-line laser radar, a single inertial measurement unit (IMU), or an IMU combined with a wheel-speed odometer, can be used as the external sensor; the motion of the laser radar is deduced under the assumption that the unmanned vehicle changes only its heading angle, and the point cloud motion distortion is corrected accordingly. However, both the IMU and the wheel-speed odometer obtain displacement by integration, and therefore suffer from large accumulated displacement errors over long working periods.
Disclosure of Invention
Aiming at the problem that the external sensor adopted in existing laser radar point cloud motion distortion correction methods accumulates a large displacement error, the invention provides a laser radar point cloud motion distortion correction method.
The invention relates to a method for correcting laser radar point cloud motion distortion, which comprises the following steps:
step one: adopting RTK as pulse generator to trigger laser radar and IMU synchronously; establishing a global coordinate system for the first frame point cloud of the acquired laser radar by taking the first frame point cloud as an origin;
step two: collecting a point cloud of a laser radar current frame which is not a first frame, and establishing a local coordinate system by taking the point cloud of the current frame as an origin; simultaneously converting an IMU coordinate system corresponding to the point cloud acquisition moment of the current frame into the local coordinate system; synchronously obtaining the pose of the IMU;
step three: according to the local coordinate system, an interpolation method is adopted to obtain IMU pose corresponding to each data point in the current frame point cloud, and according to the IMU pose, the motion distortion of the current frame point cloud of the laser radar is corrected;
step four: converting the local coordinate system into the global coordinate system, and calibrating the IMU pose by using the RTK pose; and then returning to the second step until the correction of the motion distortion of the point cloud of the laser radar of the last frame is completed.
According to the method for correcting the laser radar point cloud motion distortion,
in the second step, for the collected laser radar point cloud, the coordinates of the i-th point of the k-th frame in the local coordinate system are p_i^k = (x_ki, y_ki, z_ki), where x_ki, y_ki and z_ki are the coordinates of the i-th point of the k-th frame on the X, Y and Z axes of the local coordinate system, k ∈ {1, 2, 3, …, m}, i ∈ {1, 2, 3, …, n}, m is the total number of point cloud frames, and n is the number of data points contained in one frame of data;

the laser radar point cloud is expressed in matrix form as p_i^k = [x_ki, y_ki, z_ki, 1].
according to the method for correcting the laser radar point cloud motion distortion,
in the second step, the IMU pose is expressed in matrix form as:

P_i^k = [R_i^k, s_i^k; 0, 1],

a 4 × 4 matrix in which P_i^k is the pose of the i-th point of the k-th frame of the laser radar point cloud, R_i^k is the rotation matrix from the i-th point of the k-th frame to the corresponding local coordinate system, and s_i^k is the translation matrix from the i-th point of the k-th frame to the corresponding local coordinate system;

α is the angle of rotation of the i-th point of the k-th frame around the Z axis of the local coordinate system, β is the angle of rotation around the Y axis of the local coordinate system, and γ is the angle of rotation around the X axis of the local coordinate system;

s_x is the displacement of the i-th point of the k-th frame along the X axis of the local coordinate system, s_y is the displacement along the Y axis, and s_z is the displacement along the Z axis;

when i = 0,

P_0^k = [R_0^k, s_0^k; 0, 1],

where P_0^k is the initial pose of the current k-th frame, R_0^k is the rotation matrix from the current k-th frame of data to the corresponding local coordinate system, and s_0^k is the translation matrix from the current k-th frame of data to the corresponding local coordinate system.
According to the method for correcting the laser radar point cloud motion distortion,
and step three, correcting the laser radar point cloud motion distortion according to the IMU pose comprises the following steps:
the frame head time stamp of the laser radar point Yun Di k frame is t d Time stamp in frame t e The end-of-frame timestamp is t f The method comprises the steps of carrying out a first treatment on the surface of the The IMU pose of the frame head of the kth frame is respectively thatAnd->The IMU pose of the front and rear frames in the kth frame is +.>Andthe IMU pose of the two frames before and after the end of the kth frame is->And->
Interpolating the IMU pose to obtain the pose of the frame head, the frame middle and the frame tail of the kth frame in sequenceAnd->And performing quadratic curve fitting to obtain a point cloud pose curve fitting equation:
P t k =At 2 +Bt+C,
p in the formula t k For the pose corresponding to each data point in the kth frame point cloud, t is time, A is a quadratic term coefficient, B is a primary term coefficient, and C is a constant;
correcting the motion distortion of the point cloud of the current frame of the laser radar according to the point cloud pose curve fitting equation to obtain the following steps:
in the middle ofAnd (5) representing corrected point cloud coordinates of the ith point of the kth frame.
According to the method for correcting the laser radar point cloud motion distortion,
the calibrating of the IMU pose by using the RTK pose in the fourth step comprises the following steps:
judging whether the time stamp of the IMU pose is smaller than that of the RTK pose according to the RTK pose and the IMU pose under the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, the RTK pose is taken as the IMU pose observation value; if the timestamp of the IMU pose is greater than or equal to the timestamp of the RTK pose, the RTK poses immediately before and after the current IMU pose timestamp are found, and the pose obtained by interpolating the RTK is taken as the IMU pose observation value;
and carrying out Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain the calibrated IMU pose.
The invention has the beneficial effects that: according to the method, the pose of all points of each frame under a local coordinate system is obtained by IMU pose interpolation, each frame of point cloud is further subjected to de-distortion and then is converted into the local coordinate system of the current frame, each frame of local coordinate system is then converted into the global coordinate system, and finally the correction of the laser radar point cloud motion distortion is completed.
The method utilizes the carrier-phase differential technology RTK, whose plane and elevation accuracy can reach the centimetre level with no accumulated error, and which can serve as a clock source to unify the time of each sensor in hardware. After the IMU pose is corrected in real time by the RTK, the displacement, heading angle, pitch angle and roll angle of the unmanned vehicle can be calculated from the IMU, and the laser radar point cloud motion distortion corrected accordingly; this is closer to the real situation than techniques that consider only the heading angle. When the unmanned vehicle drives into a severely occluded area, RTK operation is affected to some extent, but the RTK-corrected IMU can still provide a fairly accurate pose, enhancing the robustness and stability of the system.
Drawings
FIG. 1 is an overall flow chart of a method for correcting point cloud motion aberration of a lidar according to the present invention;
FIG. 2 is a flowchart of a method for correcting the motion distortion of a point cloud of a laser radar;
FIG. 3 is a flow chart for interpolating IMU pose to correct laser radar point cloud distortion;
FIG. 4 is a flow chart for calibrating IMU pose using RTK pose.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
The invention is further described below with reference to the drawings and specific examples, which are not intended to be limiting.
The invention provides a method for correcting the motion distortion of a point cloud of a laser radar, which is shown in the detailed description with reference to fig. 1 and 2,
step one: adopting RTK as a pulse generator to trigger the laser radar and the IMU synchronously; establishing a global coordinate system for the first collected frame of the laser radar point cloud, with that frame as the origin; after each trigger is completed, the RTK corrects its own clock, thereby completing clock-source time synchronization in hardware;
step two: collecting a point cloud of a laser radar current frame which is not a first frame, and establishing a local coordinate system by taking the point cloud of the current frame as an origin; simultaneously converting an IMU coordinate system corresponding to the point cloud acquisition moment of the current frame into the local coordinate system; synchronously obtaining the pose of the IMU;
step three: according to the local coordinate system, an interpolation method is adopted to obtain IMU pose corresponding to each data point in the current frame point cloud, and according to the IMU pose, the motion distortion of the current frame point cloud of the laser radar is corrected;
step four: converting the local coordinate system into the global coordinate system, and calibrating the IMU pose by using the RTK pose; and then returning to the second step until the correction of the motion distortion of the point cloud of the laser radar of the last frame is completed.
The embodiment realizes the correction of laser radar point cloud distortion based on an inertial measurement unit (IMU) and the real-time kinematic carrier-phase differential technology (RTK). The RTK has two functions: it realizes time synchronization, and it corrects the pose of the IMU, which drifts over time and would otherwise yield inaccurate data after long operation. Thus, even when the RTK is severely occluded and cannot transmit data, the system continues to work reliably because the IMU pose has already been corrected. In this way the laser radar, the RTK and the IMU operate complementarily.
In the embodiment, when the laser radar point cloud is collected, whether the laser radar point cloud is a first frame point cloud is firstly judged, and a global coordinate system is established according to the first frame point cloud; for non-first frame point clouds, establishing a local coordinate system; after each correction is finished, judging whether the current frame point cloud is the last frame point cloud, if not, repeating the correction process, and if so, ending the operation. And after correction of each frame of point cloud is completed, carrying out correction of the IMU pose once until all correction tasks are completed.
Further, in the second step, for the collected laser radar point cloud, the coordinates of the i-th point of the k-th frame in the local coordinate system are p_i^k = (x_ki, y_ki, z_ki), where x_ki, y_ki and z_ki are the coordinates of the i-th point of the k-th frame on the X, Y and Z axes of the local coordinate system, k ∈ {1, 2, 3, …, m}, i ∈ {1, 2, 3, …, n}, m is the total number of point cloud frames, and n is the number of data points contained in one frame of data;

the laser radar point cloud is expressed in matrix form as p_i^k = [x_ki, y_ki, z_ki, 1]; the final element 1 is appended, making each point a one-row, four-column homogeneous vector, which facilitates the subsequent calculation.
Still further, in the second step, the IMU pose is expressed in matrix form as:

P_i^k = [R_i^k, s_i^k; 0, 1],

a 4 × 4 matrix in which P_i^k is the pose of the i-th point of the k-th frame of the laser radar point cloud, R_i^k is the rotation matrix from the i-th point of the k-th frame to the corresponding local coordinate system, and s_i^k is the translation matrix from the i-th point of the k-th frame to the corresponding local coordinate system;

α is the angle of rotation of the i-th point of the k-th frame around the Z axis of the local coordinate system, β is the angle of rotation around the Y axis of the local coordinate system, and γ is the angle of rotation around the X axis of the local coordinate system;

s_x is the displacement of the i-th point of the k-th frame along the X axis of the local coordinate system, s_y is the displacement along the Y axis, and s_z is the displacement along the Z axis;

when i = 0,

P_0^k = [R_0^k, s_0^k; 0, 1],

where P_0^k is the initial pose of the current k-th frame, R_0^k is the rotation matrix from the current k-th frame of data to the corresponding local coordinate system, and s_0^k is the translation matrix from the current k-th frame of data to the corresponding local coordinate system.
Each frame of data of the laser radar point cloud has a local coordinate system.
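A sketch of assembling such a per-point pose matrix from the three rotation angles and three displacements. The patent names the angles about the Z, Y and X axes but does not state a composition order; the Z-Y-X (yaw-pitch-roll) order used here is our assumption:

```python
import numpy as np

def pose_matrix(alpha, beta, gamma, s_x, s_y, s_z):
    """4x4 pose: rotation about Z (alpha), then Y (beta), then X (gamma),
    plus translation (s_x, s_y, s_z). The Z-Y-X order is an assumption;
    the patent only names the three angles and three displacements."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation block R
    T[:3, 3] = [s_x, s_y, s_z]  # translation block s
    return T

# Zero angles and zero displacement give the identity pose.
T0 = pose_matrix(0, 0, 0, 0, 0, 0)
```

Multiplying such a matrix by a homogeneous point [x, y, z, 1] applies the rotation and translation in a single step.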
Still further, referring to fig. 3, in the third step, correcting the motion distortion of the lidar point cloud according to the pose of the IMU includes:
the frame head time stamp of the laser radar point Yun Di k frame is t d Time stamp in frame t e The end-of-frame timestamp is t f The method comprises the steps of carrying out a first treatment on the surface of the The IMU pose of the frame head of the kth frame is respectively thatAnd->The IMU pose of the front and rear frames in the kth frame is +.>Andthe IMU pose of the two frames before and after the end of the kth frame is->And->
Interpolating the IMU pose to obtain the pose of the frame head, the frame middle and the frame tail of the kth frame in sequenceAnd->And performing quadratic curve fitting to obtain a point cloud pose curve fitting equation:
P t k =At 2 +Bt+C,
p in the formula t k For the pose corresponding to each data point in the kth frame point cloud, t is time, A is a quadratic term coefficient, B is a primary term coefficient, and C is a constant;
correcting the motion distortion of the point cloud of the current frame of the laser radar according to the point cloud pose curve fitting equation to obtain the following steps:
in the middle ofAnd (5) representing corrected point cloud coordinates of the ith point of the kth frame.
Representing the inverse, P, of the current kth frame start pose matrix i k Representing the pose of the ith point of the current kth frame,/->And (5) representing the point cloud coordinates before the ith point correction of the current kth frame. Thus, the correction of the current kth frame point cloud motion distortion is completed.
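The fitting-and-correction step can be sketched as follows. To keep the example short, the pose is reduced to a single X translation; real poses are full 4 × 4 matrices whose rotation components would need proper interpolation (for example on SO(3)), and the timestamps and pose values below are invented for illustration:

```python
import numpy as np

# Frame head / middle / tail timestamps of one lidar sweep (seconds).
t_d, t_e, t_f = 0.00, 0.05, 0.10
# Interpolated IMU X-displacement at those three instants (example values).
x_d, x_e, x_f = 0.00, 0.06, 0.13

# Fit P_t = A t^2 + B t + C through the three samples.
A, B, C = np.polyfit([t_d, t_e, t_f], [x_d, x_e, x_f], 2)

def pose(t):
    """4x4 pose at time t; here a pure X translation from the fitted curve."""
    T = np.eye(4)
    T[0, 3] = A * t**2 + B * t + C
    return T

P0 = pose(t_d)                         # initial (frame-start) pose
t_i = 0.07                             # capture time of the i-th point
p_i = np.array([2.0, 1.0, 0.0, 1.0])   # raw homogeneous point

# Corrected point, following the patent's form: p' = inv(P0) @ P_i @ p_i.
p_corr = np.linalg.inv(P0) @ pose(t_i) @ p_i
```

The corrected point is expressed in the frame-start coordinate system, which removes the motion accumulated between the frame head and the point's capture time.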
Still further, referring to fig. 4, calibrating the IMU pose using the RTK pose in step four includes:
judging whether the time stamp of the IMU pose is smaller than that of the RTK pose according to the RTK pose and the IMU pose under the local coordinate system;
if the time stamp of the IMU pose is smaller than the time stamp of the RTK pose, taking the RTK pose as an IMU pose observation value; if the time stamp of the IMU pose is greater than or equal to the time stamp of the RTK pose, searching for the RTK pose of the front frame and the rear frame of the time stamp of the IMU pose of the current frame, and taking the pose obtained by RTK interpolation as an IMU pose observation value;
and carrying out Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain the calibrated IMU pose.
In this embodiment, the IMU pose in the original local coordinate system may be used as the predicted value; Kalman fusion of this predicted value with the IMU pose observation value then yields the calibrated IMU pose.
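A one-dimensional sketch of this predict/observe fusion, with the IMU pose as the prediction and the RTK-derived pose as the observation. The variances and values are illustrative, not from the patent:

```python
def kalman_fuse(pred, pred_var, obs, obs_var):
    """One-dimensional Kalman update: fuse the IMU pose prediction with the
    RTK-derived observation. Variances are illustrative assumptions."""
    gain = pred_var / (pred_var + obs_var)   # Kalman gain
    fused = pred + gain * (obs - pred)       # corrected estimate
    fused_var = (1.0 - gain) * pred_var      # reduced uncertainty
    return fused, fused_var

# IMU says x = 10.30 m (drifting, variance 0.25); RTK observes 10.02 m
# (centimetre-level, variance 0.0004).
x, var = kalman_fuse(10.30, 0.25, 10.02, 0.0004)
```

Because the RTK variance is far smaller than the drifting IMU variance, the fused estimate lands close to the RTK observation, which is exactly the calibration effect described above.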
In the present invention, as an example, the update frequency of the laser radar may be 10 Hz and that of the IMU 300 Hz.
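At these rates roughly 30 IMU samples span one 0.1 s lidar sweep, so the pose for each lidar point can be interpolated between the two bracketing IMU samples. A sketch of that interpolation over one pose component (linear interpolation is our choice; the patent only says "interpolation", and the sample values are invented):

```python
import bisect

# 300 Hz IMU samples spanning one 10 Hz lidar sweep (0.1 s).
imu_t = [i / 300.0 for i in range(31)]
imu_x = [5.0 * t + 2.0 * t * t for t in imu_t]  # example pose component

def pose_at(t):
    """Linearly interpolate the pose component between the two IMU samples
    bracketing time t; clamp outside the sampled interval."""
    j = bisect.bisect_right(imu_t, t)
    if j == 0:
        return imu_x[0]
    if j >= len(imu_t):
        return imu_x[-1]
    t0, t1 = imu_t[j - 1], imu_t[j]
    w = (t - t0) / (t1 - t0)
    return (1 - w) * imu_x[j - 1] + w * imu_x[j]

# Pose component for a lidar point captured 37 ms into the sweep.
x_point = pose_at(0.037)
```

Each point's timestamp thus gets its own pose, which is what makes the per-point distortion correction of step three possible.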
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. It should be understood that the different dependent claims and the features described herein may be combined in ways other than as described in the original claims. It is also to be understood that features described in connection with separate embodiments may be used in other described embodiments.
Claims (2)
1. A method for correcting the movement distortion of the point cloud of a laser radar is characterized by comprising the steps of,
step one: adopting RTK as pulse generator to trigger laser radar and IMU synchronously; establishing a global coordinate system for the first frame point cloud of the acquired laser radar by taking the first frame point cloud as an origin;
step two: collecting a point cloud of a laser radar current frame which is not a first frame, and establishing a local coordinate system by taking the point cloud of the current frame as an origin; simultaneously converting an IMU coordinate system corresponding to the point cloud acquisition moment of the current frame into the local coordinate system; synchronously obtaining the pose of the IMU;
step three: according to the local coordinate system, an interpolation method is adopted to obtain IMU pose corresponding to each data point in the current frame point cloud, and according to the IMU pose, the motion distortion of the current frame point cloud of the laser radar is corrected;
step four: converting the local coordinate system into the global coordinate system, and calibrating the IMU pose by using the RTK pose; then returning to the second step until the correction of the point cloud motion distortion of the laser radar of the last frame is completed;
in the second step, for the collected laser radar point cloud, the coordinates of the i-th point of the k-th frame in the local coordinate system are p_i^k = (x_ki, y_ki, z_ki), where x_ki, y_ki and z_ki are the coordinates of the i-th point of the k-th frame on the X, Y and Z axes of the local coordinate system, k ∈ {1, 2, 3, …, m}, i ∈ {1, 2, 3, …, n}, m is the total number of point cloud frames, and n is the number of data points contained in one frame of data;

the laser radar point cloud is expressed in matrix form as p_i^k = [x_ki, y_ki, z_ki, 1]; the final element 1 is appended, making each point a one-row, four-column homogeneous vector, which facilitates the subsequent calculation;

in the second step, the IMU pose is expressed in matrix form as:

P_i^k = [R_i^k, s_i^k; 0, 1],

a 4 × 4 matrix in which P_i^k is the pose of the i-th point of the k-th frame of the laser radar point cloud, R_i^k is the rotation matrix from the i-th point of the k-th frame to the corresponding local coordinate system, and s_i^k is the translation matrix from the i-th point of the k-th frame to the corresponding local coordinate system;

α is the angle of rotation of the i-th point of the k-th frame around the Z axis of the local coordinate system, β is the angle of rotation around the Y axis of the local coordinate system, and γ is the angle of rotation around the X axis of the local coordinate system;

s_x is the displacement of the i-th point of the k-th frame along the X axis of the local coordinate system, s_y is the displacement along the Y axis, and s_z is the displacement along the Z axis;

when i = 0,

P_0^k = [R_0^k, s_0^k; 0, 1],

where P_0^k is the initial pose of the current k-th frame, R_0^k is the rotation matrix from the current k-th frame of data to the corresponding local coordinate system, and s_0^k is the translation matrix from the current k-th frame of data to the corresponding local coordinate system;
each frame of data of the laser radar point cloud has a local coordinate system;
still further, in the third step, correcting the motion distortion of the laser radar point cloud according to the pose of the IMU includes:
the frame-head timestamp of the k-th frame of the laser radar point cloud is t_d, the mid-frame timestamp is t_e, and the frame-tail timestamp is t_f; the IMU poses of the two IMU samples before and after the frame head of the k-th frame are denoted P_d1 and P_d2, those before and after the middle of the k-th frame P_e1 and P_e2, and those before and after the tail of the k-th frame P_f1 and P_f2;

interpolating the IMU poses gives, in sequence, the poses of the frame head, frame middle and frame tail of the k-th frame, P_d^k, P_e^k and P_f^k, and quadratic curve fitting through these yields the point cloud pose curve-fitting equation:

P_t^k = A·t^2 + B·t + C,

where P_t^k is the pose corresponding to each data point in the k-th frame point cloud, t is time, A is the quadratic coefficient, B is the linear coefficient, and C is a constant;

correcting the motion distortion of the current laser radar frame according to the point cloud pose curve-fitting equation gives:

p'_i^k = (P_0^k)^(-1) · P_i^k · p_i^k,

where p'_i^k denotes the corrected point cloud coordinates of the i-th point of the k-th frame, (P_0^k)^(-1) denotes the inverse of the initial pose matrix of the current k-th frame, P_i^k denotes the pose of the i-th point of the current k-th frame, and p_i^k denotes the point cloud coordinates of the i-th point of the current k-th frame before correction; this completes the correction of the motion distortion of the current k-th frame point cloud.
2. The method for correcting motion distortion of laser radar point cloud according to claim 1, wherein,
the calibrating of the IMU pose by using the RTK pose in the fourth step comprises the following steps:
judging whether the time stamp of the IMU pose is smaller than that of the RTK pose according to the RTK pose and the IMU pose under the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, the RTK pose is taken as the IMU pose observation value; if the timestamp of the IMU pose is greater than or equal to the timestamp of the RTK pose, the RTK poses immediately before and after the current IMU pose timestamp are found, and the pose obtained by interpolating the RTK is taken as the IMU pose observation value;
and carrying out Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain the calibrated IMU pose.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110030119.3A (CN112859051B) | 2021-01-11 | 2021-01-11 | Laser radar point cloud motion distortion correction method |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN112859051A | 2021-05-28 |
| CN112859051B | 2024-04-09 |

Family ID: 76002243

Family Applications (1)

| Application Number | Publication | Status | Priority Date | Filing Date |
|---|---|---|---|---|
| CN202110030119.3A | CN112859051B | Active | 2021-01-11 | 2021-01-11 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN112859051B |
Families Citing this family (10)

| Publication Number | Priority Date | Publication Date | Title |
|---|---|---|---|
| CN113376638A | 2021-06-08 | 2021-09-10 | Unmanned logistics trolley environment sensing method and system |
| CN113495281B | 2021-06-21 | 2023-08-22 | Real-time positioning method and device for movable platform |
| CN113790738A | 2021-08-13 | 2021-12-14 | Data compensation method based on intelligent cradle head IMU |
| CN113838143A | 2021-09-13 | 2021-12-24 | Method and device for determining calibration external parameter, engineering vehicle and readable storage medium |
| CN114372914B | 2022-01-12 | 2022-09-13 | Mechanical laser radar point cloud preprocessing method applied to mining electric shovel |
| CN114569011B | 2022-03-25 | 2023-09-05 | Wall-following walking method and device, sweeping robot and storage medium |
| CN114862932B | 2022-06-20 | 2022-12-30 | BIM global positioning-based pose correction method and motion distortion correction method |
| CN114820392B | 2022-06-28 | 2022-10-18 | Laser radar detection moving target distortion compensation method, device and storage medium |
| CN115840234B | 2022-10-28 | 2024-04-19 | Radar data acquisition method, device and storage medium |
| CN116359938B | 2023-05-31 | 2023-08-25 | Object detection method, device and carrying device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180080828A (en) * | 2017-01-05 | 2018-07-13 | 서울대학교산학협력단 | Method for recognizing lane-level vehicle positioning information based on lidar map matching, recording medium and device for performing the method |
CN109975792A (en) * | 2019-04-24 | 2019-07-05 | 福州大学 | Method based on Multi-sensor Fusion correction multi-line laser radar point cloud motion distortion |
CN110703229A (en) * | 2019-09-25 | 2020-01-17 | 禾多科技(北京)有限公司 | Point cloud distortion removal method and external reference calibration method for vehicle-mounted laser radar reaching IMU |
CN111045017A (en) * | 2019-12-20 | 2020-04-21 | 成都理工大学 | Method for constructing transformer substation map of inspection robot by fusing laser and vision |
CN111578957A (en) * | 2020-05-07 | 2020-08-25 | 泉州装备制造研究所 | Intelligent pure vehicle tracking and tracking method based on three-dimensional point cloud map positioning |
CN112082545A (en) * | 2020-07-29 | 2020-12-15 | 武汉威图传视科技有限公司 | Map generation method, device and system based on IMU and laser radar |
Non-Patent Citations (2)
Title |
---|
A LiDAR/IMU joint calibration method based on point cloud matching; Wu Yuhan; Application of Electronic Technique; Vol. 45, No. 12; pp. 78-82 *
Odometry optimization and loop closure detection based on multi-line LiDAR mapping; Li Xu; China Master's Theses Full-text Database, Information Science and Technology; No. 2; I136-1991 *
Also Published As
Publication number | Publication date |
---|---|
CN112859051A (en) | 2021-05-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112859051B (en) | Laser radar point cloud motion distortion correction method | |
CN111381217B (en) | Missile-borne SAR motion compensation method based on low-precision inertial navigation system | |
WO2023131123A1 (en) | External parameter calibration method and apparatus for combined navigation device and laser radar | |
WO2022127532A1 (en) | Method and apparatus for calibrating external parameter of laser radar and imu, and device | |
CN111880207B (en) | Visual inertial satellite tight coupling positioning method based on wavelet neural network | |
CN110187375A (en) | A kind of method and device improving positioning accuracy based on SLAM positioning result | |
CN109507706B (en) | GPS signal loss prediction positioning method | |
CN113466890B (en) | Light laser radar inertial combination positioning method and system based on key feature extraction | |
CN110570449A (en) | positioning and mapping method based on millimeter wave radar and visual SLAM | |
CN109059907A (en) | Track data processing method, device, computer equipment and storage medium | |
CN112946681B (en) | Laser radar positioning method fusing combined navigation information | |
CN113763548B (en) | Vision-laser radar coupling-based lean texture tunnel modeling method and system | |
CN113933818A (en) | Method, device, storage medium and program product for calibrating laser radar external parameter | |
CN114526745A (en) | Drawing establishing method and system for tightly-coupled laser radar and inertial odometer | |
CN115685292B (en) | Navigation method and device of multi-source fusion navigation system | |
CN114413887A (en) | Method, equipment and medium for calibrating external parameters of sensor | |
CN110126842B (en) | Method and device for dynamically correcting longitudinal acceleration of intelligent driving vehicle | |
CN116452763A (en) | Three-dimensional point cloud map construction method based on error Kalman filtering and factor graph | |
Rieken et al. | Sensor scan timing compensation in environment models for automated road vehicles | |
CN116242372A (en) | UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment | |
CN115097481A (en) | Point cloud motion compensation method and device and electronic equipment | |
CN114915913A (en) | UWB-IMU combined indoor positioning method based on sliding window factor graph | |
CN111307176B (en) | Online calibration method for visual inertial odometer in VR head-mounted display equipment | |
JP2022149051A (en) | Map creation device, map creation system, map creation method, and program | |
CN107796417B (en) | Method for adaptively estimating scene matching and inertial navigation installation error |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||