CN112781594A - Laser radar iterative closest point improvement algorithm based on IMU coupling - Google Patents

Laser radar iterative closest point improvement algorithm based on IMU coupling

Info

Publication number
CN112781594A
Authority
CN
China
Prior art keywords
imu
laser radar
original
frame
point cloud
Prior art date
Legal status
Granted
Application number
CN202110032151.5A
Other languages
Chinese (zh)
Other versions
CN112781594B (en)
Inventor
刘飞
周志全
邹钰杰
屈婧婧
柴文静
杨起鸣
严景琳
Current Assignee
Guilin University of Electronic Technology
Original Assignee
Guilin University of Electronic Technology
Priority date
Filing date
Publication date
Application filed by Guilin University of Electronic Technology
Priority to CN202110032151.5A
Publication of CN112781594A
Application granted
Publication of CN112781594B
Legal status: Active

Classifications

    • G — Physics
    • G01 — Measuring; testing
    • G01C — Measuring distances, levels or bearings; surveying; navigation; gyroscopic instruments; photogrammetry or videogrammetry
    • G01C 21/20 — Navigation; instruments for performing navigational calculations
    • G01C 21/165 — Navigation by integrating acceleration or speed, i.e. inertial navigation executed aboard the object being navigated, combined with non-inertial navigation instruments
    • G01S — Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves
    • G01S 17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/89 — Lidar systems specially adapted for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

An IMU-coupling-based lidar iterative closest point improvement algorithm belongs to the field of lidar mapping and positioning. It addresses the problems that the existing iterative closest point algorithm runs slowly, is sensitive to the initial value, and easily becomes trapped in a local optimum. The method comprises: synchronizing the clock sources of the lidar and the IMU; preprocessing the lidar original point cloud and then extracting a feature point set to obtain the lidar feature point set; obtaining a pre-integration result and an initial transformation matrix for two adjacent frames of lidar original point clouds from the IMU original poses corresponding to those two frames; constructing the original error equation of lidar point cloud matching, constructing the observation error equation of the IMU, and then constructing a joint error equation of the IMU and the lidar; and performing ICP iterative computation on the joint error equation to obtain the optimal transformation matrix. The invention improves both the running speed and the matching accuracy of the ICP algorithm.

Description

Lidar iterative closest point improvement algorithm based on IMU coupling
Technical Field
The invention relates to an IMU-coupling-based lidar iterative closest point improvement algorithm and belongs to the field of lidar mapping and positioning.
Background
Lidar enables an unmanned vehicle to build a map of its surroundings in an unknown environment while simultaneously estimating its own motion, which makes it one of the key means by which unmanned vehicles navigate accurately.
Lidar mapping and positioning can be divided into four modules: front-end scan matching, back-end optimization, loop-closure detection, and map construction. Front-end scan matching is the core module: the pose transformation matrix between two adjacent frames of point clouds is estimated by point cloud matching, and adjacent frames are stitched into a global map in sequence according to this pose transformation, thereby realizing mapping; the state information between adjacent frames is then computed from the pose transformation matrix, thereby realizing positioning.
In the front-end scan matching module, the Iterative Closest Point (ICP) algorithm developed by Besl and McKay is the most commonly used method for mapping and positioning. It is a correspondence-based registration algorithm: the two points with the smallest Euclidean distance are taken as corresponding points, and the transformation matrix between two frames of point cloud data is obtained by minimizing the sum of squared distances between corresponding points of the two point sets. The algorithm is simple to operate and achieves high matching accuracy, but it runs slowly, is sensitive to the initial value, and easily becomes trapped in a local optimum.
Disclosure of Invention
To address the problems that the existing iterative closest point algorithm runs slowly, is sensitive to the initial value, and easily becomes trapped in a local optimum, the invention provides an IMU-coupling-based lidar iterative closest point improvement algorithm.
The IMU-coupling-based lidar iterative closest point improvement algorithm of the invention comprises:
synchronizing the clock sources of the lidar and the IMU;
preprocessing the lidar original point cloud and then extracting a feature point set to obtain the lidar feature point set; obtaining a pre-integration result and an initial transformation matrix for two adjacent frames of lidar original point clouds from the IMU original poses corresponding to those two frames;
constructing the original error equation of lidar point cloud matching from the lidar feature point set; constructing the observation error equation of the IMU from the pre-integration result; and then constructing a joint error equation of the IMU and the lidar from the original error equation and the observation error equation;
and performing ICP iterative computation on the joint error equation, with the initial transformation matrix as the initial value and the lidar feature point sets corresponding to the two adjacent frames of lidar original point clouds as input, to obtain the optimal transformation matrix.
According to the IMU-coupling-based lidar iterative closest point improvement algorithm of the invention, the lidar and the IMU achieve clock source synchronization through a time synchronizer.
According to the IMU-coupling-based lidar iterative closest point improvement algorithm of the invention, the process of preprocessing the lidar original point cloud and then extracting the feature point set to obtain the lidar feature point set comprises:
collecting the lidar original point cloud and removing outliers with a filter; connecting all retained original points into a line in sequence, and for each original point computing the absolute value of the cosine of the angle formed by the lines to its two neighbouring points; if the absolute cosine value is smaller than a preset minimum threshold, the current point is a corner point; if it is greater than or equal to the preset minimum threshold and less than a preset maximum threshold, the current point is a non-feature point; if it is greater than the preset maximum threshold, the current point is a plane point;
the lidar feature point set comprises the corner point set formed by all corner points and the plane point set formed by all plane points.
According to the IMU-coupling-based lidar iterative closest point improvement algorithm, the specific process of constructing the observation error equation of the IMU from the pre-integration result comprises:
establishing the lidar coordinate system L and the IMU coordinate system B, and transforming the IMU coordinate system B into the lidar coordinate system L;
setting the IMU original pose information as
$x_k^L = \left[\, p_k^L \;\; v_k^L \;\; q_k^L \;\; b_v \;\; b_a \,\right]$
where $x_k^L$ is the IMU state in the lidar coordinate system L corresponding to the k-th frame of lidar original point cloud, comprising the IMU position $p_k^L$, the IMU velocity $v_k^L$, the IMU quaternion $q_k^L$, the IMU velocity drift $b_v$, and the IMU acceleration drift $b_a$;
pre-integrating the IMU state data corresponding to the k-th frame and the (k+1)-th frame of lidar original point cloud to obtain, in the lidar coordinate system L, the noise-containing state information relating the IMU original poses corresponding to those two frames:
$p_{k+1}^L = p_k^L + v_k^L \Delta t_k - \tfrac{1}{2}\, g^L \Delta t_k^{2} + \alpha_{k+1}^{k}$
where $p_{k+1}^L$ is the position of the IMU at the (k+1)-th frame time in the lidar coordinate system L, $p_k^L$ is the position of the IMU at the k-th frame time, $v_k^L$ is the velocity of the IMU at the k-th frame time, $\Delta t_k$ is the time increment from the k-th frame to the (k+1)-th frame, $g^L$ is the gravitational acceleration of the IMU in the lidar coordinate system L, and $\alpha_{k+1}^{k}$ is the pre-integrated displacement of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
the velocity of the IMU at the (k+1)-th frame time in the lidar coordinate system L, $v_{k+1}^L$, is:
$v_{k+1}^L = v_k^L - g^L \Delta t_k + \beta_{k+1}^{k}$
where $\beta_{k+1}^{k}$ is the pre-integrated velocity change of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
which further gives:
$q_{k+1}^L = q_k^L \otimes \gamma_{k+1}^{k}$
where $q_{k+1}^L$ is the (k+1)-th frame IMU quaternion and $\gamma_{k+1}^{k}$ is the pre-integrated rotation of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
constructing the state error equations of the displacement, velocity and angle of the IMU:
$\delta\alpha_{k+1}^{k} = p_{k+1}^L - p_k^L - v_k^L \Delta t_k + \tfrac{1}{2}\, g^L \Delta t_k^{2} - \alpha_{k+1}^{k}$
$\delta\beta_{k+1}^{k} = v_{k+1}^L - v_k^L + g^L \Delta t_k - \beta_{k+1}^{k}$
$\delta\theta_{k+1}^{k} = \left( q_k^L \otimes \gamma_{k+1}^{k} \right)^{-1} \otimes q_{k+1}^L$
where $\delta\alpha_{k+1}^{k}$ denotes the displacement error of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L, $\delta\beta_{k+1}^{k}$ denotes the velocity error, and $\delta\theta_{k+1}^{k}$ denotes the rotation error;
constructing the observation error equation of the IMU:
$E_B = \left\| \delta\alpha_{k+1}^{k} \right\|^2 + \left\| \delta\beta_{k+1}^{k} \right\|^2 + \left\| \delta\theta_{k+1}^{k} \right\|^2$
where $E_B$ denotes the observation error of the IMU.
According to the IMU-coupling-based lidar iterative closest point improvement algorithm, the original error equation of lidar point cloud matching constructed from the lidar feature point set is
$E_L = \sum_{i=1}^{n} \left\| (R A_i + t) - B_i \right\|^2$
where $E_L$ denotes the original error of lidar point cloud matching, $R$ the rotation matrix, $A_i$ a point of the source point cloud, $B_i$ the corresponding point of the target point cloud, and $t$ the translation matrix.
According to the IMU-coupling-based lidar iterative closest point improvement algorithm, the joint error equation is
$E = E_L + E_B$
where $E$ denotes the joint error.
According to the IMU-coupling-based lidar iterative closest point improvement algorithm, the ICP iterative computation of the joint error equation comprises:
recording the two adjacent frames of lidar original point clouds as the source point cloud $A_i = (x_i \; y_i \; z_i)$ and the target point cloud $B_j = (x_j \; y_j \; z_j)$, where $i \in \{1, 2, 3, \dots, n\}$ and $j \in \{1, 2, 3, \dots, m\}$; here $(x \; y \; z)$ are the coordinates of a point in the lidar coordinate system L, $n$ is the number of source points, and $m$ is the number of target points;
determining the corresponding corner points and plane points between the two adjacent frames of lidar original point clouds according to the principle of minimum Euclidean distance between two points; and performing ICP iterative computation on the joint error equation with the initial transformation matrix as the initial value to obtain the optimal transformation matrix, i.e., the transformation matrix that minimizes the value of the joint error equation.
According to the IMU-coupling-based lidar iterative closest point improvement algorithm, the ICP iterative computation of the joint error equation further comprises:
transforming the target point cloud with the optimal transformation matrix to obtain a new target point cloud;
calculating the average distance d between the new target point cloud and the source point cloud:
$d = \frac{1}{n} \sum_{i=1}^{n} \left\| B_i' - A_i \right\|$
where $B_i'$ denotes a point of the new target point cloud;
and stopping the iterative computation when the average distance d is smaller than a set distance threshold λ or the number of iterations reaches a set maximum N, taking the optimal transformation matrix obtained at that point as the final optimal transformation matrix.
The invention has the following beneficial effects: to increase the running speed of the ICP algorithm while preserving matching accuracy, the method first extracts the lidar feature point set, which reduces the amount of matching computation and the number of mismatched corresponding points; it then solves for the initial transformation matrix of the ICP algorithm from the original pose of the inertial measurement unit (IMU) and, to reduce errors caused by IMU drift, constructs a joint error equation of the IMU and the lidar; finally, the optimal transformation matrix is computed iteratively.
The invention improves both the running speed and the matching accuracy of the ICP algorithm.
Drawings
FIG. 1 is a flowchart of an embodiment of an IMU coupling-based lidar iterative closest point improvement algorithm according to the present invention;
FIG. 2 is a flow chart of obtaining a set of lidar feature points;
FIG. 3 is a flow chart for constructing an observation error equation for an IMU;
fig. 4 is a flow chart of iterative computation of the ICP algorithm for the joint error equation.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The invention is further described with reference to the following drawings and specific examples, which are not intended to be limiting.
First embodiment: referring to fig. 1, the invention provides an IMU-coupling-based lidar iterative closest point improvement algorithm, which comprises:
synchronizing the clock sources of the lidar and the IMU;
preprocessing the lidar original point cloud and then extracting a feature point set to obtain the lidar feature point set; obtaining a pre-integration result and an initial transformation matrix for two adjacent frames of lidar original point clouds from the IMU original poses corresponding to those two frames;
constructing the original error equation of lidar point cloud matching from the lidar feature point set; constructing the observation error equation of the IMU from the pre-integration result; and then constructing a joint error equation of the IMU and the lidar from the original error equation and the observation error equation;
and performing ICP iterative computation on the joint error equation, with the initial transformation matrix as the initial value and the lidar feature point sets corresponding to the two adjacent frames of lidar original point clouds as input, to obtain the optimal transformation matrix.
As an example, the lidar and the IMU implement clock source synchronization through a time synchronizer.
Further, with reference to fig. 2, the process of preprocessing the lidar original point cloud and then extracting the feature point set to obtain the lidar feature point set comprises:
collecting the lidar original point cloud and removing outliers with a filter; connecting all retained original points into a line in sequence, and for each original point computing the absolute value of the cosine of the angle formed by the lines to its two neighbouring points; if the absolute cosine value is smaller than a preset minimum threshold, the current point is a corner point; if it is greater than or equal to the preset minimum threshold and less than a preset maximum threshold, the current point is a non-feature point; if it is greater than the preset maximum threshold, the current point is a plane point;
the lidar feature point set comprises the corner point set formed by all corner points and the plane point set formed by all plane points.
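For illustration only (this sketch is not part of the original disclosure), the classification rule above can be written in Python as follows; the threshold values and the names classify_points, MIN_THRESH and MAX_THRESH are assumptions made for the example, and the points are assumed to be ordered along one scan line:

```python
import numpy as np

MIN_THRESH = 0.2   # assumed value: below this |cos| the point is a corner
MAX_THRESH = 0.9   # assumed value: above this |cos| the point is a plane point

def classify_points(points):
    """points: (N, 3) array of retained original points, ordered along the scan."""
    corners, planes = [], []
    for i in range(1, len(points) - 1):
        v_prev = points[i - 1] - points[i]     # line to the previous neighbour
        v_next = points[i + 1] - points[i]     # line to the next neighbour
        denom = np.linalg.norm(v_prev) * np.linalg.norm(v_next)
        if denom < 1e-9:
            continue
        abs_cos = abs(np.dot(v_prev, v_next)) / denom
        if abs_cos < MIN_THRESH:
            corners.append(points[i])          # sharp angle -> corner point
        elif abs_cos >= MAX_THRESH:
            planes.append(points[i])           # nearly collinear -> plane point
        # values in between: non-feature point, discarded
    return np.array(corners), np.array(planes)
```

In this sketch a point whose two neighbour lines are nearly perpendicular yields a small absolute cosine and is kept as a corner, while nearly collinear neighbours yield an absolute cosine close to 1 and the point is kept as a plane point.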
Still further, with reference to fig. 3, the specific process of constructing the observation error equation of the IMU from the pre-integration result comprises:
establishing the lidar coordinate system L and the IMU coordinate system B, and transforming the IMU coordinate system B into the lidar coordinate system L;
setting the IMU original pose information as
$x_k^L = \left[\, p_k^L \;\; v_k^L \;\; q_k^L \;\; b_v \;\; b_a \,\right]$
where $x_k^L$ is the IMU state in the lidar coordinate system L corresponding to the k-th frame of lidar original point cloud, comprising the IMU position $p_k^L$, the IMU velocity $v_k^L$, the IMU quaternion $q_k^L$, the IMU velocity drift $b_v$, and the IMU acceleration drift $b_a$;
pre-integrating the IMU state data corresponding to the k-th frame and the (k+1)-th frame of lidar original point cloud to obtain, in the lidar coordinate system L, the noise-containing state information relating the IMU original poses corresponding to those two frames:
$p_{k+1}^L = p_k^L + v_k^L \Delta t_k - \tfrac{1}{2}\, g^L \Delta t_k^{2} + \alpha_{k+1}^{k}$
where $p_{k+1}^L$ is the position of the IMU at the (k+1)-th frame time in the lidar coordinate system L, $p_k^L$ is the position of the IMU at the k-th frame time, $v_k^L$ is the velocity of the IMU at the k-th frame time, $\Delta t_k$ is the time increment from the k-th frame to the (k+1)-th frame, $g^L$ is the gravitational acceleration of the IMU in the lidar coordinate system L, and $\alpha_{k+1}^{k}$ is the pre-integrated displacement of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
the velocity of the IMU at the (k+1)-th frame time in the lidar coordinate system L, $v_{k+1}^L$, is:
$v_{k+1}^L = v_k^L - g^L \Delta t_k + \beta_{k+1}^{k}$
where $\beta_{k+1}^{k}$ is the pre-integrated velocity change of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
which further gives:
$q_{k+1}^L = q_k^L \otimes \gamma_{k+1}^{k}$
where $q_{k+1}^L$ is the (k+1)-th frame IMU quaternion and $\gamma_{k+1}^{k}$ is the pre-integrated rotation of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
constructing the state error equations of the displacement, velocity and angle of the IMU:
$\delta\alpha_{k+1}^{k} = p_{k+1}^L - p_k^L - v_k^L \Delta t_k + \tfrac{1}{2}\, g^L \Delta t_k^{2} - \alpha_{k+1}^{k}$
$\delta\beta_{k+1}^{k} = v_{k+1}^L - v_k^L + g^L \Delta t_k - \beta_{k+1}^{k}$
$\delta\theta_{k+1}^{k} = \left( q_k^L \otimes \gamma_{k+1}^{k} \right)^{-1} \otimes q_{k+1}^L$
where $\delta\alpha_{k+1}^{k}$ denotes the displacement error of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L, $\delta\beta_{k+1}^{k}$ denotes the velocity error, and $\delta\theta_{k+1}^{k}$ denotes the rotation error;
constructing the observation error equation of the IMU:
$E_B = \left\| \delta\alpha_{k+1}^{k} \right\|^2 + \left\| \delta\beta_{k+1}^{k} \right\|^2 + \left\| \delta\theta_{k+1}^{k} \right\|^2$
where $E_B$ denotes the observation error of the IMU.
The main drift of the IMU arises from the double integration of acceleration; therefore, to reduce the amount of computation, the observation error equation of the IMU can be simplified to the form of $E_B$ given above.
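As an illustration of the propagation relations above, the following Python sketch advances position, velocity and orientation from frame k to frame k+1 using the pre-integrated increments α, β, γ. It is a minimal sketch following the equations as reconstructed here, not the patent's implementation; the function names propagate_state and quat_mult are assumptions:

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def propagate_state(p_k, v_k, q_k, alpha, beta, gamma, dt, g_L):
    """Propagate the IMU state from frame k to k+1 in lidar frame L using the
    pre-integrated displacement alpha, velocity change beta and rotation gamma."""
    p_k1 = p_k + v_k * dt - 0.5 * g_L * dt**2 + alpha
    v_k1 = v_k - g_L * dt + beta
    q_k1 = quat_mult(q_k, gamma)
    return p_k1, v_k1, q_k1 / np.linalg.norm(q_k1)   # re-normalize the quaternion
```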
In this embodiment, the IMU original pose information comprises angular velocity, acceleration, quaternion, velocity and displacement, and the pose given by this original information can be represented by the matrix
$T = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$
in which $R = R(q)$ is the rotation matrix expressed in terms of the quaternion $q = (q_0 \; q_1 \; q_2 \; q_3)$, and $t = \left[ S_x \; S_y \; S_z \right]^{T}$ is the translation matrix representing the displacements along the x, y and z directions of the IMU coordinate system B.
Here $q_0$ represents the scale part, $q_1$ the vector component of the rotation about the z-axis, $q_2$ the vector component of the rotation about the y-axis, and $q_3$ the vector component of the rotation about the x-axis.
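A small Python sketch of assembling the 4x4 pose matrix from a quaternion and a translation is given below. Because the published text does not reproduce the explicit expansion of R(q), the standard scalar-first (Hamilton) quaternion-to-rotation formula is assumed here, and the function names are illustrative:

```python
import numpy as np

def quat_to_rot(q0, q1, q2, q3):
    """Rotation matrix from a unit quaternion with scalar part q0.
    Standard Hamilton convention, assumed for illustration."""
    return np.array([
        [1 - 2*(q2**2 + q3**2), 2*(q1*q2 - q0*q3),     2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),     1 - 2*(q1**2 + q3**2), 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),     2*(q2*q3 + q0*q1),     1 - 2*(q1**2 + q2**2)],
    ])

def pose_matrix(q, t):
    """Assemble T = [[R, t], [0, 1]] from quaternion q and translation t."""
    T = np.eye(4)
    T[:3, :3] = quat_to_rot(*q)
    T[:3, 3] = t
    return T
```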
The original error equation of lidar point cloud matching constructed from the lidar feature point set is
$E_L = \sum_{i=1}^{n} \left\| (R A_i + t) - B_i \right\|^2$
where $R$, $A_i$, $B_i$ and $t$ denote the rotation matrix, the i-th point of the source point cloud, the i-th point of the target point cloud, and the translation matrix, respectively.
The initial transformation matrix of the two adjacent frames of lidar original point clouds, $T_{\mathrm{init}}$, is
$T_{\mathrm{init}} = T_k^{-1}\, T_{k+1}$
where $T_k^{-1}$ is the inverse of the IMU pose corresponding to the k-th frame point cloud and $T_{k+1}$ is the IMU pose corresponding to the (k+1)-th frame point cloud.
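Under the pose-matrix representation above, obtaining the initial transformation amounts to one matrix inversion and one product; a minimal sketch (the function name initial_transform is hypothetical) is:

```python
import numpy as np

def initial_transform(T_k, T_k1):
    """Initial ICP transform between frames k and k+1:
    T_init = inv(T_k) @ T_{k+1}, both poses expressed in lidar frame L."""
    return np.linalg.inv(T_k) @ T_k1
```

This IMU-derived matrix seeds the ICP iteration instead of the identity, which is what helps avoid convergence to a local optimum.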
The joint error equation is:
$E = E_L + E_B$
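Evaluating the joint error for a candidate (R, t) can be sketched as follows; the unweighted sum of the lidar term and the IMU term is an assumption made for illustration (the patent does not state how the two terms are weighted), and the function names are hypothetical:

```python
import numpy as np

def lidar_error(R, t, A, B):
    """E_L: sum of squared distances between transformed source points and
    their matched target points. A, B are (n, 3) arrays of matched pairs."""
    return np.sum(np.linalg.norm((A @ R.T + t) - B, axis=1) ** 2)

def joint_error(R, t, A, B, E_B):
    """Joint error E = E_L + E_B, with E_B the pre-computed IMU observation error."""
    return lidar_error(R, t, A, B) + E_B
```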
still further, as shown in fig. 4, the iterative computation of the ICP algorithm on the joint error equation includes:
recording the original point clouds of two adjacent frames of laser radars as source point cloud Ai=(xi yi zi) And a target point cloud Bj=(xj yjzj) Wherein i belongs to (1,2, 3., n), j belongs to (1,2, 3., m), in the formula, (x y z) is a coordinate value of the point cloud under a laser radar coordinate system L, n represents the quantity of source point clouds, and m represents the quantity of target point clouds;
determining an angular point set and a plane point set corresponding to two adjacent frames of laser radar original point clouds according to a Euclidean distance minimum principle between two points; and performing ICP algorithm iterative computation on the joint error equation by taking the conversion initial matrix as an initial value to obtain an optimal transformation matrix, wherein the optimal transformation matrix enables the joint error equation value to be minimum.
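One way to realize the correspondence search and the per-iteration update is sketched below. The nearest-neighbour pairing follows the minimum-Euclidean-distance rule above; the closed-form SVD (Kabsch) solution for R and t minimizes only the lidar term E_L and is an assumption about the inner solver, not the patent's exact formulation, and all function names are illustrative:

```python
import numpy as np

def nearest_neighbours(source, target):
    """For each source point, the index of its closest target point (brute force)."""
    d2 = ((source[:, None, :] - target[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)

def best_rigid_transform(A, B):
    """Least-squares R, t aligning A onto B (Kabsch / SVD solution)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp_step(source, target):
    """One ICP iteration: pair points, then solve for R, t in closed form."""
    idx = nearest_neighbours(source, target)
    return best_rigid_transform(source, target[idx])
```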
Still further, as shown in fig. 4, the ICP iterative computation of the joint error equation further comprises:
transforming the target point cloud with the optimal transformation matrix to obtain a new target point cloud;
calculating the average distance d between the new target point cloud and the source point cloud:
$d = \frac{1}{n} \sum_{i=1}^{n} \left\| B_i' - A_i \right\|$
where $B_i'$ denotes a point of the new target point cloud;
and stopping the iterative computation when the average distance d is smaller than a set distance threshold λ or the number of iterations reaches a set maximum N, taking the optimal transformation matrix obtained at that point as the final optimal transformation matrix.
In this embodiment, after the average distance d is obtained in each iteration, the iterative computation may be stopped and the optimal transformation matrix output as soon as either condition is met: d falls below the distance threshold λ, or the set number of iterations N is reached.
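The stopping rule can be wrapped around the per-iteration step as in the following sketch, which reuses the hypothetical nearest_neighbours and best_rigid_transform helpers from the previous example; dist_thresh plays the role of λ and max_iters of N, and both default values are purely illustrative:

```python
import numpy as np

def run_icp(source, target, R_init, t_init, dist_thresh=0.05, max_iters=30):
    """Iterate ICP from the IMU-derived initial estimate (R_init, t_init) until
    the average distance d < dist_thresh or max_iters iterations are reached."""
    R, t = R_init, t_init
    for _ in range(max_iters):
        # bring the target cloud into the source frame with the current estimate
        new_target = (target - t) @ R                   # row-wise R^T (B - t)
        idx = nearest_neighbours(source, new_target)    # pair A_i with its closest B'_j
        # re-estimate the transform from the matched pairs
        R, t = best_rigid_transform(source, target[idx])
        # average distance d between the re-transformed target and the source
        d = np.linalg.norm((target[idx] - t) @ R - source, axis=1).mean()
        if d < dist_thresh:                             # d < lambda: stop early
            break
    return R, t
```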
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. It should be understood that features described in different dependent claims and herein may be combined in ways different from those described in the original claims. It is also to be understood that features described in connection with individual embodiments may be used in other described embodiments.

Claims (8)

1. An IMU-coupling-based lidar iterative closest point improvement algorithm, characterized by comprising the following steps:
synchronizing the clock sources of the lidar and the IMU;
preprocessing the lidar original point cloud and then extracting a feature point set to obtain the lidar feature point set; obtaining a pre-integration result and an initial transformation matrix for two adjacent frames of lidar original point clouds from the IMU original poses corresponding to those two frames;
constructing the original error equation of lidar point cloud matching from the lidar feature point set; constructing the observation error equation of the IMU from the pre-integration result; and then constructing a joint error equation of the IMU and the lidar from the original error equation and the observation error equation;
and performing ICP iterative computation on the joint error equation, with the initial transformation matrix as the initial value and the lidar feature point sets corresponding to the two adjacent frames of lidar original point clouds as input, to obtain the optimal transformation matrix.
2. The IMU-coupling-based lidar iterative closest point improvement algorithm according to claim 1, wherein the lidar and the IMU achieve clock source synchronization through a time synchronizer.
3. The IMU-coupling-based lidar iterative closest point improvement algorithm according to claim 1 or 2, wherein the process of preprocessing the lidar original point cloud and then extracting the feature point set to obtain the lidar feature point set comprises:
collecting the lidar original point cloud and removing outliers with a filter; connecting all retained original points into a line in sequence, and for each original point computing the absolute value of the cosine of the angle formed by the lines to its two neighbouring points; if the absolute cosine value is smaller than a preset minimum threshold, the current point is a corner point; if it is greater than or equal to the preset minimum threshold and less than a preset maximum threshold, the current point is a non-feature point; if it is greater than the preset maximum threshold, the current point is a plane point;
the lidar feature point set comprises the corner point set formed by all corner points and the plane point set formed by all plane points.
4. The IMU-coupling-based lidar iterative closest point improvement algorithm according to claim 3, wherein the specific process of constructing the observation error equation of the IMU from the pre-integration result comprises:
establishing the lidar coordinate system L and the IMU coordinate system B, and transforming the IMU coordinate system B into the lidar coordinate system L;
setting the IMU original pose information as
$x_k^L = \left[\, p_k^L \;\; v_k^L \;\; q_k^L \;\; b_v \;\; b_a \,\right]$
where $x_k^L$ is the IMU state in the lidar coordinate system L corresponding to the k-th frame of lidar original point cloud, comprising the IMU position $p_k^L$, the IMU velocity $v_k^L$, the IMU quaternion $q_k^L$, the IMU velocity drift $b_v$, and the IMU acceleration drift $b_a$;
pre-integrating the IMU state data corresponding to the k-th frame and the (k+1)-th frame of lidar original point cloud to obtain, in the lidar coordinate system L, the noise-containing state information relating the IMU original poses corresponding to those two frames:
$p_{k+1}^L = p_k^L + v_k^L \Delta t_k - \tfrac{1}{2}\, g^L \Delta t_k^{2} + \alpha_{k+1}^{k}$
where $p_{k+1}^L$ is the position of the IMU at the (k+1)-th frame time in the lidar coordinate system L, $p_k^L$ is the position of the IMU at the k-th frame time, $v_k^L$ is the velocity of the IMU at the k-th frame time, $\Delta t_k$ is the time increment from the k-th frame to the (k+1)-th frame, $g^L$ is the gravitational acceleration of the IMU in the lidar coordinate system L, and $\alpha_{k+1}^{k}$ is the pre-integrated displacement of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
the velocity of the IMU at the (k+1)-th frame time in the lidar coordinate system L, $v_{k+1}^L$, is:
$v_{k+1}^L = v_k^L - g^L \Delta t_k + \beta_{k+1}^{k}$
where $\beta_{k+1}^{k}$ is the pre-integrated velocity change of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
which further gives:
$q_{k+1}^L = q_k^L \otimes \gamma_{k+1}^{k}$
where $q_{k+1}^L$ is the (k+1)-th frame IMU quaternion and $\gamma_{k+1}^{k}$ is the pre-integrated rotation of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L;
constructing the state error equations of the displacement, velocity and angle of the IMU:
$\delta\alpha_{k+1}^{k} = p_{k+1}^L - p_k^L - v_k^L \Delta t_k + \tfrac{1}{2}\, g^L \Delta t_k^{2} - \alpha_{k+1}^{k}$
$\delta\beta_{k+1}^{k} = v_{k+1}^L - v_k^L + g^L \Delta t_k - \beta_{k+1}^{k}$
$\delta\theta_{k+1}^{k} = \left( q_k^L \otimes \gamma_{k+1}^{k} \right)^{-1} \otimes q_{k+1}^L$
where $\delta\alpha_{k+1}^{k}$ denotes the displacement error of the IMU from the k-th frame to the (k+1)-th frame in the lidar coordinate system L, $\delta\beta_{k+1}^{k}$ denotes the velocity error, and $\delta\theta_{k+1}^{k}$ denotes the rotation error;
constructing the observation error equation of the IMU:
$E_B = \left\| \delta\alpha_{k+1}^{k} \right\|^2 + \left\| \delta\beta_{k+1}^{k} \right\|^2 + \left\| \delta\theta_{k+1}^{k} \right\|^2$
where $E_B$ denotes the observation error of the IMU.
5. The IMU-coupling-based lidar iterative closest point improvement algorithm according to claim 4, wherein the original error equation of lidar point cloud matching constructed from the lidar feature point set is
$E_L = \sum_{i=1}^{n} \left\| (R A_i + t) - B_i \right\|^2$
where $E_L$ denotes the original error of lidar point cloud matching, $R$ the rotation matrix, $A_i$ a point of the source point cloud, $B_i$ the corresponding point of the target point cloud, and $t$ the translation matrix.
6. The IMU-coupling-based lidar iterative closest point improvement algorithm according to claim 5, wherein the joint error equation is
$E = E_L + E_B$
where $E$ denotes the joint error.
7. The IMU-coupling-based lidar iterative closest point improvement algorithm according to claim 6, wherein the ICP iterative computation of the joint error equation comprises:
recording the two adjacent frames of lidar original point clouds as the source point cloud $A_i = (x_i \; y_i \; z_i)$ and the target point cloud $B_j = (x_j \; y_j \; z_j)$, where $i \in \{1, 2, 3, \dots, n\}$ and $j \in \{1, 2, 3, \dots, m\}$; here $(x \; y \; z)$ are the coordinates of a point in the lidar coordinate system L, $n$ is the number of source points, and $m$ is the number of target points;
determining the corresponding corner points and plane points between the two adjacent frames of lidar original point clouds according to the principle of minimum Euclidean distance between two points; and performing ICP iterative computation on the joint error equation with the initial transformation matrix as the initial value to obtain the optimal transformation matrix, i.e., the transformation matrix that minimizes the value of the joint error equation.
8. The IMU-coupling-based lidar iterative closest point improvement algorithm according to claim 7, wherein the ICP iterative computation of the joint error equation further comprises:
transforming the target point cloud with the optimal transformation matrix to obtain a new target point cloud;
calculating the average distance d between the new target point cloud and the source point cloud:
$d = \frac{1}{n} \sum_{i=1}^{n} \left\| B_i' - A_i \right\|$
where $B_i'$ denotes a point of the new target point cloud;
and stopping the iterative computation when the average distance d is smaller than a set distance threshold λ or the number of iterations reaches a set maximum N, taking the optimal transformation matrix obtained at that point as the final optimal transformation matrix.
CN202110032151.5A 2021-01-11 2021-01-11 Laser radar iteration closest point improvement algorithm based on IMU coupling Active CN112781594B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110032151.5A CN112781594B (en) 2021-01-11 2021-01-11 Laser radar iteration closest point improvement algorithm based on IMU coupling

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110032151.5A CN112781594B (en) 2021-01-11 2021-01-11 Laser radar iteration closest point improvement algorithm based on IMU coupling

Publications (2)

Publication Number Publication Date
CN112781594A (en) 2021-05-11
CN112781594B (en) 2022-08-19

Family

ID=75756514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110032151.5A Active CN112781594B (en) 2021-01-11 2021-01-11 Laser radar iteration closest point improvement algorithm based on IMU coupling

Country Status (1)

Country Link
CN (1) CN112781594B (en)



Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180356825A1 (en) * 2017-06-13 2018-12-13 TuSimple UNDISTORTED RAW LiDAR SCANS AND STATIC POINT EXTRACTIONS METHOD FOR GROUND TRUTH STATIC SCENE SPARSE FLOW GENERATION
WO2019205813A1 (en) * 2018-04-26 2019-10-31 广东宝乐机器人股份有限公司 Method for correcting gyroscope data of mobile robot, device, and storage medium
WO2020041668A1 (en) * 2018-08-23 2020-02-27 The Regents Of The University Of California Signals of opportunity aided inertial navigation
CN110009739A (en) * 2019-01-29 2019-07-12 浙江省北大信息技术高等研究院 The extraction and coding method of the motion feature of the digital retina of mobile camera
WO2020168787A1 (en) * 2019-02-20 2020-08-27 苏州风图智能科技有限公司 Method and device for determining pose of vehicle body, and drafting method
CN110658530A (en) * 2019-08-01 2020-01-07 北京联合大学 Map construction method and system based on double-laser-radar data fusion and map
CN111045438A (en) * 2019-10-21 2020-04-21 武汉大学 Shipborne self-stabilizing platform and control system and method thereof
CN110879400A (en) * 2019-11-27 2020-03-13 炬星科技(深圳)有限公司 Method, equipment and storage medium for fusion positioning of laser radar and IMU
CN111207774A (en) * 2020-01-17 2020-05-29 山东大学 Method and system for laser-IMU external reference calibration
CN111273269A (en) * 2020-02-18 2020-06-12 桂林电子科技大学 IPSO-BP-based radar target positioning method of frequency diversity array
CN111461981A (en) * 2020-03-30 2020-07-28 北京百度网讯科技有限公司 Error estimation method and device for point cloud splicing algorithm
CN111522043A (en) * 2020-04-30 2020-08-11 北京联合大学 Unmanned vehicle laser radar rapid re-matching positioning method
CN111982091A (en) * 2020-07-09 2020-11-24 安徽博龙动力科技股份有限公司 Laser point cloud distortion correction method based on synchronous IMU
CN111812669A (en) * 2020-07-16 2020-10-23 南京航空航天大学 Winding inspection device, positioning method thereof and storage medium
CN111951341A (en) * 2020-08-25 2020-11-17 桂林电子科技大学 Closed loop detection improvement method based on RGB-D SLAM

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
WANLI LIU: "LiDAR-IMU Time Delay Calibration Based on", Sensors *
WU XIRU et al.: "Robot RGB-D SLAM algorithm with improved ORB features", Computer Engineering and Applications *
ZHOU GUANGZHAO et al.: "3D lidar point cloud descriptor based on soft coding of structural units in structured environments", Robot *
LI SHUAIXIN et al.: "Real-time localization method with tightly coupled LiDAR/IMU", Acta Automatica Sinica *
SHU QINGBO et al.: "A brief analysis of error checking methods for airborne lidar measurement systems", Low Carbon World *
JIANG CHENGCHENG et al.: "An improved iterative closest point algorithm", Computer Systems & Applications *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113391300A (en) * 2021-05-21 2021-09-14 中国矿业大学 Laser radar three-dimensional point cloud real-time motion compensation method based on IMU
CN113391300B (en) * 2021-05-21 2022-02-01 中国矿业大学 Laser radar three-dimensional point cloud real-time motion compensation method based on IMU
CN113625302A (en) * 2021-09-03 2021-11-09 国网山东省电力公司济宁供电公司 Underground corridor dangerous area identification method and system based on handheld laser radar
CN113625301A (en) * 2021-09-03 2021-11-09 国网山东省电力公司济宁供电公司 Method and system for enlarging mapping view field based on single scanning head laser radar
CN113625301B (en) * 2021-09-03 2024-01-19 国网山东省电力公司济宁供电公司 Method and system for expanding view field of building map based on single-scanning-head laser radar
CN113625302B (en) * 2021-09-03 2024-05-21 国网山东省电力公司济宁供电公司 Underground corridor dangerous area identification method and system based on handheld laser radar
CN114660589A (en) * 2022-03-25 2022-06-24 中国铁建重工集团股份有限公司 Method, system and device for positioning underground tunnel
CN114660589B (en) * 2022-03-25 2023-03-10 中国铁建重工集团股份有限公司 Method, system and device for positioning underground tunnel
CN114723795A (en) * 2022-04-18 2022-07-08 长春工业大学 Bucket wheel machine unmanned operation positioning and mapping method based on improved nearest point registration
CN115683170A (en) * 2023-01-04 2023-02-03 成都西物信安智能系统有限公司 Calibration method based on radar point cloud data fusion error

Also Published As

Publication number Publication date
CN112781594B (en) 2022-08-19

Similar Documents

Publication Publication Date Title
CN112781594B (en) Laser radar iteration closest point improvement algorithm based on IMU coupling
CN112347840B (en) Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
CN112649016B (en) Visual inertial odometer method based on dotted line initialization
CN109709801B (en) Indoor unmanned aerial vehicle positioning system and method based on laser radar
CN112268559B (en) Mobile measurement method for fusing SLAM technology in complex environment
CN112304307A (en) Positioning method and device based on multi-sensor fusion and storage medium
CN111665512B (en) Ranging and mapping based on fusion of 3D lidar and inertial measurement unit
WO2020063878A1 (en) Data processing method and apparatus
CN115265523A (en) Robot simultaneous positioning and mapping method, device and readable medium
CN107144278B (en) Lander visual navigation method based on multi-source characteristics
CN114323033B (en) Positioning method and equipment based on lane lines and feature points and automatic driving vehicle
CN109300143A (en) Determination method, apparatus, equipment, storage medium and the vehicle of motion vector field
CN115407357A (en) Low-beam laser radar-IMU-RTK positioning mapping algorithm based on large scene
CN113674412B (en) Pose fusion optimization-based indoor map construction method, system and storage medium
CN115272596A (en) Multi-sensor fusion SLAM method oriented to monotonous texture-free large scene
CN113763549A (en) Method, device and storage medium for simultaneous positioning and mapping by fusing laser radar and IMU
CN108613675B (en) Low-cost unmanned aerial vehicle movement measurement method and system
CN116429116A (en) Robot positioning method and equipment
CN117232499A (en) Multi-sensor fusion point cloud map construction method, device, equipment and medium
CN115218889A (en) Multi-sensor indoor positioning method based on dotted line feature fusion
CN114690229A (en) GPS-fused mobile robot visual inertial navigation method
CN114754768A (en) Visual inertial navigation method based on point-line fusion
CN113959437A (en) Measuring method and system for mobile measuring equipment
CN114459474B (en) Inertial/polarization/radar/optical-fluidic combined navigation method based on factor graph
CN114543786B (en) Wall climbing robot positioning method based on visual inertial odometer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20210511

Assignee: Guilin Qunmei Technology Co.,Ltd.

Assignor: GUILIN University OF ELECTRONIC TECHNOLOGY

Contract record no.: X2023980044664

Denomination of invention: Improved Iterative Nearest Point Algorithm for LiDAR Based on IMU Coupling

Granted publication date: 20220819

License type: Common License

Record date: 20231031