CN115097419A - Extrinsic parameter calibration method and device from lidar to IMU - Google Patents
Extrinsic parameter calibration method and device from lidar to IMU
- Publication number
- CN115097419A (application CN202210498192.8A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- cloud data
- frames
- imu
- laser radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
Abstract
The application relates to the technical field of lidar calibration and provides an extrinsic parameter calibration method from a lidar to an IMU, which comprises the following steps: acquiring N consecutive frames of point cloud data detected by the lidar, where N ≥ 2; performing feature extraction on the N consecutive frames of point cloud data and determining the feature points corresponding to the N frames; projecting the N frames of point cloud data into a first coordinate system according to a preset conversion relation between the IMU and the first coordinate system; and determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix from the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame of point cloud data among the N frames, where 1 ≤ K < N. Based on the two sensors, the lidar and the IMU, the method can optimize and fuse the data they acquire and thereby provide accurate surrounding environment data for an automatic driving system.
Description
Technical Field
The application belongs to the technical field of lidar calibration, and particularly relates to an extrinsic parameter calibration method and device from a lidar to an IMU (Inertial Measurement Unit).
Background
In an automatic driving system, the accuracy with which sensors perceive the surrounding environment largely determines how stably the system runs. To improve this accuracy, complementary multi-sensor cooperation is often adopted to collect more information about the surroundings, and the data acquired by the multiple sensors are then fused to obtain reliable surrounding environment data.
A lidar can acquire 3D point cloud data containing accurate depth and reflection intensity information, and an Inertial Measurement Unit (IMU) can stably observe the motion state of the vehicle body and output the driving state of the vehicle at high frequency. However, in scenes whose environmental features look alike, the lidar may acquire ambiguous point cloud data from which features cannot subsequently be extracted well, and the data output by the IMU may be affected by temperature and noise so that the output driving state data is biased. Fusing the two sensors can therefore overcome the shortcomings of a single sensor, but data acquired by heterogeneous sensors differ greatly, and how to optimize the fusion of the lidar and the IMU so as to provide accurate surrounding environment data is a problem to be solved urgently.
Disclosure of Invention
The embodiments of the application provide an extrinsic parameter calibration method and device from a lidar to an IMU (Inertial Measurement Unit), which can optimize and fuse the data acquired by the two sensors and provide accurate surrounding environment data for an automatic driving system.
In a first aspect, an embodiment of the present application provides an extrinsic parameter calibration method from a lidar to an IMU, the method comprising: acquiring N consecutive frames of point cloud data detected by the lidar, where N ≥ 2; performing feature extraction on the N consecutive frames of point cloud data and determining the feature points corresponding to the N frames; projecting the N frames of point cloud data into a first coordinate system according to a preset conversion relation between the IMU and the first coordinate system; and determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix from the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame, where 1 ≤ K < N.
In a possible implementation of the first aspect, acquiring the N consecutive frames of point cloud data detected by the lidar, N ≥ 2, comprises: acquiring the point cloud data detected by the lidar; and dividing the point cloud data into M sliding windows, each sliding window comprising N frames of point cloud data, where M ≥ 1 and N ≥ 2.
In a possible implementation of the first aspect, performing feature extraction on the N consecutive frames of point cloud data and determining the corresponding feature points comprises: dividing each of the N frames of point cloud data into a plurality of first voxel grids according to a first preset resolution; and determining the feature points corresponding to each frame of point cloud data according to the covariance matrix corresponding to each of the first voxel grids.
In a possible implementation of the first aspect, determining the feature points corresponding to each frame of point cloud data according to the covariance matrix corresponding to each first voxel grid comprises: calculating the eigenvalues of the covariance matrix corresponding to each first voxel grid and determining the feature points corresponding to the point cloud data within it;
if the feature points corresponding to the point cloud data in a first voxel grid belong to the same feature, retaining that first voxel grid;
if they do not belong to the same feature, dividing the first voxel grid into a plurality of second voxel grids according to a second preset resolution and determining the feature points corresponding to the point cloud data in each second voxel grid, until the feature points of the point cloud data in every first voxel grid are determined, the second preset resolution being smaller than the first preset resolution.
In a possible implementation of the first aspect, the method further comprises: if the number of point cloud points in a first voxel grid is greater than a preset threshold, taking the mean of all point cloud data in that grid and determining the feature points of the point cloud data in each first voxel grid from it.
In a possible implementation of the first aspect, after feature extraction is performed on the N consecutive frames of point cloud data and the corresponding feature points are determined, the method further comprises:
establishing a hash table from the association between the N frames of point cloud data and their corresponding feature points.
In a possible implementation of the first aspect, determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix from the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame, 1 ≤ K < N, comprises:
minimizing, in the first coordinate system, the sum of distances from the feature points corresponding to the K-th frame of point cloud data to the features corresponding to the 1st frame among the N frames, to obtain the lidar-to-IMU extrinsic parameter matrix, where 1 ≤ K < N.
In a second aspect, an embodiment of the present application provides an extrinsic parameter calibration device from a lidar to an IMU, comprising: an acquisition unit for acquiring N consecutive frames of point cloud data detected by the lidar, where N ≥ 2;
a first determining unit for performing feature extraction on the N consecutive frames of point cloud data and determining the feature points corresponding to the N frames;
a conversion unit for projecting the N frames of point cloud data into a first coordinate system according to a preset conversion relation between the IMU and the first coordinate system;
and a second determining unit for determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix from the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame, where 1 ≤ K < N.
In a third aspect, the present application provides a terminal device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the method according to the first aspect or any alternative manner of the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to the first aspect or any alternative manner of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the method according to the first aspect or any alternative manner of the first aspect.
Compared with the prior art, the embodiment of the application has the advantages that:
according to the method, the characteristic extraction is carried out on multi-frame point cloud data detected by a laser radar to obtain characteristic points corresponding to the multi-frame point cloud data, then the multi-frame point cloud data are converted to the same first coordinate system based on the conversion relation between the first frame point cloud data and other frame point cloud data in the multi-frame point cloud data, and finally an extrinsic reference conversion matrix from the laser radar to the IMU is obtained according to the characteristic points of each frame point cloud data in the point cloud data and the characteristic points corresponding to the first frame point cloud data in the same first coordinate system. According to the method, fusion processing is not directly carried out on data acquired by the laser radar and IMU data, the problem that a calibration result is inaccurate due to an accumulated error of the IMU data is solved, the data acquired by the laser radar is processed through optimization based on the laser radar and the IMU, and accurate surrounding environment data is provided for an automatic driving system.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required by the embodiments or the prior-art description are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without inventive labor.
FIG. 1 is a schematic diagram of the sensing ranges of multiple sensors provided by an embodiment of the present application;
FIG. 2 is a schematic flow chart of filtering-based lidar-to-IMU extrinsic parameter estimation according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart of an extrinsic parameter calibration method from a lidar to an IMU according to an embodiment of the present application;
FIG. 4 is a schematic view of a scene for acquiring multiple frames of point cloud data detected by a lidar according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating the effect of feature extraction according to an embodiment of the present application;
FIG. 6 is a diagram illustrating a first coordinate system transformation process according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating the effect of processing acquired point cloud data with an unoptimized extrinsic parameter matrix according to an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating the effect of processing the same point cloud data with the optimized extrinsic parameter matrix according to an embodiment of the present application;
FIG. 9 is a structural block diagram of an extrinsic parameter calibration device from a lidar to an IMU according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In an automatic driving system, the accuracy with which sensors perceive the surrounding environment largely determines how stably the system runs. To improve this accuracy, multi-sensor calibration technology is often adopted to collect more information about the surroundings, and the data acquired by the multiple sensors are then fused to obtain reliable surrounding environment data.
In multi-sensor calibration, the individual sensors differ in their ability to perceive the surroundings and in their sensing ranges, and the calibration process consists of determining the relative positional relationships among the sensors. FIG. 1 shows a schematic diagram of the sensing ranges of multiple sensors provided in an embodiment of the present application: Four-Layer laser scanner denotes a four-layer laser scanning sensor, Single-Layer laser scanner a single-layer laser scanning sensor, Radar a millimeter-wave radar sensor, Ultrasonic an ultrasonic sensor, and Imu an inertial measurement sensor.
The lidar can acquire 3D point cloud data containing accurate depth and reflection intensity information, and the IMU can stably observe the motion state of the vehicle body and output the driving state of the vehicle at high frequency. However, in scenes whose environmental features look alike, the lidar may acquire ambiguous point cloud data from which features cannot subsequently be extracted well, and the data output by the IMU may be affected by temperature and noise so that the output driving state data is biased. Fusing the two sensors can therefore overcome the shortcomings of a single sensor, but data acquired by heterogeneous sensors differ greatly, and how to optimize the fusion of the lidar and the IMU so as to provide accurate surrounding environment data is a problem to be solved urgently.
In the prior art there are two calibration technologies for the lidar and IMU sensors. One is a filtering-based extrinsic calibration method, which solves the pose of the lidar and the pose of the IMU separately and then fuses the two pose estimates with a Kalman filtering algorithm.
As shown in fig. 2, a schematic flow chart of filtering-based lidar-to-IMU extrinsic parameter estimation provided in an embodiment of the present application, the filtering-based extrinsic calibration method determines the relationship between the lidar coordinate system and the IMU coordinate system at two adjacent instants. Specifically, mark the first of the two adjacent instants as $k$ and the second as $k+1$; let $L_k$ and $I_k$ denote the coordinate systems of the lidar and the IMU at time $k$, and $L_{k+1}$ and $I_{k+1}$ the corresponding coordinate systems at time $k+1$; let the transformation from the IMU to the lidar coordinate system be $R_I^L$ and the rotation of the lidar from time $k$ to time $k+1$ be $R_{L_k}^{L_{k+1}}$. If the transformation $R_I^L$ from the IMU to the lidar coordinate system is known, the rotation of the lidar from time $k$ to time $k+1$ can be determined by the following formula (1):

$$R_{L_k}^{L_{k+1}} \, R_I^L = R_I^L \, R_{I_k}^{I_{k+1}} \tag{1}$$

Formula (1) can be solved with the help of quaternions; expressed with quaternions, formula (1) becomes the following formula (2):

$$q_{L_k}^{L_{k+1}} \otimes q_I^L = q_I^L \otimes q_{I_k}^{I_{k+1}} \tag{2}$$

where $\otimes$ denotes the quaternion product. Multiplying out the left- and right-hand sides of formula (2) with the quaternion left- and right-multiplication matrices $\mathcal{L}(\cdot)$ and $\mathcal{R}(\cdot)$ yields the following formula (3):

$$\left[ \mathcal{L}\big(q_{L_k}^{L_{k+1}}\big) - \mathcal{R}\big(q_{I_k}^{I_{k+1}}\big) \right] q_I^L = Q \, q_I^L = 0 \tag{3}$$

If several groups of data detected by the lidar correspond to data acquired by the IMU, an over-determined system shown in the following formula (4) can be obtained:

$$\begin{bmatrix} \omega_1 Q_1 \\ \vdots \\ \omega_n Q_n \end{bmatrix} q_I^L = 0 \tag{4}$$

In formula (4), $\omega_j$ is the weight of the $j$-th group of data, determined by the difference between the rotation of the lidar between the two adjacent instants and the rotation derived from the IMU coordinate system, and $Q_j$ is the matrix $Q$ of formula (3) for that group.

Finally, formula (4) is solved by singular value decomposition, and the singular vector corresponding to the smallest singular value is taken to obtain the conversion between the lidar and IMU coordinate systems shown in fig. 2.
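The SVD step above can be made concrete with a short sketch. The following is an illustrative reconstruction, not code from the patent: it assumes the (w, x, y, z) quaternion convention, and the per-group weights and function names are hypothetical.

```python
import numpy as np

def quat_left(q):
    """Left-multiplication matrix L(q) with q = (w, x, y, z): q ⊗ p = L(q) @ p."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def quat_right(q):
    """Right-multiplication matrix R(q): p ⊗ q = R(q) @ p."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def solve_extrinsic_rotation(lidar_dqs, imu_dqs, weights):
    """Stack w_j * (L(q_lidar_j) - R(q_imu_j)) as in formula (4) and solve by SVD.

    lidar_dqs, imu_dqs: lists of unit quaternions (w, x, y, z) giving the relative
    rotation between two adjacent instants for each sensor; weights: the ω_j values.
    Returns the unit quaternion of the IMU-lidar extrinsic rotation.
    """
    A = np.vstack([w * (quat_left(ql) - quat_right(qi))
                   for ql, qi, w in zip(lidar_dqs, imu_dqs, weights)])
    # The right singular vector of the smallest singular value solves A q ≈ 0.
    _, _, vt = np.linalg.svd(A)
    q = vt[-1]
    return q / np.linalg.norm(q)
```

The right singular vector belonging to the smallest singular value is the least-squares null-space solution of the stacked system in formula (4).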
In this method, the extrinsic calibration parameters between the data detected by the lidar and the data acquired by the IMU are not obtained in the vehicle body coordinate system; moreover, the IMU data carries accumulated error and is not optimized, so accurate calibration parameters cannot be obtained.
The other is an optimization-based extrinsic calibration method. It converts the data detected by the lidar into the IMU coordinate system with an assumed set of six-degree-of-freedom parameters to form point cloud data, determines nearest-neighbor data points between the point clouds of two scans with a kd-tree (k-dimensional tree) or an octree, and sums the distances of the nearest-neighbor pairs to obtain a total distance. When the total distance is minimal, the two converted point clouds are considered to coincide, and the corresponding six-degree-of-freedom parameters are taken as the lidar-to-IMU calibration parameters. Although this method accounts for the accumulated error in the IMU data, it has to build a corresponding model and compute the total nearest-neighbor distance, which prolongs the computation and cannot guarantee real-time data processing.
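As an illustration of the nearest-neighbor objective this prior-art method minimizes, the following sketch computes the total distance with SciPy's cKDTree; the function name and the assumption that both scans are already expressed in the IMU frame under a candidate six-degree-of-freedom parameter set are illustrative, not taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def total_nn_distance(scan_a, scan_b):
    """Sum of nearest-neighbor distances from scan_b points to scan_a points.

    scan_a, scan_b: (N, 3) arrays of points, each already transformed into the
    IMU frame with a candidate six-degree-of-freedom extrinsic parameter set.
    """
    tree = cKDTree(scan_a)
    dists, _ = tree.query(scan_b)  # nearest neighbor in scan_a for each point
    return dists.sum()
```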
Therefore, in order to solve the above problems, an embodiment of the present application provides an extrinsic parameter calibration method from a lidar to an IMU. The method performs feature extraction on the multiple frames of point cloud data detected by the lidar to obtain their corresponding feature points, converts the frames into the same first coordinate system based on the conversion relation between the 1st frame and the other frames, and finally obtains the lidar-to-IMU extrinsic transformation matrix, in that first coordinate system, from the feature points of each frame and the feature points corresponding to the 1st frame. The method does not fuse the lidar data with the IMU data directly, which avoids inaccurate calibration results caused by the accumulated error of the IMU data; instead, the lidar data is optimized on the basis of the two sensors, providing accurate surrounding environment data for the automatic driving system.
In addition, in the same first coordinate system, the method minimizes the sum of distances from the feature points corresponding to the K-th frame of point cloud data to the features corresponding to the 1st frame to obtain the lidar-to-IMU extrinsic parameter matrix; the matrix obtained is thus an optimized extrinsic transformation matrix, which further guarantees its accuracy.
The technical solution of the present application will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 3 is a schematic flow chart of an extrinsic parameter calibration method from a lidar to an IMU according to an embodiment of the present application; the method includes:
s101, acquiring continuous N frames of point cloud data detected by a laser radar, wherein N is more than or equal to 2.
It should be understood that point cloud data refers to a collection of vectors in a three-dimensional coordinate system. In this step, three-dimensional coordinates corresponding to each point cloud point in the point cloud data are essentially obtained.
In order to accelerate the processing of the point cloud data, in one possible implementation, acquiring the N frames of point cloud data detected by the lidar, N ≥ 2, includes: acquiring the point cloud data detected by the lidar; and dividing the point cloud data into M sliding windows, each sliding window comprising N frames of point cloud data, where M ≥ 1 and N ≥ 2.
For example, if the point cloud data detected by the lidar comprises 10 frames, the 10 frames may be divided into two sliding windows of 5 frames each, and the windows are processed in sequence, as sketched below.
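A minimal sketch of this windowing step, assuming frames arrive as a plain list; how a trailing partial window is handled is an implementation choice the patent does not specify.

```python
def split_into_windows(frames, n):
    """Split consecutive point cloud frames into sliding windows of n frames.

    frames: list of per-frame point arrays; n: frames per window (n >= 2).
    Trailing frames that do not fill a window are kept as a shorter window.
    """
    return [frames[i:i + n] for i in range(0, len(frames), n)]

# e.g. 10 frames with n = 5 gives M = 2 windows of 5 frames each
```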
In the embodiment of the application, the acquired point cloud data is stored as files in the PCD format, which facilitates subsequent data access.
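For illustration, PCD files can be read with a common point cloud library such as Open3D (one possible choice; the file name below is hypothetical):

```python
import numpy as np
import open3d as o3d

cloud = o3d.io.read_point_cloud("frame_0001.pcd")  # hypothetical file name
points = np.asarray(cloud.points)                  # (N, 3) xyz coordinates
```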
In order to acquire point cloud data at different rotation angles, in the embodiment of the application a vehicle on which the IMU and the lidar are deployed drives around a figure-eight track as shown in fig. 4, so that point cloud data over a full 360° of rotation is collected.
In a possible implementation, in order to avoid the influence of noise on the calibration result, the point cloud data detected by the lidar may be filtered to remove the noise; for example, the filtering method may include Kalman filtering, smoothing filtering, and/or a Laplacian operator.
S102, performing feature extraction on the N consecutive frames of point cloud data and determining the feature points corresponding to the N frames.
The feature extraction process is the process of extracting the feature points corresponding to each of the N frames of point cloud data, that is, determining whether the points in each frame lie on an edge line (hereinafter, line points) or on a plane (hereinafter, plane points).
In one possible implementation, performing feature extraction on the N consecutive frames of point cloud data and determining the corresponding feature points includes: dividing each of the N frames of point cloud data into a plurality of first voxel grids according to a first preset resolution; and determining the feature points corresponding to each frame according to the covariance matrix corresponding to each of the first voxel grids.
Determining the feature points corresponding to each frame of point cloud data according to the covariance matrix corresponding to each first voxel grid includes: calculating the eigenvalues of the covariance matrix corresponding to each first voxel grid and determining the feature points corresponding to the point cloud data within it; if the feature points corresponding to the point cloud data in a first voxel grid belong to the same feature, retaining that first voxel grid; if they do not, dividing the first voxel grid into a plurality of second voxel grids according to a second preset resolution and determining the feature points corresponding to the point cloud data in each second voxel grid, until the feature points of the point cloud data in every first voxel grid are determined, the second preset resolution being smaller than the first preset resolution.
In the embodiment of the present application, the feature points corresponding to each of the N frames of point cloud data are extracted with an adaptive voxel grid method; fig. 5 shows a schematic diagram of the effect after feature extraction provided by an embodiment of the present application.
The feature extraction process comprises: for each of the N frames of point cloud data, dividing it into a plurality of first voxel grids according to a first preset resolution, calculating the eigenvalues of the covariance matrix in each first voxel grid, and judging whether the points in each first voxel grid fall on the same edge line or plane; if they do, the current first voxel grid is retained; otherwise, the first voxel grid is divided into a plurality of second voxel grids according to a second preset resolution smaller than the first, and the judgment is repeated on each second voxel grid, until the point cloud data in every voxel grid lie on the same edge line or plane.
For example, if the three-dimensional coordinates of the point cloud data acquired by the lidar are measured in meters, the first preset resolution may be set to 1 m, and a voxel grid with a resolution of 1 m is used to divide each of the N frames of point cloud data.
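The eigenvalue test and the recursive subdivision can be sketched as follows. This is an illustrative reading of the adaptive voxel grid idea, not the patent's code: the eigenvalue-ratio thresholds, the minimum point count, and the stopping resolution are all assumed values.

```python
import numpy as np
from collections import defaultdict

def classify_voxel(points, plane_ratio=0.01, line_ratio=0.1):
    """Classify the points of one voxel as 'plane', 'line', or 'mixed' from the
    eigenvalues of their covariance matrix (ratio thresholds are assumptions)."""
    if len(points) < 5:
        return "mixed"
    cov = np.cov(points.T)
    lam = np.sort(np.linalg.eigvalsh(cov))  # ascending: lam[0] <= lam[1] <= lam[2]
    if lam[1] < line_ratio * lam[2]:        # spread in one direction only: edge line
        return "line"
    if lam[0] < plane_ratio * lam[2]:       # thin in one direction only: plane
        return "plane"
    return "mixed"

def extract_features(points, resolution=1.0, min_resolution=0.25):
    """Adaptive voxel grid: keep voxels whose points share one feature and
    subdivide 'mixed' voxels at a finer resolution (a sketch of the idea)."""
    voxels = defaultdict(list)
    for p in points:
        voxels[tuple(np.floor(p / resolution).astype(int))].append(p)
    features = []
    for pts in voxels.values():
        pts = np.asarray(pts)
        kind = classify_voxel(pts)
        if kind != "mixed":
            features.append((kind, pts.mean(axis=0)))  # mean point per voxel
        elif resolution / 2 >= min_resolution:
            features.extend(extract_features(pts, resolution / 2, min_resolution))
    return features
```

A voxel whose two smallest eigenvalues are both small spreads along one direction only, indicating an edge line; a voxel with a single small eigenvalue spreads in two directions, indicating a plane.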
In the actual design process, if the number of point cloud points in a first voxel grid is greater than a first preset threshold, the mean of the point cloud data in that grid is taken in order to judge whether its points fall on the same edge line or plane.
Likewise, if the number of point cloud points in a second voxel grid is greater than a second preset threshold, the mean of the point cloud data in that grid may be taken to judge whether its points fall on the same edge line or plane.
The first and second preset thresholds may be designed according to actual needs, which is not limited in this application.
In another possible implementation, the acquired point cloud data is divided into M sliding windows, each of which comprises N frames of point cloud data, M ≥ 1 and N ≥ 2; feature extraction is performed on the N frames in each sliding window, and the feature points corresponding to each frame are determined.
S103, projecting the N frames of point cloud data into a first coordinate system according to a preset conversion relation between the IMU and the first coordinate system.
The first coordinate system refers to a fixed coordinate system, and in this embodiment, the first coordinate system may be, for example, a world coordinate system, where the world coordinate system is an absolute coordinate system of an autopilot system where the lidar and the IMU are located.
Fig. 6 is a schematic diagram of the first coordinate system transformation process provided in the embodiment of the present application: the solid-line coordinate system L represents the lidar coordinate system, the dashed-line coordinate system I represents the IMU coordinate system, and the first coordinate system is the world coordinate system W.
Assume N = 5; time P in fig. 6 denotes the time at which the 1st of the 5 frames of point cloud data is acquired, and time O denotes the time at which the 3rd frame is acquired. The projection of the 3rd frame into the world coordinate system W then relies on the following formula (5):

$$T_{L_O}^{L_P} = \left( T_L^I \right)^{-1} \left( T_{I_P}^{W} \right)^{-1} T_{I_O}^{W} \, T_L^I \tag{5}$$

In formula (5), $T_{L_O}^{L_P}$ denotes the conversion matrix from the lidar coordinate system at time O to the lidar coordinate system at time P; $T_L^I$ denotes the transformation matrix between the lidar coordinate system and the IMU coordinate system, i.e., the preset extrinsic parameter matrix; $T_{I_P}^{W}$ denotes the transformation matrix from the IMU coordinate system at time P to the world coordinate system, $\left( T_{I_P}^{W} \right)^{-1}$ its inverse, and $T_{I_O}^{W}$ the transformation matrix from the IMU coordinate system at time O to the world coordinate system. It can be understood that projecting the 3rd frame of point cloud data into the world coordinate system W is essentially projecting each of the 2nd to N-th frames onto the 1st frame and then converting the result into the world coordinate system W.

In practical applications, $T_{I_O}^{W}$ and $T_{I_P}^{W}$, namely the conversion relation between the IMU and the world coordinate system W, take specific values determined from the data collected by the IMU; $T_{L_O}^{L_P}$ can be determined from the point cloud data detected by the lidar; and the preset extrinsic parameter matrix can be set according to the practical application, which is not limited in this application.
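The composition in formula (5) can be sketched with 4×4 homogeneous matrices as follows; this is a reconstruction consistent with the definitions above, and the direction conventions of the transforms and the function name are assumptions.

```python
import numpy as np

def project_frame_to_world(points_LO, T_L_I, T_IO_W, T_IP_W):
    """Project a frame captured at time O into the world frame W via formula (5);
    all transforms are 4x4 homogeneous matrices.

    T_L_I  : preset lidar-to-IMU extrinsic (the quantity being calibrated)
    T_IO_W : IMU pose at time O in the world frame (from the IMU data)
    T_IP_W : IMU pose at time P (first frame of the window) in the world frame
    """
    # Relative lidar motion from O to P, derived through the IMU poses (formula (5)):
    T_LO_LP = np.linalg.inv(T_L_I) @ np.linalg.inv(T_IP_W) @ T_IO_W @ T_L_I
    # Chain back out to the world frame through the pose at time P:
    T_LO_W = T_IP_W @ T_L_I @ T_LO_LP
    points_LO = np.asarray(points_LO)
    pts_h = np.hstack([points_LO, np.ones((len(points_LO), 1))])  # homogeneous
    return (T_LO_W @ pts_h.T).T[:, :3]
```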
After the N consecutive frames of point cloud data have all been converted into the first coordinate system (i.e., the world coordinate system) by formula (5), in order to facilitate query and retrieval of the point cloud data and their corresponding feature points, optionally, a hash table is established from the association between the N frames of point cloud data and their respective feature points.
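In Python terms this association is simply a dict keyed by frame index; a minimal sketch, in which all_frame_features is an assumed list of per-frame feature sets:

```python
# Hash table keyed by frame index, mapping to that frame's feature points;
# a Python dict provides the hash-based association described above.
feature_table = {}
for k, frame_features in enumerate(all_frame_features):  # all_frame_features assumed
    feature_table[k] = frame_features  # e.g. list of ('plane'/'line', point) pairs

features_of_frame_3 = feature_table[3]  # O(1) retrieval by frame index
```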
S104, determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix from the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame among the N frames, where 1 ≤ K < N.
In the embodiment of the application, the sum of distances from the feature points corresponding to the K-th frame of point cloud data to the features corresponding to the 1st frame among the N frames is minimized, in the first coordinate system, with the following formula (6) to obtain the lidar-to-IMU extrinsic parameter matrix, where 1 ≤ K < N:

$$\min \sum_{i} \left( n_i^{T} \left( p_i - q_i \right) \right)^2 \tag{6}$$

In formula (6), $i$ denotes the $i$-th feature point, $q_i$ denotes a feature point on an edge line or plane of the 1st frame of point cloud data in the sliding window, $p_i$ denotes a feature point of the K-th frame of point cloud data in the sliding window, and $n_i^{T}$ is the direction vector of the edge line or the normal vector of the plane.

Suppose the $i$-th feature point is extracted from the $s_i$-th frame, where $i \in \{1, \dots, N\}$ and $s_i \in \{1, \dots, M\}$, and denote the poses corresponding to the point cloud data in the sliding window by $T = (T_1, \dots, T_M)$. The feature point $p_i$ of the K-th frame of point cloud data in the sliding window can then be expressed by the following formula (7):

$$p_i = R_{s_i} \, p_{f_i} + t_{s_i}, \qquad i = 1, \dots, N \tag{7}$$

In formula (7), $p_{f_i}$ ($i \in \{1, \dots, N\}$) denotes a feature point extracted from the sliding window and associated with the same plane or the same edge line; $R_{s_i}$ denotes the rotation matrix and $t_{s_i}$ the translation vector of the pose $T_{s_i}$.

The lidar-to-IMU extrinsic parameter matrix is determined according to formulas (6) and (7).
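A compact sketch of the minimization in formula (6), using SciPy's least-squares solver over a six-degree-of-freedom parameterization. It is an illustration under assumptions the patent does not fix: the feature correspondences (p, q, n) are given, the residual is shown for the plane case, and the rotation is parameterized as a rotation vector.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(x, p, q, n):
    """Signed point-to-plane distances of formula (6), shown for the plane case;
    a point-to-line term would use the perpendicular component instead.

    x: 6-DoF extrinsic (3 rotation-vector + 3 translation components)
    p: (N, 3) feature points of the K-th frame; q: (N, 3) matched points on the
    1st frame's plane or edge; n: (N, 3) plane normals / line directions.
    """
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    return np.einsum("ij,ij->i", n, (p @ R.T + t) - q)

def calibrate(p, q, n, x0=np.zeros(6)):
    """Minimize the sum of squared distances in formula (6); returns (R, t) as
    the optimized extrinsic (a sketch, with correspondences assumed given)."""
    sol = least_squares(residuals, x0, args=(p, q, n))
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```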
In the case where the point cloud data detected by the lidar has been divided into sliding windows, the sum of distances from the K-th frame of point cloud data in each sliding window to the corresponding updated 1st frame is minimized, yielding a lidar-to-IMU extrinsic parameter matrix for each sliding window, where 1 ≤ K < N. It can be understood that, because the deployment positions of the lidar and the IMU on the vehicle do not change, the extrinsic matrices obtained from the individual sliding windows are equal.
Fig. 7 is a schematic diagram of the effect of processing the acquired point cloud data with the unoptimized extrinsic parameter matrix according to the embodiment of the present application, and fig. 8 is a schematic diagram of the effect of processing the same point cloud data with the optimized extrinsic parameter matrix.
According to the extrinsic parameter calibration method from a lidar to an IMU provided by the embodiment of the application, feature extraction is performed on the multiple frames of point cloud data detected by the lidar to obtain their corresponding feature points; the frames are then converted into the same first coordinate system based on the conversion relation between the 1st frame and the other frames; finally, the lidar-to-IMU extrinsic transformation matrix is obtained, in that first coordinate system, from the feature points of each frame and the feature points corresponding to the 1st frame. The method does not fuse the lidar data with the IMU data directly, which avoids inaccurate calibration results caused by the accumulated error of the IMU data; instead, the lidar data is optimized on the basis of the two sensors, providing accurate surrounding environment data for the automatic driving system.
In addition, in the same first coordinate system, the method minimizes the sum of distances from the feature points corresponding to the K-th frame of point cloud data to the features corresponding to the 1st frame to obtain the lidar-to-IMU extrinsic parameter matrix; the matrix obtained is thus an optimized extrinsic transformation matrix, which further guarantees its accuracy.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Corresponding to the extrinsic calibration method from a lidar to an IMU described in the foregoing embodiments, fig. 9 shows a structural block diagram of the extrinsic parameter calibration device from a lidar to an IMU provided in the embodiment of the present application; for convenience of description, only the parts related to the embodiment are shown.
Referring to fig. 9, the extrinsic parameter calibration device 200 from a lidar to an IMU comprises: an acquisition unit 201 for acquiring N consecutive frames of point cloud data detected by the lidar, where N ≥ 2;
a first determining unit 202, configured to perform feature extraction on the N consecutive frames of point cloud data and determine the feature points corresponding to the N frames;
a conversion unit 203, configured to project the N frames of point cloud data into the first coordinate system according to the preset conversion relation between the IMU and the first coordinate system;
and a second determining unit 204, configured to determine, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix from the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame among the N frames, where 1 ≤ K < N.
Optionally, the acquisition unit 201 is further configured to: acquire the point cloud data detected by the lidar; and divide the point cloud data into M sliding windows, each sliding window comprising N frames of point cloud data, where M ≥ 1 and N ≥ 2.
Optionally, the first determining unit 202 is further configured to: divide each of the N frames of point cloud data into a plurality of first voxel grids according to a first preset resolution; and determine the feature points corresponding to each frame of point cloud data according to the covariance matrix corresponding to each of the first voxel grids.
Optionally, determining the feature points corresponding to each frame of point cloud data according to the covariance matrix corresponding to each first voxel grid includes: calculating the eigenvalues of the covariance matrix corresponding to each first voxel grid and determining the feature points corresponding to the point cloud data within it;
if the feature points corresponding to the point cloud data in a first voxel grid belong to the same feature, retaining that first voxel grid;
if they do not, dividing the first voxel grid into a plurality of second voxel grids according to a second preset resolution and determining the feature points corresponding to the point cloud data in each second voxel grid, until the feature points of the point cloud data in every first voxel grid are determined, the second preset resolution being smaller than the first preset resolution.
Optionally, the method further includes: if the number of point cloud points in a first voxel grid is greater than a preset threshold, taking the mean of all point cloud data in that grid and determining the feature points of the point cloud data in each first voxel grid from it.
Optionally, after feature extraction is performed on the N consecutive frames of point cloud data and the corresponding feature points are determined, the method further includes:
establishing a hash table from the association between the N frames of point cloud data and their corresponding feature points.
Optionally, the second determining unit 204 is further configured to: minimize, in the first coordinate system, the sum of distances from the feature points corresponding to the K-th frame of point cloud data to the features corresponding to the 1st frame among the N frames, to obtain the lidar-to-IMU extrinsic parameter matrix, where 1 ≤ K < N.
It should be noted that, since the information interaction and execution processes between the above devices/units are based on the same concept as the method embodiments of the present application, their specific functions and technical effects can be found in the method embodiment section and are not repeated here.
Based on the same inventive concept, the embodiment of the present application further provides a terminal device, where the terminal device 300 is shown in fig. 10.
As shown in fig. 10, the terminal device 300 of this embodiment includes: a processor 301, a memory 302, and a computer program 303 stored in the memory 302 and operable on the processor 301. When executing the computer program 303, the processor 301 implements the steps in the embodiments of the extrinsic parameter calibration method described above; alternatively, the processor 301 implements the functions of the modules/units in the above device embodiments.
Illustratively, the computer program 303 may be partitioned into one or more modules/units, which are stored in the memory 302 and executed by the processor 301 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 303 in the terminal device 300.
Those skilled in the art will appreciate that fig. 10 is merely an example of the terminal device 300 and is not intended to limit the terminal device 300 and may include more or less components than those shown, or some components may be combined, or different components, for example, the terminal device 300 may also include input and output devices, network access devices, buses, etc.
The Processor 301 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 302 may be an internal storage unit of the terminal device 300, such as a hard disk or memory of the terminal device 300. The memory 302 may also be an external storage device of the terminal device 300, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 300. Further, the memory 302 may include both an internal storage unit and an external storage device of the terminal device 300. The memory 302 is used to store the computer program and other programs and data required by the terminal device 300, and may also be used to temporarily store data that has been output or is to be output.
The terminal device provided in this embodiment may execute the method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Embodiments of the present application also provide a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the method of the above-mentioned method embodiments.
The embodiment of the present application further provides a computer program product, which, when running on a terminal device, enables the terminal device to implement the method of the foregoing method embodiment when executed.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable storage medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus or terminal apparatus, a recording medium, computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, or a software distribution medium, such as a USB disk, a removable hard disk, a magnetic disk, or an optical disk.
Reference throughout this application to "one embodiment" or "some embodiments," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In the description of the present application, it is to be understood that the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature.
In addition, in the present application, unless otherwise explicitly specified or limited, terms such as "connected" are to be construed broadly, e.g., as meaning both mechanical and electrical connection; they may denote direct connection or indirect connection through an intermediate medium, communication between two elements, or interaction between two elements. The specific meaning of these terms in the present application can be understood by those skilled in the art according to the specific situation.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. An extrinsic parameter calibration method from a lidar to an IMU, characterized by comprising:
acquiring N consecutive frames of point cloud data detected by the lidar, where N ≥ 2;
performing feature extraction on the N consecutive frames of point cloud data and determining the feature points corresponding to the N frames of point cloud data;
projecting the N frames of point cloud data into a first coordinate system according to a preset conversion relation between the IMU and the first coordinate system;
and determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix according to the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame of point cloud data among the N frames, where 1 ≤ K < N.
2. The method of claim 1, wherein acquiring the N frames of point cloud data detected by the lidar, N ≥ 2, comprises:
acquiring the point cloud data detected by the lidar;
and dividing the point cloud data into M sliding windows, each sliding window comprising N frames of point cloud data, where M ≥ 1 and N ≥ 2.
3. The method of claim 1, wherein performing feature extraction on the N consecutive frames of point cloud data and determining the corresponding feature points comprises:
dividing each of the N frames of point cloud data into a plurality of first voxel grids according to a first preset resolution;
and determining the feature points corresponding to each frame of point cloud data according to the covariance matrix corresponding to each of the first voxel grids.
4. The method of claim 3, wherein determining the feature points corresponding to each frame of point cloud data according to the covariance matrix corresponding to each of the first voxel grids comprises:
calculating the eigenvalues of the covariance matrix corresponding to each first voxel grid and determining the feature points corresponding to the point cloud data within it;
if the feature points corresponding to the point cloud data in a first voxel grid belong to the same feature, retaining that first voxel grid;
if they do not, dividing the first voxel grid into a plurality of second voxel grids according to a second preset resolution and determining the feature points corresponding to the point cloud data in each second voxel grid, until the feature points of the point cloud data in every first voxel grid are determined, the second preset resolution being smaller than the first preset resolution.
5. The method according to claim 3 or 4, further comprising: if the number of point cloud points in a first voxel grid is greater than a preset threshold, taking the mean of all point cloud data in that grid and determining the feature points of the point cloud data in each first voxel grid from it.
6. The method of claim 1, wherein after performing feature extraction on the N consecutive frames of point cloud data and determining the corresponding feature points, the method further comprises:
establishing a hash table from the association between the N frames of point cloud data and their corresponding feature points.
7. The method of claim 1, wherein determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix according to the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame among the N frames, where 1 ≤ K < N, comprises:
minimizing, in the first coordinate system, the sum of distances from the feature points corresponding to the K-th frame of point cloud data to the features corresponding to the 1st frame among the N frames, to obtain the lidar-to-IMU extrinsic parameter matrix, where 1 ≤ K < N.
8. An extrinsic parameter calibration device from a lidar to an IMU, characterized by comprising:
an acquisition unit for acquiring N consecutive frames of point cloud data detected by the lidar, where N ≥ 2;
a first determining unit for performing feature extraction on the N consecutive frames of point cloud data and determining the feature points corresponding to the N frames;
a conversion unit for projecting the N frames of point cloud data into a first coordinate system according to a preset conversion relation between the IMU and the first coordinate system;
and a second determining unit for determining, in the first coordinate system, the lidar-to-IMU extrinsic parameter matrix according to the feature points corresponding to the K-th frame of point cloud data and the feature points corresponding to the 1st frame among the N frames, where 1 ≤ K < N.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202210498192.8A | 2022-05-09 | 2022-05-09 | Extrinsic parameter calibration method and device from lidar to IMU
Publications (1)

Publication Number | Publication Date
---|---
CN115097419A | 2022-09-23
Family

ID=83287072

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202210498192.8A (pending) | Extrinsic parameter calibration method and device from lidar to IMU | 2022-05-09 | 2022-05-09

Country Status (1)

Country | Link
---|---
CN (1) | CN115097419A (en)
Cited By (4)

Publication Number | Priority Date | Publication Date | Assignee | Title
---|---|---|---|---
CN117292140A | 2023-10-17 | 2023-12-26 | 小米汽车科技有限公司 | Point cloud data processing method and device, vehicle and storage medium
CN117292140B | 2023-10-17 | 2024-04-02 | 小米汽车科技有限公司 | Point cloud data processing method and device, vehicle and storage medium
CN117590362A | 2024-01-19 | 2024-02-23 | 深圳海星智驾科技有限公司 | Multi-laser radar external parameter calibration method, device and equipment
CN117590362B | 2024-01-19 | 2024-04-16 | 深圳海星智驾科技有限公司 | Multi-laser radar external parameter calibration method, device and equipment
Legal Events

Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination