CN113933818A - Method, device, storage medium and program product for calibrating laser radar external parameter


Info

Publication number
CN113933818A
Authority
CN
China
Prior art keywords
radar
data
point cloud
adjacent frames
inertial navigation
Prior art date
Legal status
Pending
Application number
CN202111335620.7A
Other languages
Chinese (zh)
Inventor
丁文东
秦莹莹
杨瀚
高巍
彭亮
万国伟
Current Assignee
Apollo Intelligent Technology Beijing Co Ltd
Original Assignee
Apollo Intelligent Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Apollo Intelligent Technology Beijing Co Ltd filed Critical Apollo Intelligent Technology Beijing Co Ltd
Priority to CN202111335620.7A
Publication of CN113933818A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 17/00
    • G01S 7/497: Means for monitoring or calibrating
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Abstract

The scheme provided by the disclosure estimates the relative pose of an inertial measurement unit between adjacent data frames based on the trajectory of the inertial measurement unit, and estimates the relative pose of a radar between adjacent frames from the radar data. The radar can then be calibrated against the inertial measurement unit based on these two relative poses, so that both the rotation parameters and the translation parameters between the radar and the inertial measurement unit are calibrated through the pose changes, which further improves the calibration success rate.

Description

Method, device, storage medium and program product for calibrating laser radar external parameter
Technical Field
The present disclosure relates to high-precision map and autonomous driving technology within data processing, and more particularly, to a method, apparatus, storage medium, and program product for calibrating laser radar external parameters.
Background
With the development of unmanned driving and intelligent driving technologies, higher requirements are placed on the scale and scene coverage of high-precision maps. When a high-precision map is produced, a laser radar on a vehicle is used to collect point cloud data, and three-dimensional reconstruction is then carried out from the point cloud data to obtain the high-precision map. In the three-dimensional reconstruction process, the accuracy requirement on the external parameters of the vehicle radar is high.
In the prior art, there is an external parameter estimation scheme based on a B-spline continuous trajectory. The scheme uses a B-spline parameterized trajectory, derives the parameterized equation to construct an objective function for estimating the trajectory of the vehicle's IMU (Inertial Measurement Unit), estimates the trajectory of the radar using a radar odometer, and calibrates the radar external parameters based on the IMU trajectory and the radar trajectory.
In this calibration mode, only the rotational external parameter of the radar can be estimated, so the error of the calibration system is large and the success rate of calibration is reduced.
Disclosure of Invention
The disclosure provides a method, equipment, storage medium and program product for calibrating laser radar external parameters, so as to improve the accuracy and success rate of radar external parameter calibration.
According to a first aspect of the present disclosure, a method for calibrating a laser radar external parameter is provided, including:
acquiring radar data acquired by a radar sensor arranged on a vehicle, inertial navigation data acquired by an inertial measurement unit arranged on the vehicle and positioning data acquired by a positioning system arranged on the vehicle;
determining an inertial navigation track of the inertial measurement unit according to the positioning data and the inertial navigation data;
determining the inertial navigation relative pose of the inertial measurement unit between adjacent frames according to the inertial navigation track, and determining the radar relative pose of the radar between adjacent frames according to the radar data;
and determining calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose.
According to a second aspect of the present disclosure, there is provided a calibration apparatus for a lidar external parameter, including:
the system comprises a data acquisition unit, a data acquisition unit and a data processing unit, wherein the data acquisition unit is used for acquiring radar data acquired by a radar sensor arranged on a vehicle, inertial navigation data acquired by an inertial measurement unit arranged on the vehicle and positioning data acquired by a positioning system arranged on the vehicle;
the track determining unit is used for determining an inertial navigation track of the inertial measurement unit according to the positioning data and the inertial navigation data;
the position and pose determining unit is used for determining the inertial navigation relative position and pose of the inertial measuring unit between adjacent frames according to the inertial navigation track and determining the radar relative position and pose of the radar between adjacent frames according to the radar data;
and the calibration unit is used for determining calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of the first aspect.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising: a computer program, stored in a readable storage medium, from which at least one processor of an electronic device can read the computer program, execution of the computer program by the at least one processor causing the electronic device to perform the method of the first aspect.
According to the method, device, storage medium and program product for calibrating the laser radar external parameter provided by the disclosure, the relative pose of the inertial measurement unit between adjacent data frames is estimated based on the trajectory of the inertial measurement unit, and the relative pose of the radar between adjacent frames is estimated from the radar data. The radar can therefore be calibrated against the inertial measurement unit based on these two relative poses, so that both the rotation parameters and the translation parameters between the radar and the inertial measurement unit are calibrated through the pose changes, which improves the calibration success rate.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a schematic illustration of a vehicle shown in an exemplary embodiment;
FIG. 2 is a schematic flow chart illustrating a method for calibrating a lidar external parameter according to an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic flow chart diagram illustrating a method for calibrating a lidar external parameter according to another exemplary embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of a calibration apparatus for a lidar external parameter according to an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of a calibration apparatus for a lidar external reference according to another exemplary embodiment of the present disclosure;
FIG. 6 is a block diagram of an electronic device used to implement methods of embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
FIG. 1 is a schematic illustration of a vehicle, shown in an exemplary embodiment.
The vehicle is provided with a radar sensor 11 and an inertial measurement unit (IMU) 12; the radar sensor 11 has a coordinate system O1X1Y1Z1, and the inertial measurement unit 12 has a coordinate system O2X2Y2Z2. An installation error angle and a position error exist between the laser radar and the IMU, so the three-dimensional coordinates of the same group of marker points measured by the two sensors differ.
When a vehicle provided with a radar and an IMU is used for collecting road data, three-dimensional reconstruction needs to be carried out according to point cloud data collected by the radar, and position information of a three-dimensional reconstruction road environment needs to be determined. A combined positioning approach of lidar and IMU may be employed in determining the position information. Because the coordinate systems of the laser radar and the IMU are not identical, the three-dimensional coordinates of the same group of mark points measured by the two sensors are different. Therefore, in order to meet the positioning accuracy of the positioning system, parameters between the radar and the IMU need to be calibrated.
In the prior art, an external parameter estimation scheme based on a B-spline continuous track exists, the scheme uses a B-spline parameterized track, derives a parameterized equation to construct an objective function to estimate the track of an IMU (inertial measurement Unit) of a vehicle, estimates the track of a radar by using a radar odometer, and calibrates the radar external parameter based on the track of the IMU and the track of the radar. In the calibration mode, only the rotation external parameter of the radar can be estimated, so that the error of a calibration system is large, and the success rate of calibration is reduced.
In order to solve this technical problem, the present disclosure calibrates the radar and the IMU by using the relative pose of the IMU and the relative pose of the radar between adjacent data frames, where a relative pose represents the change of a sensor's pose between two frames of data. Since the radar and the IMU are arranged on the same vehicle, the pose transformations of the two sensors are the same while the vehicle is driving, and the parameters between the radar and the IMU can be calibrated on that basis.
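To make the role of the external parameters concrete, the short sketch below (not taken from the patent; the function name and numeric values are illustrative assumptions) shows how a calibrated rotation R_BL and translation t_BL map a point measured in the radar coordinate system into the IMU coordinate system:

    # Illustrative sketch: applying an extrinsic calibration (R_BL, t_BL) to express a
    # lidar point in the IMU coordinate system: p_imu = R_BL @ p_lidar + t_BL.
    import numpy as np

    def lidar_to_imu(points_lidar, R_BL, t_BL):
        """Transform an Nx3 array of lidar points into the IMU frame."""
        return points_lidar @ R_BL.T + t_BL

    # Assumed example values: lidar yawed 90 degrees and offset from the IMU.
    yaw = np.pi / 2
    R_BL = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                     [np.sin(yaw),  np.cos(yaw), 0.0],
                     [0.0,          0.0,         1.0]])
    t_BL = np.array([1.2, 0.0, 0.8])
    print(lidar_to_imu(np.array([[10.0, 0.0, 0.0]]), R_BL, t_BL))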
Fig. 2 is a schematic flowchart of a method for calibrating a lidar external parameter according to an exemplary embodiment of the disclosure.
As shown in fig. 2, the method for calibrating the external parameter of the laser radar provided by the present disclosure includes:
step 201, acquiring radar data acquired by a radar sensor arranged on a vehicle, inertial navigation data acquired by an inertial measurement unit arranged on the vehicle, and positioning data acquired by a positioning system arranged on the vehicle.
The method provided by the disclosure can be executed by an electronic device with computing capability, and the electronic device can be an in-vehicle device.
Specifically, a radar sensor, an IMU and a positioning system may be provided on the vehicle. Further, the radar sensor can collect point cloud data when the vehicle is running, the IMU can collect inertial navigation data, the Positioning System can collect Positioning data, and the Positioning System can be, for example, a GPS (Global Positioning System).
Further, the radar sensor operates on the principle that a detection signal (laser beam) is transmitted to a target, and then a received signal (target echo) reflected from the target is compared with the transmission signal to obtain point cloud data. The point cloud data may include information about the object, such as parameters of the object's distance, orientation, altitude, speed, attitude, and even shape.
In practical application, the IMU may include three single-axis accelerometers and three single-axis gyroscopes, the accelerometers detecting three-axis acceleration signals, and the gyroscopes detecting angular velocity signals, measuring the angular velocity and acceleration of the object in three-dimensional space, so as to determine the attitude of the object.
Wherein, radar sensor, IMU and positioning system can send the data of gathering for electronic equipment can acquire inertial navigation data, radar data and location data respectively.
In an alternative embodiment, the vehicle may move linearly or in a curve, and a certain amount of data of linear movement and a certain amount of data of curve movement may be screened out from the data collected by the sensors.
And step 202, determining an inertial navigation track of the inertial measurement unit according to the positioning data and the inertial navigation data.
Specifically, the electronic device may determine the inertial navigation trajectory of the IMU using the positioning data and the inertial navigation data. The position of the vehicle at present is recorded in the positioning data, and the posture change condition of the vehicle is recorded in the inertial navigation data, so that the inertial navigation track can be determined by combining the positioning data and the inertial navigation data.
Furthermore, when the positioning system signal is poor under certain specific conditions, the acquired positioning data may also be inaccurate. Therefore, positioning data collected with a good signal can be combined with the inertial navigation data to determine the driving trajectory of the vehicle, and the inertial navigation trajectory of the IMU can then be determined from that driving trajectory. For example, when the positioning system signal at point A is good, the position of the vehicle can be determined from the positioning data at point A; when the signal is poor, the attitude change of the vehicle can be determined from the inertial navigation data, and the vehicle driving trajectory can be generated by combining it with the position of point A.
In practical application, the relative position of the IMU and the positioning system can be calibrated in advance, and an inertial navigation track is generated according to the relative position of the IMU and the positioning system and the vehicle running track. The inertial navigation track is used for representing the position change condition of inertial navigation when the vehicle runs.
And 203, determining the inertial navigation relative pose of the inertial measurement unit between adjacent frames according to the inertial navigation track, and determining the radar relative pose of the radar between adjacent frames according to the radar data.
The inertial navigation track comprises multi-frame data, and the data form a plurality of position information of inertial navigation within a period of time. The electronic equipment can determine the relative pose of inertial navigation between adjacent data frames according to the multi-frame data in the inertial navigation track.
Specifically, pose transformation information of inertial navigation is determined according to position information of the inertial navigation in two continuous frames of inertial navigation data, for example, the second frame of inertial navigation data moves by a distance L in the X-axis direction of the inertial navigation coordinate system compared with the first frame of inertial navigation data.
Furthermore, the electronic equipment can also determine the relative pose of the radar between adjacent frames according to the radar data.
During practical application, the radar data acquired by the electronic equipment comprises multi-frame data, and the electronic equipment can determine the pose transformation information of the radar according to the pose information in the radar data of two continuous frames. For example, the second frame of radar data is shifted by a distance H in the X-axis direction of the radar coordinate system compared to the first frame of radar data.
The inertial navigation data and the radar data can have frame identification information, so that the inertial navigation data and the radar data which are acquired at the same moment can be determined according to the frame identification, and the inertial navigation data and the radar data which are acquired at the same moment are utilized to calibrate parameters. The frame identification information may be, for example, a frame number or may be time information.
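The relative pose used in this step can be illustrated by the small sketch below, which assumes that each frame's absolute pose is available as a 4x4 homogeneous matrix (an assumption for illustration, not a requirement stated by the patent):

    # Sketch: the relative pose between adjacent frames i and j, i.e. the pose of frame j
    # expressed in the coordinate frame of frame i, is inv(T_i) @ T_j.
    import numpy as np

    def relative_pose(T_i, T_j):
        """T_i, T_j: 4x4 homogeneous poses of the same sensor at two adjacent frames."""
        return np.linalg.inv(T_i) @ T_j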
And 204, determining calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose.
Specifically, the electronic device can calibrate the radar according to the inertial navigation relative pose and the radar relative pose, so that the parameters of the radar relative to the IMU are determined.
Further, since the radar and the IMU are both arranged on the vehicle, the relative position between the radar and the IMU does not change, and therefore the calibration parameters of the radar relative to the inertial measurement unit do not change; and because the radar and the IMU are arranged on the same vehicle, the pose changes of the radar and the IMU are the same.
In practical application, during the running process of the vehicle, the attitude changes of the radar and the IMU at two adjacent moments are the same, so that the parameters of the radar relative to the IMU can be determined according to the relative attitude of the radar and the relative attitude of the IMU between adjacent frames.
For example, a first parameter of the radar relative to the IMU may be determined according to the relative pose of the radar and the relative pose of the IMU between a first set of adjacent frames, and a second parameter of the radar relative to the IMU may be determined according to the relative pose of the radar and the relative pose of the IMU between a second set of adjacent frames.
The calibration mode provided by the disclosure avoids a complicated and strict calibration process: the external parameters can be calibrated during the three-dimensional reconstruction process and automatically optimized and compensated using the data acquired for three-dimensional reconstruction, which is favorable for improving the production and acquisition efficiency of high-precision maps.
The method for calibrating the external parameter of the laser radar comprises the following steps: acquiring radar data acquired by a radar sensor arranged on a vehicle, inertial navigation data acquired by an inertial measurement unit arranged on the vehicle and positioning data acquired by a positioning system arranged on the vehicle; determining an inertial navigation track of an inertial measurement unit according to the positioning data and the inertial navigation data; determining the inertial navigation relative pose of the inertial measurement unit between adjacent frames according to the inertial navigation track, and determining the radar relative pose of the radar between adjacent frames according to radar data; and determining calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose. In this embodiment, the electronic device may estimate the relative pose of the IMU between the adjacent data frames based on the IMU trajectory, and estimate the relative pose of the radar between the adjacent frames according to the radar data, so that the radar and the IMU may be calibrated based on the relative pose of the IMU and the relative pose of the radar between the adjacent data frames, and the rotation parameter and the translation parameter between the radar and the IMU may be calibrated through the change of the pose, thereby improving the success rate of calibration.
Fig. 3 is a schematic flowchart of a method for calibrating a lidar external parameter according to another exemplary embodiment of the disclosure.
As shown in fig. 3, the method for calibrating the external parameter of the laser radar provided by the present disclosure includes:
step 301, acquiring initial radar data acquired by a radar sensor, initial inertial navigation data acquired by an inertial measurement unit and initial positioning data acquired by a positioning system when a vehicle runs.
The data collected by the radar sensor during the vehicle running is initial radar data, for example, the data collected by the radar sensor during the vehicle running for 20 minutes can be used as the initial radar data.
Specifically, the data acquired by the IMU during vehicle driving is the initial inertial navigation data, for example, the data acquired by the IMU during the vehicle driving for 20 minutes may be used as the initial inertial navigation data.
Furthermore, the data collected by the positioning system while the vehicle is driving are the initial positioning data; for example, the data collected by the positioning system during a 20-minute drive can be used as the initial positioning data.
During practical application, the electronic equipment can acquire initial radar data, initial inertial navigation data and initial positioning data which are acquired by a radar sensor, an IMU and a positioning system when a vehicle runs.
Step 302, screening out radar data meeting a preset driving state from the initial radar data, screening out inertial navigation data meeting the preset driving state from the initial inertial navigation data, and screening out positioning data meeting the preset driving state from the initial positioning data.
In actual application, the preset running state can be preset. When a vehicle runs on a road, multiple running states may exist, and data collected by sensors in different running states may also be different. Therefore, in order to improve the accuracy of parameter calibration, radar data, inertial navigation data and positioning data meeting the preset driving state can be screened out from the initial radar data, the initial inertial navigation data and the initial positioning data.
The preset driving state may include, for example, a linear motion and a rotational motion. More linear motion and rotary motion data can be screened out from the initial radar data, the initial inertial navigation data and the initial positioning data. However, if the data is excessive, the calculation amount may be too large, and therefore, radar data, inertial navigation data and positioning data of a section of continuous track can be selected, and the section of continuous track can comprise a straight running track and a rotating running track.
The screened radar data, inertial navigation data and positioning data are data in the same time period, for example, data in the time period from time t1 to time t2 in the collected data. The continuous trajectory of this time includes both linear and rotational motion.
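A minimal sketch of such screening is given below, assuming the IMU yaw rate is available per sample; the threshold and window length are illustrative assumptions rather than values from the patent:

    # Sketch: find a continuous time window that contains both straight driving and turning,
    # classified by the absolute yaw rate of the IMU.
    import numpy as np

    def find_mixed_motion_window(timestamps, yaw_rates, window_s=120.0, turn_thresh=0.05):
        t = np.asarray(timestamps)
        w = np.abs(np.asarray(yaw_rates))
        for t0 in t:
            in_win = (t >= t0) & (t < t0 + window_s)
            has_turn = np.any(w[in_win] > turn_thresh)
            has_straight = np.any(w[in_win] <= turn_thresh)
            if in_win.any() and has_turn and has_straight:
                return t0, t0 + window_s   # keep sensor data inside this window
        return None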
And step 303, screening reliable positioning data with the confidence coefficient reaching a preset value from the positioning data, and determining an initial running track of the vehicle according to the reliable positioning data.
Specifically, after the electronic device acquires the positioning data, it can screen out reliable positioning data with a good signal. For example, reliable positioning data can be determined according to the confidence value that the positioning system outputs with the positioning data: if the confidence of a piece of positioning data reaches the preset value, that positioning data can be determined to be reliable positioning data.
Furthermore, the initial running track of the vehicle can be determined according to the reliable positioning data. For example, the driving trajectory of a continuous vehicle may be determined according to the reliable positioning data, and the determined driving trajectory may be used as the initial driving trajectory. For example, the position of the vehicle can be determined according to the reliable positioning data, so as to obtain the running track of the vehicle, and a continuous initial running track can be determined in the running track.
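A minimal sketch of this screening step is shown below, assuming the positioning system reports a confidence value with each fix; the field names and threshold are illustrative assumptions:

    # Sketch: keep only positioning samples whose confidence reaches the preset value, and use
    # their positions as the initial driving trajectory.
    def reliable_trajectory(fixes, confidence_threshold=0.9):
        """fixes: list of dicts like {"t": ..., "xyz": ..., "confidence": ...}."""
        reliable = [f for f in fixes if f["confidence"] >= confidence_threshold]
        return [(f["t"], f["xyz"]) for f in reliable]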
And step 304, determining an inertial navigation track of the inertial measurement unit according to the initial driving track and the inertial navigation data.
In practical application, the inertial navigation data are used for representing the pose state of the IMU, and the IMU is arranged on the vehicle, so that the pose state of the vehicle can be represented by the inertial navigation data. After the positioning information of the vehicle at a time is determined, the following travel track of the vehicle can be estimated based on the pose state of the vehicle after the time.
The initial driving track obtained through reliable positioning data can be considered to be accurate, and therefore the complete driving track of the vehicle can be estimated by combining the initial driving track and inertial navigation data, and the accurate inertial navigation track of the IMU can be determined.
In the embodiment, the electronic device can restore the accurate driving track of the vehicle according to the accurate positioning data and the accurate inertial navigation data, and further estimate the inertial navigation track of the IMU, so that the accurate inertial navigation track can be obtained.
For example, the complete driving track of the vehicle can be determined according to the initial driving track and inertial navigation data; and carrying out deviation processing on the complete running track according to the relative position between the positioning system and the inertial measurement unit to obtain the inertial navigation track of the inertial measurement unit.
The initial driving trajectory can be generated from the positioning data with a good signal, and the complete driving trajectory of the vehicle is then generated from the inertial navigation data for the part after the initial driving trajectory. For example, if the tail end of the initial driving trajectory is position A, the pose changes of the vehicle after position A can be estimated from the inertial navigation data (for instance, the vehicle moving a first distance along the X-axis direction gives its next position), and on this basis the complete driving trajectory of the vehicle can be obtained.
Specifically, the electronic device may perform offset processing on the complete driving trajectory according to the relative position of the positioning system and the IMU to obtain the inertial navigation trajectory. For example, if the IMU is mounted 20 cm away from the positioning system along a first direction, the complete driving trajectory can be shifted by 20 cm along that direction to obtain the inertial navigation trajectory.
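The offset step can be sketched as follows, assuming the driving trajectory is given as per-frame rotations and positions and that the lever arm between the positioning system and the IMU has been calibrated in advance (names are illustrative):

    # Sketch: shift every pose of the complete driving trajectory by the known relative
    # position of the IMU with respect to the positioning system.
    import numpy as np

    def offset_to_imu(trajectory, lever_arm):
        """trajectory: list of (R, t) world poses of the positioning system;
        lever_arm: 3-vector from the positioning system to the IMU in the sensor frame."""
        return [(R, t + R @ lever_arm) for R, t in trajectory]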
In the embodiment, the IMU trajectory can be obtained according to the positioning data and the inertial navigation data with higher confidence coefficient without adopting a parameter derivation mode, the determined inertial navigation trajectory is more accurate, and the calibration parameters determined based on the inertial navigation trajectory are more accurate.
And 305, determining the inertial navigation relative pose of the inertial measurement unit between adjacent frames according to the inertial navigation track.
Step 305 is similar to the implementation of step 203, and is not described again.
Step 306, registering the point cloud data of the adjacent frames to obtain the relative position and posture of the radar between the adjacent frames; the radar data comprises point cloud data.
Further, the radar data acquired by the electronic device includes point cloud data, and the electronic device can determine the change of the relative pose of the radar according to the point cloud data of continuous multiple frames.
Point cloud registration refers to determining a transformation matrix so that one point cloud can be aligned as closely as possible with another point cloud. When the vehicle is driving, the pose of the radar sensor changes, so two consecutive frames of point clouds collected by the radar do not coincide. The pose transformation of the radar between the two frames of data can be determined by registering the two consecutive frames of point cloud data, which yields the relative pose of the radar. The pose change process of the radar can be accurately determined by means of point cloud registration.
In practical application, point cloud registration is an iterative process, and two point clouds are approximately overlapped by continuously iterating and optimizing a transformation matrix. In order to determine a transformation matrix between two continuous frames of point cloud data more quickly, the scheme provided by the disclosure adopts a hierarchical registration mode to perform optimization iteration, so that the registration speed is increased.
And aiming at any two adjacent frames of point cloud data, a hierarchical registration mode can be adopted.
Specifically, the point cloud data of adjacent frames can be subjected to multi-level sampling processing, so that multi-level point cloud sampling data with different accuracies of the adjacent frames are obtained.
The two adjacent frames of point cloud data comprise a first point cloud and a second point cloud. The first point cloud and the second point cloud can be respectively processed in a multi-level mode, and specifically, the down-sampling processing can be carried out according to different resolutions. For example, the first point cloud and the second point cloud may be respectively downsampled according to three levels of resolutions, i.e., low resolution, medium resolution, and high resolution, to obtain a first low point cloud, a first medium point cloud, and a first high point cloud, and a second low point cloud, a second medium point cloud, and a second high point cloud.
For example, the point cloud data may be sampled with a probability of 60% for the low-resolution level, with a probability of 80% for the medium-resolution level, and with a probability of 90% for the high-resolution level.
Furthermore, the electronic device may sequentially register point cloud sampling data of adjacent frames according to a sequence from low-level accuracy to high-level accuracy, and then register point cloud data with the highest accuracy, where the point cloud data with the highest accuracy refers to point cloud data before sampling.
For example, the electronic device may first register the low-resolution point cloud sampling data to obtain a registration result; an initial value may be randomly generated for this lowest level and the registration performed from it. The low-resolution registration result is then used as the initial value when registering the medium-resolution point cloud sampling data, yielding a new registration result. That medium-resolution result is in turn used as the initial value when registering the high-resolution point cloud sampling data. Finally, the point cloud data before sampling is registered using the high-resolution registration result as the initial value, which yields the relative pose of the radar between the adjacent frames.
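One possible realization of this coarse-to-fine registration is sketched below using Open3D's ICP; voxel down-sampling is used here in place of the probabilistic sampling described above, and the voxel sizes and correspondence distance are assumptions:

    # Sketch: hierarchical registration, where each level's result seeds the next and the
    # original (highest-accuracy) point clouds are registered last.
    import numpy as np
    import open3d as o3d

    def hierarchical_icp(source, target, voxel_sizes=(2.0, 1.0, 0.5), max_corr_dist=1.0):
        T = np.eye(4)  # initial value for the coarsest level (could also be randomly generated)
        levels = [(source.voxel_down_sample(v), target.voxel_down_sample(v)) for v in voxel_sizes]
        levels.append((source, target))  # point cloud data before sampling
        for src, tgt in levels:
            result = o3d.pipelines.registration.registration_icp(
                src, tgt, max_corr_dist, T,
                o3d.pipelines.registration.TransformationEstimationPointToPoint())
            T = result.transformation
        return T  # relative pose of the radar between the two adjacent frames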
In practical application, the electronic device may acquire the first point cloud data and the second point cloud data of the same precision level of adjacent frames. The first point cloud data and the second point cloud data may be point cloud sampling data or point cloud data included in the radar data. For example, in the registration process, when the point cloud sampling data of each level are registered, the point cloud sampling data of adjacent frames are obtained, and after the sampling data are sequentially registered, when the point cloud data before sampling are required to be registered, the point cloud data of adjacent frames before sampling are obtained.
The point clouds represented by the first point cloud data and the second point cloud data are three-dimensional. The three-dimensional first point cloud and the three-dimensional second point cloud can be projected onto a two-dimensional plane to obtain two-dimensional point clouds, and the two-dimensional point clouds can then be registered, which improves the registration speed.
Specifically, the first three-dimensional point cloud may be mapped to the two-dimensional plane according to the first point cloud data to obtain a first two-dimensional point cloud, and the second three-dimensional point cloud may be mapped to the two-dimensional plane according to the second point cloud data to obtain a second two-dimensional point cloud. For example, the point cloud may be projected onto a 2D plane using the angle of the three-dimensional point cloud with respect to the center of the radar sensor as coordinates. For example, a cylindrical projection mode may be adopted to perform projection processing on the point cloud.
The point cloud sampling data or the point cloud data of the adjacent frames are then registered according to the positions of the first two-dimensional point cloud and the second two-dimensional point cloud on the two-dimensional plane. When the acquired first point cloud data and second point cloud data are point cloud sampling data, the point cloud sampling data of the adjacent frames can be registered according to the positions of the first and second two-dimensional point clouds on the two-dimensional plane. When the acquired first point cloud data and second point cloud data are the point cloud data, the point cloud data of the adjacent frames can be registered according to the positions of the first and second two-dimensional point clouds on the two-dimensional plane.
Further, an ICP algorithm (Iterative Closest Point algorithm) may be used to register the first two-dimensional Point cloud and the second two-dimensional Point cloud. Specifically, according to the positions of the first two-dimensional point cloud and the second two-dimensional point cloud on the two-dimensional plane, a point corresponding to the first two-dimensional point cloud is determined in the second two-dimensional point cloud, and then a registration result between the first two-dimensional point cloud and the second two-dimensional point cloud is determined according to the corresponding relation. And processing the first two-dimensional point cloud by using the obtained registration result, determining an error based on the processing result and the second two-dimensional point cloud, and adjusting the corresponding relation between points according to the error, thereby obtaining the final registration result of the first two-dimensional point cloud and the second two-dimensional point cloud through multiple iterations.
In this way, the points with the corresponding relationship in the first two-dimensional point cloud and the second two-dimensional point cloud can be determined, and further, the registration result of the point cloud sampling data or the point cloud data of the adjacent frames can be determined according to the position of the two-dimensional point cloud with the corresponding relationship in the three-dimensional space. In particular, offset parameters and rotation parameters can be determined, by means of which the points in the three-dimensional space having a correspondence can be brought into coincidence as much as possible.
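The two-dimensional projection described above can be sketched as follows; this is an assumed formulation that uses the azimuth and elevation angles of each point about the sensor center, in the spirit of the cylindrical projection mentioned earlier:

    # Sketch: map an Nx3 point cloud in the radar frame to Nx2 angular coordinates.
    import numpy as np

    def project_to_2d(points):
        x, y, z = points[:, 0], points[:, 1], points[:, 2]
        azimuth = np.arctan2(y, x)                    # angle around the sensor axis
        elevation = np.arctan2(z, np.hypot(x, y))     # angle above the horizontal plane
        return np.stack([azimuth, elevation], axis=1)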
When the point cloud data of the adjacent frames are registered, the registration processes of different groups of adjacent frames are independent of each other, so the point cloud data of multiple groups of adjacent frames can be registered in parallel using the parallel computing architecture of the graphics card in the electronic equipment, which improves the registration speed.
Specifically, the electronic device may further obtain a local point cloud map constructed according to the point cloud data and/or driving speed information of the vehicle. Therefore, the point cloud data is subjected to motion compensation according to the local point cloud map and/or the driving speed information of the vehicle, and the compensated point cloud data is obtained.
The electronic device can construct a local point cloud map according to the point cloud data acquired by the radar. The radar sensor can collect multi-frame point cloud data in the vehicle driving process, the electronic equipment can process the multi-frame point cloud data, and the point cloud data are accumulated to obtain a local point cloud map.
The point cloud data acquired by the radar sensor may be distorted when the vehicle runs, so that the acquired point cloud data can be compensated according to the local point cloud map to obtain accurate point cloud data.
Furthermore, the electronic equipment can also acquire the running speed of the vehicle, and then motion compensation is carried out on the point cloud data acquired by the radar through the running speed, so that more accurate point cloud data can be obtained.
In practical application, the point cloud data of the adjacent frames can be registered according to the compensated point cloud data. Because the compensated point cloud data is more accurate, the relative pose of the radar obtained by registering the compensated point cloud data is more accurate.
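One simple form of motion compensation using the driving speed is sketched below; it assumes per-point timestamps and approximately constant velocity over a single sweep, which is a common simplification and not necessarily the patent's exact procedure:

    # Sketch: de-skew a lidar frame by shifting each point by the sensor displacement between
    # its capture time and the frame reference time.
    import numpy as np

    def deskew_constant_velocity(points, point_times, frame_time, velocity):
        """points: Nx3; point_times: N timestamps (s); velocity: 3-vector in the radar frame."""
        dt = (frame_time - np.asarray(point_times))[:, None]
        return points - dt * np.asarray(velocity)[None, :]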
307, determining calibration parameters of the radar corresponding to the adjacent frames relative to the inertial measurement unit according to the radar rotation parameters and the radar translation parameters between the adjacent frames and the inertial navigation rotation parameters and the inertial navigation translation parameters between the adjacent frames; the relative position and pose of the radar comprises a radar rotation parameter and a radar translation parameter; the inertial navigation relative pose comprises inertial navigation rotation parameters and inertial navigation translation parameters.
The radar relative pose obtained through point cloud registration comprises a radar rotation parameter $R^{L}_{ij}$ and a radar translation parameter $t^{L}_{ij}$, and the inertial navigation relative pose determined from the inertial navigation trajectory comprises an inertial navigation rotation parameter $R^{B}_{ij}$ and an inertial navigation translation parameter $t^{B}_{ij}$.
In particular, $R^{L}_{ij}$ represents the rotation of the radar sensor at the j-th frame of data relative to the i-th frame of data; $t^{L}_{ij}$ represents the translation of the radar sensor at the j-th frame relative to the i-th frame; $R^{B}_{ij}$ represents the rotation of the IMU at the j-th frame relative to the i-th frame; and $t^{B}_{ij}$ represents the translation of the IMU at the j-th frame relative to the i-th frame.
Further, at the times corresponding to the adjacent frames, the pose changes of the radar sensor and the IMU should be the same, so the following constraint (the standard hand-eye form of this relationship) can be constructed from the pose changes to determine the calibration parameters $R_{BL}$ and $t_{BL}$ of the radar relative to the IMU for the adjacent frames. $R_{BL}$ refers to the rotation parameter of the radar relative to the inertial measurement unit, and $t_{BL}$ refers to the translation parameter of the radar relative to the inertial measurement unit.
$R^{B}_{ij} R_{BL} = R_{BL} R^{L}_{ij}$, and $R^{B}_{ij} t_{BL} + t^{B}_{ij} = R_{BL} t^{L}_{ij} + t_{BL}$
In practical application, $R_{BL}$ and $t_{BL}$ corresponding to each group of adjacent frames can be determined based on the above formula.
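One standard way to solve this constraint for $R_{BL}$ and $t_{BL}$ is sketched below; it stacks all adjacent-frame pairs into a single least-squares problem (a quaternion null-space solve for the rotation, then a linear solve for the translation), which is a closely related alternative to determining per-pair parameters and fitting them afterwards. The function and variable names are illustrative assumptions:

    # Sketch: solve R_B R_BL = R_BL R_L and (R_B - I) t_BL = R_BL t_L - t_B in least squares.
    import numpy as np
    from scipy.spatial.transform import Rotation

    def _quat_wxyz(R_mat):
        x, y, z, w = Rotation.from_matrix(R_mat).as_quat()
        q = np.array([w, x, y, z])
        return q if q[0] >= 0 else -q   # keep w >= 0 so the sign convention is consistent

    def _left(q):   # matrix of left quaternion multiplication, q = [w, x, y, z]
        w, x, y, z = q
        return np.array([[w, -x, -y, -z], [x, w, -z, y], [y, z, w, -x], [z, -y, x, w]])

    def _right(q):  # matrix of right quaternion multiplication
        w, x, y, z = q
        return np.array([[w, -x, -y, -z], [x, w, z, -y], [y, -z, w, x], [z, y, -x, w]])

    def calibrate_extrinsics(imu_rel, radar_rel):
        """imu_rel, radar_rel: lists of (R, t) relative poses between the same adjacent frames."""
        A = np.vstack([_left(_quat_wxyz(Rb)) - _right(_quat_wxyz(Rl))
                       for (Rb, _), (Rl, _) in zip(imu_rel, radar_rel)])
        q = np.linalg.svd(A)[2][-1]               # null-space vector -> rotation quaternion
        w, x, y, z = q / np.linalg.norm(q)
        R_BL = Rotation.from_quat([x, y, z, w]).as_matrix()
        M = np.vstack([Rb - np.eye(3) for (Rb, _), _ in zip(imu_rel, radar_rel)])
        b = np.concatenate([R_BL @ tl - tb for (_, tb), (_, tl) in zip(imu_rel, radar_rel)])
        t_BL = np.linalg.lstsq(M, b, rcond=None)[0]
        return R_BL, t_BL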
308, fitting calibration parameters of the radar corresponding to the multiple groups of adjacent frames relative to the inertial measurement unit to obtain the calibration parameters of the radar relative to the inertial measurement unit; the calibration parameters comprise rotation parameters and translation parameters of the radar relative to the inertial measurement unit.
In practical application, calibration parameters of the radar relative to the IMU can be determined for each group of adjacent frames, and the calibration parameters can be fitted to obtain final calibration parameters of the radar relative to the IMU.
The rotation parameters in the per-pair calibration parameters can be fitted to obtain the final rotation parameter $R_{BL}$ of the radar relative to the IMU, and the translation parameters can be fitted to obtain the final translation parameter $t_{BL}$ of the radar relative to the IMU.
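The patent does not specify the fitting method; one simple sketch, under the assumption that averaging is acceptable, is to fuse the rotations via a quaternion average and the translations via the arithmetic mean:

    # Sketch: fuse per-adjacent-frame calibration estimates into a single (R_BL, t_BL).
    import numpy as np
    from scipy.spatial.transform import Rotation

    def fit_calibration(per_pair_R, per_pair_t):
        Q = np.array([Rotation.from_matrix(R).as_quat() for R in per_pair_R])  # rows: [x, y, z, w]
        _, vecs = np.linalg.eigh(Q.T @ Q)        # sum of outer products is sign-invariant
        R_BL = Rotation.from_quat(vecs[:, -1]).as_matrix()   # largest-eigenvalue eigenvector
        t_BL = np.mean(np.asarray(per_pair_t), axis=0)
        return R_BL, t_BL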
In the embodiment, the radar and the IMU can be calibrated based on the relative pose of the IMU and the relative pose of the radar between the adjacent data frames, and the rotation parameters and the translation parameters between the radar and the IMU can be calibrated through the change of the poses, so that the calibration success rate can be improved.
In an alternative embodiment, a plurality of radar sensors may be provided in the vehicle, and in this embodiment, each radar sensor may be calibrated according to the above method to obtain a calibration parameter of each radar sensor compared to the IMU, and the final calibration parameter of each radar sensor may be optimized according to the calibration parameters of each radar sensor and a plurality of groups of adjacent frames.
Step 309, determining the relative pose of the first radar corresponding to the adjacent frame with respect to the second radar according to the calibration parameter of the first radar corresponding to the adjacent frame with respect to the inertial measurement unit and the calibration parameter of the second radar corresponding to the adjacent frame with respect to the inertial measurement unit.
Based on the step 308, the calibration parameters of the radar corresponding to each group of adjacent frames relative to the inertial measurement unit can be obtained, and the relative poses of the two radar sensors can be determined according to the calibration parameters of the adjacent frames of any two of the multiple radar sensors. The first radar and the second radar are referred to herein as radar sensors.
Specifically, for example, the data of the i-th frame and the j-th frame may be used to determine the calibration parameters $R^{M}_{BL}$ and $t^{M}_{BL}$ of the first radar M relative to the IMU for that pair of frames, and the calibration parameters $R^{N}_{BL}$ and $t^{N}_{BL}$ of the second radar N relative to the IMU for the same pair of frames. Based on these parameters, the relative pose $T^{MN}_{ij}$ of the first radar M relative to the second radar N corresponding to the i-th and j-th frames can be determined.
Further, for each group of adjacent frames, the relative pose of the first radar M compared with the second radar N can be determined, and since the installation positions of the first radar M and the second radar N on the vehicle are fixed, the relative poses of the first radar M and the second radar N are also fixed, the obtained radar calibration parameters can be optimized by using the relative poses of the first radar M compared with the second radar N corresponding to each group of adjacent frames. And then obtaining accurate calibration parameters of the radar.
And 310, optimizing calibration parameters of the first radar and calibration parameters of the second radar according to the relative poses of the first radar and the second radar corresponding to the adjacent frames.
The calibration parameters of the first radar and the calibration parameters of the second radar may be calibrated based on the following equation:
$e_{ij} = \| T_{BN}^{-1} T_{BM} - T^{MN}_{ij} \|$
wherein $T_{BN}$ denotes the calibration parameters of the second radar N, $T_{BM}$ denotes the calibration parameters of the first radar M, and $e_{ij}$ is the minimum error term for the adjacent frames i and j, which measures the discrepancy between the radar-to-radar transform implied by the two calibrations and the relative pose of the first radar M relative to the second radar N determined for those frames (under a consistent convention for that relative pose).
For each group of adjacent frames, a corresponding minimum error term may be determined, and the minimum error term corresponding to each adjacent frame may meet the requirement by adjusting the calibration parameters of the first radar M and the second radar N, for example, when the minimum error term corresponding to each group of adjacent frames is less than a threshold, the optimized calibration parameters of the first radar M and the second radar N may be obtained.
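The consistency being enforced can be sketched as follows, assuming all transforms are expressed as 4x4 homogeneous matrices and that $T_{BM}$ and $T_{BN}$ map radar coordinates into IMU coordinates; the conventions and the Frobenius-norm error are assumptions:

    # Sketch: the radar-to-radar transform implied by the two calibrations should match the
    # per-pair relative pose of radar M with respect to radar N; the total discrepancy is the
    # quantity driven below a threshold by adjusting T_BM and T_BN.
    import numpy as np

    def pair_error(T_BM, T_BN, T_MN_ij):
        return np.linalg.norm(np.linalg.inv(T_BN) @ T_BM - T_MN_ij)

    def total_error(T_BM, T_BN, per_pair_T_MN):
        return sum(pair_error(T_BM, T_BN, T) for T in per_pair_T_MN)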
After step 308 or 310, it may further include:
and 311, determining the relative pose of the radar between the adjacent frames after registration and the state of the radar to be optimized in each frame according to the calibration parameters of the radar corresponding to the adjacent frames relative to the inertial measurement unit.
Steps 312, 313 can be performed for each radar, optimizing the state to be optimized of the radar.
Based on step 308, calibration parameters of the radar corresponding to the adjacent frames relative to the inertial measurement unit can be determined, and the relative pose of the radar between the adjacent frames after registration can be determined according to the calibration parameters.
Specifically, the pose of the radar at the ith frame and the pose of the radar at the jth frame can be determined according to the calibration parameter, for example, the pose of the radar at the ith frame can be determined according to the calibration parameter and the inertial navigation data of the ith frame, and the pose of the radar at the jth frame can be determined according to the calibration parameter and the inertial navigation data of the jth frame. At the moment, the calibration parameters of the radar are more accurate parameters, so that the more accurate radar pose can be determined based on the calibration parameters.
Furthermore, the relative pose of the radar between the adjacent frames can then be determined from these registered radar poses, so that a more accurate radar relative pose is obtained.
In practical application, the state to be optimized refers to the pose of the radar relative to a global coordinate system; the global coordinate system may be, for example, a UTM (Universal Transverse Mercator) grid coordinate system. The state to be optimized of the radar in each frame may be determined from the pose state of the radar determined for each frame, expressed relative to the global coordinate system.
For example, the rotation parameters and translation parameters of the radar compared to a global coordinate system.
And step 312, optimizing the state to be optimized according to the relative pose of the radar between the adjacent frames after registration and the calibration parameters of the radar.
Wherein the state to be optimized of the radar can be optimized by minimizing an error term constructed as follows.
Specifically, accurate calibration parameters of the radar can be obtained through the preceding steps, and the pose of the radar relative to the global coordinate system can be optimized by using the calibration parameters.
In particular, $R_c$ and $t_c$ refer to the calibration parameters of any one radar, $R_i$ and $t_i$ refer to the state of that radar to be optimized in the i-th frame, and $R_j$ and $t_j$ refer to the state of the radar to be optimized in the j-th frame. The minimum error term is computed from $R_c$, $t_c$, the states $(R_i, t_i)$ and $(R_j, t_j)$, and the registered relative pose of the radar between the adjacent frames; by optimizing $R_i$, $t_i$, $R_j$ and $t_j$ so that the minimum error term meets the requirement, the optimized $R_i$, $t_i$, $R_j$ and $t_j$ are obtained.
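In the spirit of a pose-graph edge, the per-pair error can be sketched as the mismatch between the relative pose implied by the global states and the registered relative pose; this is a simplified, assumed form rather than the patent's exact residual:

    # Sketch: residual for one pair of adjacent frames; T_i, T_j are the 4x4 global radar poses
    # being optimized and T_ij_reg is the registered radar relative pose between the frames.
    import numpy as np

    def edge_residual(T_i, T_j, T_ij_reg):
        return np.linalg.norm(np.linalg.inv(T_i) @ T_j - T_ij_reg)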
By the implementation mode, the state of the radar can be optimized according to the calibration parameters of the radar, and the accurate state of the radar can be obtained.
Fig. 4 is a schematic structural diagram of a calibration apparatus for a lidar external parameter according to an exemplary embodiment of the present disclosure.
As shown in fig. 4, the calibration apparatus 400 for lidar external parameters provided by the present disclosure includes:
a data acquisition unit 410, configured to acquire radar data acquired by a radar sensor disposed on a vehicle, inertial navigation data acquired by an inertial measurement unit disposed on the vehicle, and positioning data acquired by a positioning system disposed on the vehicle;
a track determining unit 420, configured to determine an inertial navigation track of the inertial measurement unit according to the positioning data and the inertial navigation data;
the pose determining unit 430 is used for determining the inertial navigation relative pose of the inertial measurement unit between adjacent frames according to the inertial navigation track, and determining the radar relative pose of the radar between adjacent frames according to the radar data;
and the calibration unit 440 is configured to determine calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose.
The calibration device for the laser radar external parameter can estimate the relative pose of the IMU between the adjacent data frames based on the IMU track, and estimate the relative pose of the radar between the adjacent data frames according to the radar data, so that the radar and the IMU can be calibrated based on the relative pose of the IMU and the relative pose of the radar between the adjacent data frames, the rotation parameter and the translation parameter between the radar and the IMU can be calibrated through the pose change, and the calibration success rate can be improved.
Fig. 5 is a schematic structural diagram of a calibration apparatus for a lidar external parameter according to another exemplary embodiment of the present disclosure.
As shown in fig. 5, in a calibration apparatus 500 of a lidar external reference provided by the present disclosure, a data acquisition unit 510 is similar to the data acquisition unit 410 shown in fig. 4, a trajectory determination unit 520 is similar to the trajectory determination unit 420 shown in fig. 4, a pose determination unit 530 is similar to the pose determination unit 430 shown in fig. 4, and a calibration unit 540 is similar to the calibration unit 440 shown in fig. 4.
In an alternative embodiment, the radar data includes point cloud data;
the pose determination unit 530 comprises a radar pose determination module 531 for comprising:
and registering the point cloud data of the adjacent frames to obtain the relative position and posture of the radar between the adjacent frames.
In an optional implementation, the radar pose determination module 531 includes:
the sampling submodule 5311 is configured to perform multi-level sampling processing on the point cloud data of adjacent frames to obtain multi-level point cloud sampling data of different accuracies of the adjacent frames;
a hierarchical registration submodule 5312, configured to sequentially register the point cloud sampling data and the point cloud data of the adjacent frames according to an order from a low-level accuracy to a high-level accuracy; wherein, the registration result of the low-level point cloud sampling data is used as an initial value when the high-level point cloud sampling data is registered;
and the result of registering the point cloud data with the highest precision is the relative position and posture of the radar between the adjacent frames.
In an optional embodiment, when the hierarchical registration sub-module 5312 registers the point cloud sampling data and the point cloud data of the adjacent frames, it is specifically configured to:
acquiring first point cloud data and second point cloud data of the same precision level of adjacent frames; wherein the first point cloud data and the second point cloud data are point cloud sampling data or the point cloud data included in the radar data;
mapping the first three-dimensional point cloud to a two-dimensional plane according to the first point cloud data to obtain a first two-dimensional point cloud, and mapping the second three-dimensional point cloud to the two-dimensional plane according to the second point cloud data to obtain a second two-dimensional point cloud;
and registering the point cloud sampling data or the point cloud data of the adjacent frames according to the positions of the first two-dimensional point cloud and the second two-dimensional point cloud on a two-dimensional plane.
In an alternative embodiment, the radar pose determination module 531 further comprises a compensation sub-module 5313, configured to, before the point cloud data of the adjacent frames are registered:
acquiring a local point cloud map constructed according to the point cloud data and/or the driving speed of the vehicle;
performing motion compensation on the point cloud data according to the local point cloud map and/or the driving speed of the vehicle to obtain compensated point cloud data;
the radar pose determination module 531 is specifically configured to:
and registering the point cloud data of the adjacent frames according to the compensated point cloud data.
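A simplified view of speed-based motion compensation is sketched below in Python. It assumes each point carries a timestamp relative to the start of the frame and that the vehicle moves with constant velocity and yaw rate during the sweep; both the data layout and the constant-velocity model are assumptions made only for illustration.

```python
import numpy as np

def motion_compensate(points, rel_times, velocity, yaw_rate):
    """points: (N, 3) raw lidar points; rel_times: (N,) seconds since frame start;
    velocity: (3,) vehicle velocity in the radar frame; yaw_rate: rad/s about z.
    Re-expresses every point in the pose the radar had at the start of the frame."""
    compensated = np.empty_like(points)
    for i, (pt, dt) in enumerate(zip(points, rel_times)):
        ang = yaw_rate * dt                            # heading change since frame start
        c, s = np.cos(ang), np.sin(ang)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        t = velocity * dt                              # translation since frame start
        compensated[i] = R @ pt + t                    # point expressed in the start-of-frame pose
    return compensated
```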
In an optional embodiment, the radar pose determination module 531 specifically uses a parallel computing architecture of a graphics card in the electronic device to perform parallel registration on multiple sets of point cloud data of adjacent frames.
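Parallel registration of multiple adjacent-frame pairs on the graphics card can be pictured with a batched formulation. The PyTorch sketch below runs one point-to-point ICP iteration for a whole batch of padded cloud pairs on the GPU; the batching scheme, the padding assumption, and the single-iteration scope are simplifications for illustration, not the disclosed implementation.

```python
import torch

def batched_icp_step(src, dst):
    """One point-to-point ICP iteration for a batch of cloud pairs.
    src, dst: (B, N, 3) GPU tensors, each pair padded to a common point count."""
    d = torch.cdist(src, dst)                             # (B, N, N) pairwise distances
    nn = d.argmin(dim=2)                                  # nearest target index per source point
    matched = torch.gather(dst, 1, nn.unsqueeze(-1).expand(-1, -1, 3))
    mu_s = src.mean(dim=1, keepdim=True)
    mu_m = matched.mean(dim=1, keepdim=True)
    H = (src - mu_s).transpose(1, 2) @ (matched - mu_m)   # (B, 3, 3) cross-covariances
    U, _, Vt = torch.linalg.svd(H)
    R = Vt.transpose(1, 2) @ U.transpose(1, 2)
    sign = torch.linalg.det(R).sign()                     # keep det(R) = +1 (no reflections)
    Vt = Vt.clone()
    Vt[:, -1, :] *= sign.unsqueeze(-1)
    R = Vt.transpose(1, 2) @ U.transpose(1, 2)
    t = mu_m.squeeze(1) - (R @ mu_s.transpose(1, 2)).squeeze(-1)
    return R, t                                           # per-pair rotation and translation update
```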
In an alternative embodiment, the radar relative pose comprises a radar rotation parameter and a radar translation parameter; the inertial navigation relative pose comprises inertial navigation rotation parameters and inertial navigation translation parameters;
the calibration unit 540 includes:
an inter-frame calibration module 541, configured to determine, according to a radar rotation parameter and a radar translation parameter between adjacent frames, and an inertial navigation rotation parameter and an inertial navigation translation parameter between adjacent frames, a calibration parameter of the radar corresponding to the adjacent frame with respect to the inertial measurement unit;
a fitting module 542, configured to fit calibration parameters of the radar relative to the inertial measurement unit corresponding to multiple groups of adjacent frames to obtain calibration parameters of the radar relative to the inertial measurement unit; the calibration parameters comprise rotation parameters and translation parameters of the radar relative to the inertial measurement unit.
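The inter-frame calibration and fitting steps resemble a hand-eye formulation A_i X = X B_i, with A_i the inertial navigation relative pose, B_i the radar relative pose, and X the sought radar-to-IMU transform. The Python sketch below solves the rotation with a quaternion least-squares step and the translation with a linear least-squares fit over all adjacent-frame pairs; it is a generic hand-eye solver written under that assumption, not necessarily the specific fitting of this disclosure.

```python
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def _quat_mats(q):
    """Left and right quaternion-multiplication matrices for q = (w, x, y, z)."""
    w, x, y, z = q
    L = np.array([[w, -x, -y, -z], [x, w, -z, y], [y, z, w, -x], [z, -y, x, w]])
    R = np.array([[w, -x, -y, -z], [x, w, z, -y], [y, -z, w, x], [z, y, -x, w]])
    return L, R

def calibrate_extrinsics(imu_rel_poses, radar_rel_poses):
    """imu_rel_poses, radar_rel_poses: lists of 4x4 relative poses per adjacent-frame pair.
    Solves A_i X = X B_i for the rotation and translation of the radar relative to the IMU."""
    M = []
    for A, B in zip(imu_rel_poses, radar_rel_poses):
        qa = Rot.from_matrix(A[:3, :3]).as_quat()[[3, 0, 1, 2]]  # reorder to (w, x, y, z)
        qb = Rot.from_matrix(B[:3, :3]).as_quat()[[3, 0, 1, 2]]
        La, _ = _quat_mats(qa)
        _, Rb = _quat_mats(qb)
        M.append(La - Rb)                                        # q_A * q_X - q_X * q_B = 0
    _, _, vt = np.linalg.svd(np.vstack(M))
    qx = vt[-1]                                                  # null vector: rotation quaternion
    R_x = Rot.from_quat(qx[[1, 2, 3, 0]]).as_matrix()            # back to scipy's (x, y, z, w) order
    # Translation: stack (R_Ai - I) t_x = R_x t_Bi - t_Ai and solve by least squares.
    C = [A[:3, :3] - np.eye(3) for A in imu_rel_poses]
    d = [R_x @ B[:3, 3] - A[:3, 3] for A, B in zip(imu_rel_poses, radar_rel_poses)]
    t_x, *_ = np.linalg.lstsq(np.vstack(C), np.hstack(d), rcond=None)
    return R_x, t_x
```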
In an alternative embodiment, at least two of the radar sensors are provided on the vehicle;
the calibration unit 540 further includes an optimizing module 543, configured to, after the fitting module 542 fits the calibration parameters of the radar with respect to the inertial measurement unit corresponding to multiple sets of adjacent frames:
determining the relative pose of a first radar corresponding to an adjacent frame relative to a second radar according to the calibration parameters of the first radar corresponding to the adjacent frame relative to the inertial measurement unit and the calibration parameters of the second radar corresponding to the adjacent frame relative to the inertial measurement unit;
and optimizing calibration parameters of the first radar and calibration parameters of the second radar according to the relative poses of the first radar and the second radar corresponding to the adjacent frames.
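The relative pose between two radars follows directly from each radar's calibration with respect to the inertial measurement unit, which is what makes the cross-radar consistency check possible. A minimal numpy sketch, where T_imu_r1 and T_imu_r2 denote assumed 4x4 radar-to-IMU transforms:

```python
import numpy as np

def radar_to_radar_pose(T_imu_r1, T_imu_r2):
    """Pose of the first radar expressed in the second radar's frame,
    derived from each radar's calibration parameters relative to the IMU."""
    return np.linalg.inv(T_imu_r2) @ T_imu_r1
```

Comparing this derived pose against a pose obtained by directly registering the two radars' point clouds yields a residual that can drive the joint refinement of both sets of calibration parameters.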
In an optional implementation, the apparatus further comprises a state optimization unit 550 configured to:
determining the relative pose of the radar between adjacent frames after registration and the state of the radar to be optimized in each frame according to the calibration parameters of the radar corresponding to the adjacent frames relative to the inertial measurement unit;
and optimizing the state to be optimized according to the relative pose of the radar between adjacent frames after registration and the calibration parameters of the radar.
In an alternative embodiment, the data obtaining unit 510 includes:
an initial data obtaining module 511, configured to obtain initial radar data collected by the radar sensor when the vehicle runs, initial inertial navigation data collected by the inertial measurement unit, and initial positioning data collected by a positioning system;
a data screening unit 512, configured to screen out radar data meeting a preset driving state from the initial radar data, screen out inertial navigation data meeting the preset driving state from the initial inertial navigation data, and screen out positioning data meeting the preset driving state from the initial positioning data.
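The "preset driving state" is not spelled out at this point; as one hedged example, a screening step might keep only frames whose speed and yaw rate fall inside preset ranges. The field names and thresholds in the Python sketch below are purely illustrative assumptions.

```python
# Hypothetical screening rule; the 'speed'/'yaw_rate' fields and thresholds are assumed.
def meets_driving_state(frame, min_speed=1.0, max_speed=15.0, max_yaw_rate=0.5):
    """frame: dict with assumed keys 'speed' (m/s) and 'yaw_rate' (rad/s)."""
    return (min_speed <= frame["speed"] <= max_speed
            and abs(frame["yaw_rate"]) <= max_yaw_rate)

def screen_frames(frames):
    """Keep only the frames whose driving state satisfies the preset condition."""
    return [f for f in frames if meets_driving_state(f)]
```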
In an optional implementation, the trajectory determination unit 520 includes:
the initial track determining module 521 is configured to screen reliable positioning data with a confidence reaching a preset value from the positioning data, and to determine an initial driving track of the vehicle according to the reliable positioning data;
an inertial navigation trajectory determining module 522, configured to determine an inertial navigation trajectory of the inertial measurement unit according to the initial driving trajectory and the inertial navigation data.
In an optional implementation, the inertial navigation trajectory determination module 522 is specifically configured to:
determining a complete driving track of the vehicle according to the initial driving track and the inertial navigation data;
and offsetting the complete driving track according to the relative position between the positioning system and the inertial measurement unit to obtain the inertial navigation track of the inertial measurement unit.
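The offset step can be read as compensating the fixed mounting offset between the positioning system and the inertial measurement unit. Assuming each trajectory sample is a 4x4 pose of the positioning system in the world frame and that T_gnss_imu is the known pose of the IMU in the positioning-system frame, a minimal Python sketch (operating on numpy 4x4 matrices):

```python
def trajectory_to_imu(traj_world_gnss, T_gnss_imu):
    """traj_world_gnss: list of 4x4 poses of the positioning system in the world frame.
    T_gnss_imu: assumed 4x4 pose of the inertial measurement unit in the positioning-system frame.
    Returns the inertial navigation trajectory (IMU poses in the world frame)."""
    return [T_w_gnss @ T_gnss_imu for T_w_gnss in traj_world_gnss]
```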
The present disclosure provides a method, an apparatus, a storage medium, and a program product for calibrating a laser radar external parameter, which are applied to high-precision map and automatic driving technologies within data processing, so as to improve the accuracy and success rate of radar external parameter calibration.
In the technical solution of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure, and other handling of the personal information of the users involved all comply with the relevant laws and regulations and do not violate public order and good morals.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
According to an embodiment of the present disclosure, the present disclosure also provides a computer program product comprising: a computer program, stored in a readable storage medium, from which at least one processor of the electronic device can read the computer program, the at least one processor executing the computer program causing the electronic device to perform the solution provided by any of the embodiments described above.
FIG. 6 illustrates a schematic block diagram of an example electronic device 600 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the apparatus 600 includes a computing unit 601, which can perform various appropriate actions and processes according to a computer program stored in a Read-Only Memory (ROM) 602 or a computer program loaded from a storage unit 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the device 600 can also be stored. The computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 601 performs the various methods and processes described above, such as the calibration method of the laser radar external parameter. For example, in some embodiments, the calibration method of the laser radar external parameter may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the calibration method of the laser radar external parameter described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured by any other suitable means (e.g., by means of firmware) to perform the calibration method of the laser radar external parameter.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability in traditional physical hosts and VPS ("Virtual Private Server") services. The server may also be a server of a distributed system, or a server incorporating a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (27)

1. A calibration method of laser radar external parameters comprises the following steps:
acquiring radar data acquired by a radar sensor arranged on a vehicle, inertial navigation data acquired by an inertial measurement unit arranged on the vehicle and positioning data acquired by a positioning system arranged on the vehicle;
determining an inertial navigation track of the inertial measurement unit according to the positioning data and the inertial navigation data;
determining the inertial navigation relative pose of the inertial measurement unit between adjacent frames according to the inertial navigation track, and determining the radar relative pose of the radar between adjacent frames according to the radar data;
and determining calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose.
2. The method of claim 1, wherein the radar data includes point cloud data;
determining the relative pose of the radar between adjacent frames, comprising:
and registering the point cloud data of the adjacent frames to obtain the radar relative pose between the adjacent frames.
3. The method of claim 2, wherein the registering the point cloud data of adjacent frames to obtain the relative radar pose between adjacent frames comprises:
performing multi-level sampling processing on the point cloud data of adjacent frames to obtain multi-level point cloud sampling data with different accuracies of the adjacent frames;
sequentially registering the point cloud sampling data and the point cloud data of the adjacent frames according to the sequence from low-level precision to high-level precision; wherein, the registration result of the low-level point cloud sampling data is used as an initial value when the high-level point cloud sampling data is registered;
and the result of registering the point cloud data with the highest precision is the radar relative pose between the adjacent frames.
4. The method of claim 3, wherein the point cloud sampled data and point cloud data of the adjacent frames are registered by:
acquiring first point cloud data and second point cloud data of the same precision level of adjacent frames; wherein the first point cloud data and the second point cloud data are point cloud sampling data or the point cloud data included in the radar data;
mapping the first three-dimensional point cloud to a two-dimensional plane according to the first point cloud data to obtain a first two-dimensional point cloud, and mapping the second three-dimensional point cloud to the two-dimensional plane according to the second point cloud data to obtain a second two-dimensional point cloud;
and registering the point cloud sampling data or the point cloud data of the adjacent frames according to the positions of the first two-dimensional point cloud and the second two-dimensional point cloud on the two-dimensional plane.
5. The method of claim 2, wherein prior to registering the point cloud data of adjacent frames, further comprising:
acquiring a local point cloud map constructed according to the point cloud data and/or the driving speed of the vehicle;
performing motion compensation on the point cloud data according to the local point cloud map and/or the driving speed of the vehicle to obtain compensated point cloud data;
the registering of the point cloud data of the adjacent frames comprises:
and registering the point cloud data of the adjacent frames according to the compensated point cloud data.
6. The method of any one of claims 2-5, wherein the registration of the sets of point cloud data of adjacent frames is performed in parallel using a parallel computing architecture of a graphics card in the electronic device.
7. The method of any of claims 1-6, wherein the radar relative pose comprises a radar rotation parameter and a radar translation parameter; the inertial navigation relative pose comprises inertial navigation rotation parameters and inertial navigation translation parameters;
determining calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose, wherein the calibration parameters comprise:
according to the radar rotation parameter and the radar translation parameter between adjacent frames and the inertial navigation rotation parameter and the inertial navigation translation parameter between adjacent frames, determining the calibration parameter of the radar corresponding to the adjacent frames relative to the inertial measurement unit;
fitting calibration parameters of the radar corresponding to a plurality of groups of adjacent frames relative to the inertial measurement unit to obtain the calibration parameters of the radar relative to the inertial measurement unit; the calibration parameters comprise rotation parameters and translation parameters of the radar relative to the inertial measurement unit.
8. The method of claim 7, wherein at least two of the radar sensors are disposed on the vehicle;
after the radar corresponding to the multiple groups of adjacent frames is fitted with respect to the calibration parameters of the inertial measurement unit, the method further includes:
determining the relative pose of a first radar corresponding to an adjacent frame relative to a second radar according to the calibration parameters of the first radar corresponding to the adjacent frame relative to the inertial measurement unit and the calibration parameters of the second radar corresponding to the adjacent frame relative to the inertial measurement unit;
and optimizing calibration parameters of the first radar and calibration parameters of the second radar according to the relative poses of the first radar and the second radar corresponding to the adjacent frames.
9. The method of claim 7, further comprising:
determining the relative pose of the radar between adjacent frames after registration and the state of the radar to be optimized in each frame according to the calibration parameters of the radar corresponding to the adjacent frames relative to the inertial measurement unit;
and optimizing the state to be optimized according to the relative pose of the radar between adjacent frames after registration and the calibration parameters of the radar.
10. The method of any one of claims 1-9, wherein the acquiring radar data collected by a radar sensor disposed on a vehicle, inertial navigation data collected by an inertial measurement unit disposed on the vehicle, and positioning data collected by a positioning system disposed on the vehicle comprises:
acquiring initial radar data acquired by the radar sensor, initial inertial navigation data acquired by the inertial measurement unit and initial positioning data acquired by a positioning system when the vehicle runs;
and screening out radar data meeting a preset driving state from the initial radar data, screening out inertial navigation data meeting the preset driving state from the initial inertial navigation data, and screening out positioning data meeting the preset driving state from the initial positioning data.
11. The method according to any one of claims 1-9, wherein said determining an inertial navigation trajectory of said inertial measurement unit from said positioning data, said inertial navigation data, comprises:
screening reliable positioning data with a confidence coefficient reaching a preset value from the positioning data, and determining an initial driving track of the vehicle according to the reliable positioning data;
and determining the inertial navigation track of the inertial measurement unit according to the initial driving track and the inertial navigation data.
12. The method of claim 11, wherein the determining an inertial navigation trajectory of the inertial measurement unit from the initial travel trajectory, the inertial navigation data, comprises:
determining a complete driving track of the vehicle according to the initial driving track and the inertial navigation data;
and offsetting the complete driving track according to the relative position between the positioning system and the inertial measurement unit to obtain an inertial navigation track of the inertial measurement unit.
13. A calibration device for laser radar external parameters comprises:
the system comprises a data acquisition unit, a data acquisition unit and a data processing unit, wherein the data acquisition unit is used for acquiring radar data acquired by a radar sensor arranged on a vehicle, inertial navigation data acquired by an inertial measurement unit arranged on the vehicle and positioning data acquired by a positioning system arranged on the vehicle;
the track determining unit is used for determining an inertial navigation track of the inertial measurement unit according to the positioning data and the inertial navigation data;
the position and pose determining unit is used for determining the inertial navigation relative position and pose of the inertial measuring unit between adjacent frames according to the inertial navigation track and determining the radar relative position and pose of the radar between adjacent frames according to the radar data;
and the calibration unit is used for determining calibration parameters of the radar relative to the inertial measurement unit according to the inertial navigation relative pose and the radar relative pose.
14. The apparatus of claim 13, wherein the radar data comprises point cloud data;
the pose determination unit comprises a radar pose determination module for:
and registering the point cloud data of the adjacent frames to obtain the radar relative pose between the adjacent frames.
15. The apparatus of claim 14, wherein the radar pose determination module comprises:
the sampling submodule is used for carrying out multi-level sampling processing on the point cloud data of the adjacent frames to obtain multi-level point cloud sampling data with different accuracies of the adjacent frames;
the hierarchical registration submodule is used for sequentially registering the point cloud sampling data and the point cloud data of the adjacent frames according to the sequence from low-level precision to high-level precision; wherein, the registration result of the low-level point cloud sampling data is used as an initial value when the high-level point cloud sampling data is registered;
and the result of registering the point cloud data with the highest precision is the radar relative pose between the adjacent frames.
16. The apparatus of claim 15, wherein the hierarchical registration sub-module, when registering the point cloud sample data and the point cloud data of the adjacent frames, is specifically configured to:
acquiring first point cloud data and second point cloud data of the same precision level of adjacent frames; wherein the first point cloud data and the second point cloud data are point cloud sampling data or the point cloud data included in the radar data;
mapping the first three-dimensional point cloud to a two-dimensional plane according to the first point cloud data to obtain a first two-dimensional point cloud, and mapping the second three-dimensional point cloud to the two-dimensional plane according to the second point cloud data to obtain a second two-dimensional point cloud;
and registering the point cloud sampling data or the point cloud data of the adjacent frames according to the positions of the first two-dimensional point cloud and the second two-dimensional point cloud on the two-dimensional plane.
17. The apparatus of claim 14, wherein the radar pose determination module further comprises a compensation sub-module to, prior to registering the point cloud data of adjacent frames:
acquiring a local point cloud map constructed according to the point cloud data and/or the driving speed of the vehicle;
performing motion compensation on the point cloud data according to the local point cloud map and/or the driving speed of the vehicle to obtain compensated point cloud data;
the radar pose determination module is specifically configured to:
and registering the point cloud data of the adjacent frames according to the compensated point cloud data.
18. The apparatus according to any one of claims 14-17, wherein the radar pose determination module is configured to register sets of point cloud data of adjacent frames in parallel, in particular using a parallel computing architecture of a graphics card in the electronic device.
19. The apparatus of any of claims 13-18, wherein the radar relative pose comprises a radar rotation parameter and a radar translation parameter; the inertial navigation relative pose comprises inertial navigation rotation parameters and inertial navigation translation parameters;
the calibration unit comprises:
the inter-frame calibration module is used for determining calibration parameters of the radar corresponding to the adjacent frames relative to the inertial measurement unit according to the radar rotation parameters and the radar translation parameters between the adjacent frames and the inertial navigation rotation parameters and the inertial navigation translation parameters between the adjacent frames;
the fitting module is used for fitting calibration parameters of the radar corresponding to a plurality of groups of adjacent frames relative to the inertial measurement unit to obtain the calibration parameters of the radar relative to the inertial measurement unit; the calibration parameters comprise rotation parameters and translation parameters of the radar relative to the inertial measurement unit.
20. The apparatus of claim 19, wherein at least two of the radar sensors are disposed on the vehicle;
the calibration unit further comprises an optimization module, configured to, after the fitting module fits calibration parameters of the radar relative to the inertial measurement unit corresponding to multiple sets of adjacent frames:
determining the relative pose of a first radar corresponding to an adjacent frame relative to a second radar according to the calibration parameters of the first radar corresponding to the adjacent frame relative to the inertial measurement unit and the calibration parameters of the second radar corresponding to the adjacent frame relative to the inertial measurement unit;
and optimizing calibration parameters of the first radar and calibration parameters of the second radar according to the relative poses of the first radar and the second radar corresponding to the adjacent frames.
21. The apparatus of claim 19, further comprising a state optimization unit to:
determining the relative pose of the radar between adjacent frames after registration and the state of the radar to be optimized in each frame according to the calibration parameters of the radar corresponding to the adjacent frames relative to the inertial measurement unit;
and optimizing the state to be optimized according to the relative pose of the radar between adjacent frames after registration and the calibration parameters of the radar.
22. The apparatus according to any one of claims 13-21, wherein the data acquisition unit comprises:
the initial data acquisition module is used for acquiring initial radar data acquired by the radar sensor, initial inertial navigation data acquired by the inertial measurement unit and initial positioning data acquired by a positioning system when the vehicle runs;
and the data screening unit is used for screening out radar data meeting a preset driving state from the initial radar data, screening out inertial navigation data meeting the preset driving state from the initial inertial navigation data, and screening out positioning data meeting the preset driving state from the initial positioning data.
23. The apparatus of any one of claims 13-21, wherein the trajectory determination unit comprises:
the initial track determining module is used for screening reliable positioning data with confidence coefficient reaching a preset value from the positioning data and determining an initial running track of the vehicle according to the reliable positioning data;
and the inertial navigation track determining module is used for determining the inertial navigation track of the inertial measurement unit according to the initial driving track and the inertial navigation data.
24. The apparatus of claim 23, wherein the inertial navigation trajectory determination module is specifically configured to:
determining a complete driving track of the vehicle according to the initial driving track and the inertial navigation data;
and offsetting the complete driving track according to the relative position between the positioning system and the inertial measurement unit to obtain an inertial navigation track of the inertial measurement unit.
25. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the content of the first and second substances,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-12.
26. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-12.
27. A computer program product comprising a computer program which, when executed by a processor, carries out the steps of the method of any one of claims 1 to 12.
CN202111335620.7A 2021-11-11 2021-11-11 Method, device, storage medium and program product for calibrating laser radar external parameter Pending CN113933818A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111335620.7A CN113933818A (en) 2021-11-11 2021-11-11 Method, device, storage medium and program product for calibrating laser radar external parameter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111335620.7A CN113933818A (en) 2021-11-11 2021-11-11 Method, device, storage medium and program product for calibrating laser radar external parameter

Publications (1)

Publication Number Publication Date
CN113933818A true CN113933818A (en) 2022-01-14

Family

ID=79286569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111335620.7A Pending CN113933818A (en) 2021-11-11 2021-11-11 Method, device, storage medium and program product for calibrating laser radar external parameter

Country Status (1)

Country Link
CN (1) CN113933818A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023143132A1 (en) * 2022-01-29 2023-08-03 北京三快在线科技有限公司 Sensor data calibration
CN114593751A (en) * 2022-03-11 2022-06-07 北京京东乾石科技有限公司 External parameter calibration method, device, medium and electronic equipment
CN115235527A (en) * 2022-07-20 2022-10-25 上海木蚁机器人科技有限公司 Sensor external parameter calibration method and device and electronic equipment
CN115993089A (en) * 2022-11-10 2023-04-21 山东大学 PL-ICP-based online four-steering-wheel AGV internal and external parameter calibration method
CN115993089B (en) * 2022-11-10 2023-08-15 山东大学 PL-ICP-based online four-steering-wheel AGV internal and external parameter calibration method
CN117092625A (en) * 2023-10-10 2023-11-21 北京斯年智驾科技有限公司 External parameter calibration method and system of radar and combined inertial navigation system
CN117092625B (en) * 2023-10-10 2024-01-02 北京斯年智驾科技有限公司 External parameter calibration method and system of radar and combined inertial navigation system

Similar Documents

Publication Publication Date Title
CN109270545B (en) Positioning true value verification method, device, equipment and storage medium
CN109061703B (en) Method, apparatus, device and computer-readable storage medium for positioning
CN113933818A (en) Method, device, storage medium and program product for calibrating laser radar external parameter
CN109435955B (en) Performance evaluation method, device and equipment for automatic driving system and storage medium
CN112113574B (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN114018274B (en) Vehicle positioning method and device and electronic equipment
CN112835085B (en) Method and device for determining vehicle position
CN110187375A (en) A kind of method and device improving positioning accuracy based on SLAM positioning result
CN113655453A (en) Data processing method and device for sensor calibration and automatic driving vehicle
CN112455502B (en) Train positioning method and device based on laser radar
CN114119886A (en) High-precision map point cloud reconstruction method and device, vehicle, equipment and storage medium
CN114323033B (en) Positioning method and equipment based on lane lines and feature points and automatic driving vehicle
CN115164936A (en) Global pose correction method and device for point cloud splicing in high-precision map manufacturing
CN113592951A (en) Method and device for calibrating external parameters of vehicle-road cooperative middle-road side camera and electronic equipment
CN114485698A (en) Intersection guide line generating method and system
CN113580134A (en) Visual positioning method, device, robot, storage medium and program product
CN113554712A (en) Registration method and device of automatic driving vehicle, electronic equipment and vehicle
CN115900697B (en) Object motion trail information processing method, electronic equipment and automatic driving vehicle
CN110388917B (en) Aircraft monocular vision scale estimation method and device, aircraft navigation system and aircraft
CN113495281B (en) Real-time positioning method and device for movable platform
CN115727871A (en) Track quality detection method and device, electronic equipment and storage medium
CN109459769A (en) A kind of autonomic positioning method and system
CN112835086B (en) Method and device for determining vehicle position
CN115560744A (en) Robot, multi-sensor-based three-dimensional mapping method and storage medium
US11859979B2 (en) Delta position and delta attitude aiding of inertial navigation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination