CN112285676A - Laser radar and IMU external reference calibration method and device - Google Patents


Info

Publication number
CN112285676A
Authority
CN
China
Prior art keywords
pose information
imu
laser radar
point cloud
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011136461.3A
Other languages
Chinese (zh)
Other versions
CN112285676B
Inventor
陈骏超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Imotion Automotive Technology Suzhou Co Ltd
Original Assignee
Imotion Automotive Technology Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Imotion Automotive Technology Suzhou Co Ltd
Priority to CN202011136461.3A
Publication of CN112285676A
Application granted
Publication of CN112285676B
Current legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Initial alignment, calibration or starting-up of inertial devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manufacturing & Machinery (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application relates to a method and a device for calibrating the external parameters between a laser radar and an IMU (inertial measurement unit), belonging to the technical field of computers. The method comprises the following steps: acquiring point cloud data collected by the laser radar and inertial measurement data collected by the IMU; performing point cloud matching on the point cloud data to calculate the current laser radar pose information; calculating the current IMU pose information based on the inertial measurement data; aligning the laser radar pose information and the IMU pose information based on timestamps to obtain aligned pose information groups; and determining an external reference calibration value from the aligned pose information groups using the RANSAC algorithm. This solves the problem of poor stability and accuracy that arises when external parameters are calibrated using the hand-eye calibration equation alone: the RANSAC algorithm effectively removes the influence of abnormal poses on the hand-eye calibration result, improving the accuracy and stability of the calibration.

Description

Laser radar and IMU external reference calibration method and device
Technical Field
The application relates to a laser radar and IMU external reference calibration method and device, and belongs to the technical field of computers.
Background
Mapping and positioning are key technologies for realizing automatic driving of unmanned vehicles. In actual vehicle operation, measurement with a single sensor may produce large errors due to the sensor's own limitations. Solutions that fuse a laser radar with an Inertial Measurement Unit (IMU) are currently the mainstream research direction in the industry. Because the different sensors are mounted at different positions on the carrier, and their attitude angles at installation may be inconsistent, calibrating the precise position and attitude-angle deviation between the two sensors (that is, the sensors' external parameters) is an important basis for realizing sensor fusion.
When the external parameters of the laser radar and the IMU are calibrated, the hand-eye calibration equation is typically used: the motion of the laser radar and of the IMU at specific moments is solved separately, a target equation is established using the constraint between the two sensors' poses, and the result is obtained by optimizing the target equation.
However, the obtained sensor odometry information contains a lot of abnormal data. A calibration result obtained by solving directly on such data is unstable and has large errors.
Disclosure of Invention
The application provides a laser radar and IMU external reference calibration method and device, which can solve the problem of poor stability and accuracy that arises when external parameter calibration relies on the hand-eye calibration equation alone. The application provides the following technical scheme:
in a first aspect, a method for calibrating laser radar and IMU external parameters is provided, where the method includes:
acquiring point cloud data acquired by the laser radar and inertial measurement data acquired by the IMU;
calculating the current position and orientation information of the laser radar based on the point cloud data;
calculating current IMU pose information based on the inertial measurement data;
aligning the laser radar pose information and the IMU pose information based on a timestamp to obtain an aligned pose information group;
determining an external reference calibration value based on the aligned set of pose information using the RANSAC algorithm.
Optionally, the determining, using the RANSAC algorithm, an external reference calibration value based on the aligned set of pose information includes:
repeatedly executing the following steps until a preset condition is met: inputting a plurality of groups of aligned pose information into a hand-eye calibration equation to obtain a coarse external parameter estimate; determining, using the RANSAC algorithm and the coarse external parameter estimate, a coarse model score together with the coarse outlier set and coarse inlier set within the plurality of aligned pose information groups; and, when the coarse model score is higher than a score threshold, taking the parameter model corresponding to the coarse external parameter estimate as a candidate parameter model;
repeatedly executing the following steps until a unique parameter model has been screened out of the candidate parameter models, and taking the external parameter estimate corresponding to that model as the calibration value: for the coarse inlier set corresponding to each candidate parameter model, inputting the coarse inlier set into the hand-eye calibration equation again to obtain a fine external parameter estimate; re-determining, using the RANSAC algorithm and the fine external parameter estimate, a fine model score together with the fine outlier set and fine inlier set within the coarse inlier set; and eliminating candidate parameter models that do not meet a model condition based on the fine model score.
Optionally, the preset condition is that the number of executions reaches a preset count, or that the number of candidate parameter models reaches a preset number.
Optionally, the model condition is that, after the candidate parameter models are sorted by fine model score from high to low, the candidate parameter model is ranked in the top N positions, where N = n/2 and n is the number of candidate parameter models.
Optionally, the aligning the lidar pose information and the IMU pose information based on a timestamp to obtain an aligned pose information group includes:
acquiring a target timestamp corresponding to each frame of laser radar pose information;
determining two timestamps adjacent and nearest to the target timestamp from timestamps of the IMU pose information;
and performing interpolation compensation on the IMU pose information according to the differences between the target timestamp and the two determined adjacent IMU timestamps, to obtain the pose information group aligned to the target timestamp.
Optionally, after the acquiring point cloud data acquired by the lidar and inertial measurement data acquired by the IMU, the method further includes:
preprocessing the point cloud data to remove noise;
and performing down-sampling processing on the preprocessed point cloud data to obtain processed point cloud data, wherein the processed point cloud data is used for calculating the laser radar pose information.
Optionally, the calculating current lidar pose information based on the point cloud data includes:
registering the processed point cloud data of adjacent frames by using the Normal Distributions Transform (NDT) algorithm;
and integrating the pose transformations obtained by registration to obtain the laser radar pose information.
Optionally, the calculating current IMU pose information based on the inertial measurement data includes:
and pre-integrating the inertial measurement data to obtain the IMU pose information.
In a second aspect, a lidar and IMU external reference calibration apparatus is provided, the apparatus includes:
the data acquisition module is used for acquiring point cloud data acquired by the laser radar and inertial measurement data acquired by the IMU;
the first calculation module is used for calculating the current laser radar pose information based on the point cloud data;
the second calculation module is used for calculating the current IMU pose information based on the inertial measurement data;
the data alignment module is used for aligning the laser radar pose information and the IMU pose information based on a timestamp to obtain an aligned pose information group;
and the external reference calibration module is used for determining an external reference calibration value based on the aligned pose information group by using the RANSAC algorithm.
The beneficial effects of the present application are as follows: point cloud data collected by the laser radar and inertial measurement data collected by the IMU are acquired; the current laser radar pose information is calculated based on the point cloud data; the current IMU pose information is calculated based on the inertial measurement data; the laser radar pose information and the IMU pose information are aligned based on timestamps to obtain aligned pose information groups; and the external reference calibration value is determined from the aligned pose information groups using the RANSAC algorithm. This solves the problem of poor stability and accuracy that arises when external parameters are calibrated using the hand-eye calibration equation alone: the RANSAC algorithm effectively removes the influence of abnormal poses on the hand-eye calibration result, improving the accuracy and stability of the calibration.
The foregoing is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer and implementable according to the content of the specification, preferred embodiments of the present application are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a flowchart of a lidar and IMU external reference calibration method according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a lidar and IMU external reference calibration method according to another embodiment of the present disclosure;
fig. 3 is a block diagram of a lidar and IMU external reference calibration apparatus according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present application are described in detail below in conjunction with the accompanying drawings and examples. The following examples are intended to illustrate the present application, but not to limit its scope.
First, several terms referred to in the present application will be described.
Random Sample Consensus (RANSAC) algorithm: iteratively estimates the parameters of a mathematical model from a set of sample data containing abnormal data, thereby distinguishing normal data (inliers) from abnormal data (outliers) and obtaining model parameters that fit the data.
The principle of the RANSAC algorithm comprises the following steps, executed in a loop:
1. A subset of the data is chosen randomly and assumed to consist of inliers.
2. A model is fitted to these hypothetical inliers; all unknown model parameters can be computed from them.
3. All other data are tested against the fitted model; any point that fits the estimated model well is also considered an inlier, expanding the inlier set.
4. If enough points are classified as hypothetical inliers, the estimated model is considered reasonable.
5. The model is re-estimated from the full expanded inlier set.
6. The model is evaluated by the error of the inliers with respect to the model.
7. If the current model scores better than the best model so far, it replaces the best model; otherwise it is discarded.
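The seven steps above can be sketched on a toy line-fitting problem. This is purely illustrative and is not the pose model used in this application; the thresholds and sample sizes are arbitrary choices for the sketch.

```python
import numpy as np

def ransac_line(points, n_iters=100, tol=0.1, min_inliers=10, rng=None):
    """Fit y = a*x + b to 2-D points with RANSAC (illustrative sketch)."""
    rng = np.random.default_rng(rng)
    best_model, best_count = None, 0
    for _ in range(n_iters):
        # Step 1: randomly choose a minimal subset, assumed to be inliers.
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:
            continue
        # Step 2: fit the model to the hypothetical inliers.
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        # Step 3: test all other points; close points join the inlier set.
        resid = np.abs(points[:, 1] - (a * points[:, 0] + b))
        inliers = points[resid < tol]
        # Step 4: too few inliers means the hypothesis is rejected.
        if len(inliers) < min_inliers:
            continue
        # Step 5: re-estimate the model from the expanded inlier set.
        a, b = np.polyfit(inliers[:, 0], inliers[:, 1], 1)
        # Steps 6-7: score by inlier count and keep the best model so far.
        if len(inliers) > best_count:
            best_model, best_count = (a, b), len(inliers)
    return best_model, best_count
```

The same sample-score-refit skeleton is what the calibration pipeline below applies, with the hand-eye solution playing the role of the fitted model.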
Inertial Measurement Unit (IMU): a device that measures an object's three-axis attitude angles and acceleration; it can compute the object's motion over a period of time and can be applied to vehicle positioning.
Normal Distributions Transform (NDT) algorithm: a point cloud registration algorithm that applies a statistical model to three-dimensional point clouds, using iterative optimization to determine the best match between two point clouds and to obtain the corresponding pose transformation.
Hand-eye calibration: originally used to calibrate the rigid-body coordinate transform between a camera mounted at the end of a robot arm and the arm's end effector. In the world coordinate system, for the motion at any moment, the end-effector motion A and the camera motion B satisfy the motion constraint AX = XB, and the external parameter X can be obtained by solving this target equation through optimization.
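A minimal numeric illustration of the AX = XB constraint (all transforms below are made-up example values, not calibration results): if two sensors are rigidly mounted with relative extrinsic X, a motion A observed by one sensor corresponds to the motion B = X^-1 A X observed by the other, so AX = XB holds for every motion pair.

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def rot_z(theta):
    """Rotation matrix about the z axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# X: hypothetical extrinsic between the two sensors (example values only).
X = transform(rot_z(0.3), [0.5, 0.1, 0.0])

# A: relative motion seen by sensor 1 (e.g. the lidar) over one step.
A = transform(rot_z(0.2), [1.0, 0.0, 0.0])

# Rigid mounting implies sensor 2 sees B = X^-1 A X,
# so the hand-eye constraint A X = X B holds for every motion pair.
B = np.linalg.inv(X) @ A @ X
assert np.allclose(A @ X, X @ B)
```

In the calibration below, many (A, B) pairs come from the two odometry streams, and X is the unknown being solved for.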
Optionally, each embodiment of the present application takes a control device as the execution subject. The control device may be installed in a vehicle or in a device independent of the vehicle; this embodiment does not limit the implementation of the control device. The control device is communicatively connected to the laser radar and the IMU of the current vehicle, and can acquire the point cloud data collected by the laser radar and the inertial measurement data collected by the IMU.
Fig. 1 is a flowchart of a lidar and IMU external reference calibration method according to an embodiment of the present disclosure. The method at least comprises the following steps:
Step 101: acquire the point cloud data collected by the laser radar and the inertial measurement data collected by the IMU.
The laser radar and the IMU typically collect data at different frequencies; in general, the laser radar's collection frequency is lower than the IMU's. In other words, the IMU samples faster than the laser radar. As a result, the control device does not receive the point cloud data from the laser radar and the inertial measurement data from the IMU at the same moments.
The inertial measurement data includes, but is not limited to, acceleration values and angle values; it may of course include other parameters, which are not listed here.
Step 102: calculate the current laser radar pose information based on the point cloud data.
Optionally, after the point cloud data collected by the laser radar and the inertial measurement data collected by the IMU are acquired, the point cloud data is preprocessed to remove noise and then down-sampled to obtain processed point cloud data, which is used for calculating the laser radar pose information. In this case, calculating the current laser radar pose information based on the point cloud data includes: registering the processed point cloud data of adjacent frames using the NDT algorithm; and integrating the pose transformations obtained by registration to obtain the laser radar pose information.
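The registration step produces one relative pose transform per pair of adjacent frames; the laser radar pose is then obtained by composing (integrating) those transforms. A minimal sketch of that composition step, assuming the relative transforms come from an NDT registration routine not shown here:

```python
import numpy as np

def integrate_poses(relative_transforms):
    """Compose per-frame relative transforms into absolute lidar poses.

    relative_transforms[k] is the 4x4 homogeneous transform from frame k
    to frame k+1 (e.g. as returned by an NDT registration between two
    adjacent scans). Returns the absolute pose of every frame, with
    frame 0 placed at the origin.
    """
    pose = np.eye(4)
    poses = [pose]
    for T in relative_transforms:
        pose = pose @ T   # accumulate motion frame by frame
        poses.append(pose)
    return poses
```

Each returned pose, paired with its frame timestamp, is one sample of the lidar odometry stream used in the alignment step below.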
Step 103: calculate the current IMU pose information based on the inertial measurement data.
Schematically, the inertial measurement data is pre-integrated to obtain IMU pose information.
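As a rough illustration of what integrating inertial measurements involves, the planar dead-reckoning sketch below integrates accelerations and yaw rates into a pose. A real pre-integration scheme additionally handles gravity, sensor biases and noise, all of which are omitted here.

```python
import numpy as np

def integrate_imu(accels, gyros, dt):
    """Naive planar IMU dead reckoning (illustrative only).

    accels: sequence of body-frame accelerations [ax, ay] (m/s^2)
    gyros:  sequence of yaw rates (rad/s)
    dt:     sample period (s)
    Returns the final position, velocity and heading.
    """
    pos, vel, yaw = np.zeros(2), np.zeros(2), 0.0
    for a_body, w in zip(accels, gyros):
        yaw += w * dt
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s], [s, c]])          # body -> world rotation
        a_world = R @ np.asarray(a_body, float)
        pos = pos + vel * dt + 0.5 * a_world * dt**2
        vel = vel + a_world * dt
    return pos, vel, yaw
```

Driving straight at constant acceleration for one second recovers the textbook x = a*t^2/2, which is a quick sanity check on the integration order.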
Both the laser radar pose information and the IMU pose information indicate the position and heading angle of the current vehicle.
In addition, the execution order of step 103 and step 102 is not limited in this embodiment.
Step 104: align the laser radar pose information and the IMU pose information based on timestamps to obtain the aligned pose information groups.
Each aligned pose information group comprises aligned laser radar pose information and aligned IMU pose information.
Schematically, aligning the laser radar pose information and the IMU pose information based on timestamps to obtain the aligned pose information groups includes: acquiring the target timestamp corresponding to each frame of laser radar pose information; determining, from the timestamps of the IMU pose information, the two timestamps adjacent to and nearest the target timestamp; and performing interpolation compensation on the IMU pose information according to the differences between the target timestamp and the two adjacent IMU timestamps, to obtain the pose information group aligned to the target timestamp.
Assume the laser radar pose information at target timestamp t1 is X1. Because the IMU's collection frequency is high, there are two IMU timestamps t0 and t2 nearest to t1: timestamp t0 corresponds to IMU pose information Y1 and timestamp t2 corresponds to IMU pose information Y2, while there may be no IMU pose information at the target timestamp t1 itself. Therefore, in this embodiment, the IMU pose information Y at the target timestamp t1 can be estimated from Y1 and Y2 by linear interpolation. With t0 < t1 < t2, Y is weighted in proportion to the time differences, as expressed by the following equation:
Y = (t2-t1)/(t2-t0)*Y1 + (t1-t0)/(t2-t0)*Y2.
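This interpolation can be sketched as component-wise linear interpolation between the two bracketing IMU poses. A real implementation would interpolate the rotational components with quaternion slerp rather than element-wise, which is glossed over here.

```python
def interpolate_pose(t0, y0, t2, y2, t1):
    """Linearly interpolate the IMU pose at lidar time t1, t0 <= t1 <= t2.

    y0 and y2 are the IMU pose vectors at timestamps t0 and t2; each
    component is interpolated independently (rotations would need slerp).
    """
    w = (t1 - t0) / (t2 - t0)            # 0 at t0, 1 at t2
    return [(1 - w) * a + w * b for a, b in zip(y0, y2)]
```

Note the weighting: as t1 approaches t0 the result approaches y0, matching the equation above.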
and 105, determining a hand-eye calibration equation and an external reference calibration value corresponding to the hand-eye calibration equation based on the aligned pose information set by using a RANSAC algorithm.
Determining the external reference calibration value based on the aligned pose information groups, using the RANSAC algorithm together with the hand-eye calibration equation, comprises the following two stages:
First, repeat the following steps until the preset condition is met: input a plurality of groups of aligned pose information into the hand-eye calibration equation to obtain a coarse external parameter estimate; determine, using the RANSAC algorithm and the coarse external parameter estimate, a coarse model score together with the coarse outlier set and coarse inlier set within the aligned pose information groups; and, when the coarse model score is higher than the score threshold, take the parameter model corresponding to the coarse external parameter estimate as a candidate parameter model.
Second, repeat the following steps until a unique parameter model has been screened out of the candidate parameter models, and take the external parameter estimate corresponding to that model as the calibration value: for the coarse inlier set corresponding to each candidate parameter model, input the coarse inlier set into the hand-eye calibration equation again to obtain a fine external parameter estimate; re-determine, using the RANSAC algorithm and the fine external parameter estimate, a fine model score together with the fine outlier set and fine inlier set within the coarse inlier set; and eliminate candidate parameter models that do not meet the model condition based on the fine model score.
Optionally, the preset condition is that the number of executions reaches a preset count, or that the number of candidate parameter models reaches a preset number.
Illustratively, the model condition is that, after the candidate parameter models are sorted by fine model score from high to low, the candidate parameter model is ranked in the top N positions, where N = n/2 and n is the number of candidate parameter models.
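The repeated halving of the candidate list by fine model score can be sketched generically. The `rescore` callable below is a placeholder standing in for re-running RANSAC on each candidate's inlier set; any scoring function demonstrates the elimination mechanics.

```python
def select_best_model(candidates, rescore):
    """Iteratively halve a candidate list by score until one model remains.

    candidates: list of candidate models
    rescore:    callable returning a (finer) score for a model each round
    """
    while len(candidates) > 1:
        scored = sorted(candidates, key=rescore, reverse=True)
        keep = max(1, len(scored) // 2)   # keep the top half, round down
        candidates = scored[:keep]
    return candidates[0]
```

With n candidates, the loop runs about log2(n) rounds, so only a handful of re-scoring passes are needed even for a large candidate queue.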
To make the laser radar and IMU external reference calibration method provided by the present application easier to understand, the method is described below by way of example. Referring to fig. 2, the method includes the following steps:
and step 21, preprocessing and downsampling the point cloud data of the laser radar to obtain processed point cloud data.
Step 22: register the processed point clouds of adjacent frames using the NDT algorithm, and continuously integrate the pose transformations obtained by registration to output the corresponding laser radar pose information.
Step 23: pre-integrate the IMU's inertial measurement data to obtain the corresponding IMU pose information.
The present embodiment does not limit the execution sequence between step 23 and step 21.
Step 24: time-synchronize the laser radar pose information and the IMU pose information to obtain the aligned pose information groups.
Step 25: randomly select several aligned pose information groups and input them into the hand-eye calibration equation to compute a coarse external parameter estimate.
Step 26: calculate the corresponding model score from the coarse external parameter estimate, and distinguish the coarse inlier set and the coarse outlier set within the aligned pose information groups.
Step 27: if the coarse model score of the parameter model corresponding to the coarse external parameter estimate is higher than the set score threshold, add the parameter model to the candidate queue; repeat steps 25 and 26 until the preset number of iterations has been run or the number of candidate parameter models in the candidate queue reaches the preset number.
Step 28: for each candidate parameter model in the candidate queue, randomly select several groups of pose information from its coarse inlier set and substitute them into the hand-eye calibration equation to obtain a fine external parameter estimate.
Step 29: calculate the fine model score from the fine external parameter estimate, and further distinguish the fine inlier set and the fine outlier set within the coarse inlier set.
Step 291: sort all candidate parameter models by fine model score from high to low; delete the half of the candidate parameter models with the lower fine model scores, and repeat steps 28 and 29 until only one candidate parameter model remains in the queue. The external parameter estimate corresponding to this final parameter model is the external reference calibration value.
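Steps 25 to 27 can be sketched as the following candidate-collection loop. Here `solve_handeye` and `score_model` are placeholders for the hand-eye solver and the RANSAC scoring described above, and the sample size, score threshold and iteration limits are illustrative values, not ones specified by this application.

```python
import random

def coarse_candidates(pose_pairs, solve_handeye, score_model,
                      sample_size=8, score_thresh=0.8,
                      max_iters=200, max_candidates=10, seed=0):
    """Steps 25-27: repeatedly sample aligned pose pairs, solve the
    hand-eye equation for a coarse extrinsic, and keep high scorers.

    pose_pairs:    aligned (lidar pose, IMU pose) groups
    solve_handeye: placeholder solver mapping a sample to an extrinsic
    score_model:   placeholder returning (score, inlier set) for a model
    """
    rng = random.Random(seed)
    candidates = []
    for _ in range(max_iters):
        sample = rng.sample(pose_pairs, sample_size)
        model = solve_handeye(sample)            # coarse extrinsic estimate
        score, inliers = score_model(model, pose_pairs)
        if score > score_thresh:                 # promising model: keep it
            candidates.append((model, score, inliers))
        if len(candidates) >= max_candidates:    # stop condition of step 27
            break
    return candidates
```

Each kept candidate carries its inlier set, which is exactly what steps 28 and 29 re-feed into the hand-eye equation for refinement.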
In summary, the laser radar and IMU external reference calibration method provided by this embodiment acquires point cloud data collected by the laser radar and inertial measurement data collected by the IMU; calculates the current laser radar pose information based on the point cloud data; calculates the current IMU pose information based on the inertial measurement data; aligns the laser radar pose information and the IMU pose information based on timestamps to obtain aligned pose information groups; and determines the external reference calibration value from the aligned pose information groups using the RANSAC algorithm. This solves the problem of poor stability and accuracy that arises when external parameters are calibrated using the hand-eye calibration equation alone: the RANSAC algorithm effectively removes the influence of abnormal poses on the hand-eye calibration result, improving the accuracy and stability of the calibration.
Fig. 3 is a block diagram of a lidar and IMU external reference calibration apparatus according to an embodiment of the present disclosure. The device at least comprises the following modules: a data acquisition module 310, a first calculation module 320, a second calculation module 330, a data alignment module 340, and an external reference calibration module 350.
A data acquisition module 310, configured to acquire point cloud data acquired by the laser radar and inertial measurement data acquired by the IMU;
a first calculation module 320, configured to calculate current lidar pose information based on the point cloud data;
a second calculation module 330, configured to calculate current IMU pose information based on the inertial measurement data;
a data alignment module 340, configured to align the laser radar pose information and the IMU pose information based on a timestamp, to obtain an aligned pose information group;
an extrinsic calibration module 350, configured to determine an extrinsic calibration value based on the aligned pose information set using the RANSAC algorithm.
For relevant details reference is made to the above-described method embodiments.
It should be noted that: in the lidar and IMU external reference calibration device provided in the above embodiment, when the lidar and IMU external reference calibration is performed, only the division of the above functional modules is used for illustration, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the lidar and IMU external reference calibration device is divided into different functional modules to complete all or part of the above described functions. In addition, the lidar and IMU external reference calibration apparatus provided by the above embodiment and the lidar and IMU external reference calibration method embodiment belong to the same concept, and the specific implementation process thereof is detailed in the method embodiment and is not described herein again.
Optionally, the present application further provides a computer-readable storage medium, in which a program is stored, and the program is loaded and executed by a processor to implement the lidar and IMU external reference calibration method according to the foregoing method embodiment.
Optionally, the present application further provides a computer product, which includes a computer-readable storage medium, where a program is stored in the computer-readable storage medium, and the program is loaded and executed by a processor to implement the lidar and IMU external reference calibration method according to the foregoing method embodiment.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments express only several implementations of the present application; although their description is specific and detailed, they should not be construed as limiting the scope of the invention. For a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these all fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (9)

1. A laser radar and IMU external reference calibration method is characterized by comprising the following steps:
acquiring point cloud data acquired by the laser radar and inertial measurement data acquired by the IMU;
calculating the current laser radar position and orientation information based on the point cloud data;
calculating current IMU pose information based on the inertial measurement data;
aligning the laser radar pose information and the IMU pose information based on a timestamp to obtain an aligned pose information group;
determining an external reference calibration value based on the aligned set of pose information using the RANSAC algorithm.
2. The method of claim 1, wherein the determining an extrinsic calibration value based on the set of aligned pose information using the RANSAC algorithm comprises:
repeatedly executing the following steps until a preset condition is met: inputting a plurality of groups of aligned pose information into a hand-eye calibration equation to obtain a coarse external parameter estimate; determining, using the RANSAC algorithm and the coarse external parameter estimate, a coarse model score together with the coarse outlier set and coarse inlier set within the plurality of aligned pose information groups; and, when the coarse model score is higher than a score threshold, taking the parameter model corresponding to the coarse external parameter estimate as a candidate parameter model;
repeatedly executing the following steps until a unique parameter model has been screened out of the candidate parameter models, and taking the external parameter estimate corresponding to that model as the calibration value: for the coarse inlier set corresponding to each candidate parameter model, inputting the coarse inlier set into the hand-eye calibration equation again to obtain a fine external parameter estimate; re-determining, using the RANSAC algorithm and the fine external parameter estimate, a fine model score together with the fine outlier set and fine inlier set within the coarse inlier set; and eliminating candidate parameter models that do not meet a model condition based on the fine model score.
3. The method according to claim 2, wherein the preset condition is that the number of executions reaches a preset count, or that the number of candidate parameter models reaches a preset count.
4. The method according to claim 2, wherein the model condition is that, after the candidate parameter models are sorted from high to low by fine model score, the model ranks within the top n positions, where n = N/2 and N is the number of candidate parameter models.
5. The method of claim 1, wherein aligning the laser radar pose information and the IMU pose information based on timestamps to obtain an aligned pose information group comprises:
acquiring a target timestamp corresponding to each frame of laser radar pose information;
determining, from the timestamps of the IMU pose information, the two timestamps adjacent to and nearest the target timestamp;
and performing interpolation compensation on the IMU pose information according to the differences between the target timestamp and the two determined adjacent IMU timestamps, so as to obtain the pose information group aligned to the target timestamp.
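The interpolation compensation of claim 5 can be sketched in one dimension: locate the two IMU samples whose timestamps bracket the lidar frame's timestamp, then interpolate between them with weights derived from the timestamp differences. The sample rates and values are illustrative; a real implementation would also interpolate orientation (e.g. by SLERP), which is omitted here.

```python
import bisect

def interp_pose(imu_ts, imu_xs, t):
    # Claim 5, reduced to one dimension: find the two IMU samples whose
    # timestamps bracket the lidar timestamp t, then linearly interpolate.
    j = bisect.bisect_right(imu_ts, t)   # first IMU timestamp greater than t
    i = j - 1                            # nearest IMU timestamp at or before t
    w = (t - imu_ts[i]) / (imu_ts[j] - imu_ts[i])  # weight from timestamp diffs
    return (1 - w) * imu_xs[i] + w * imu_xs[j]

# 100 Hz IMU positions; a lidar frame stamped halfway between two IMU samples
x = interp_pose([0.00, 0.01, 0.02], [0.0, 1.0, 2.0], 0.015)
```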
6. The method of claim 1, wherein, after acquiring the point cloud data acquired by the laser radar and the inertial measurement data acquired by the IMU, the method further comprises:
preprocessing the point cloud data to remove noise;
and performing down-sampling processing on the preprocessed point cloud data to obtain processed point cloud data, wherein the processed point cloud data is used for calculating the laser radar pose information.
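One common realization of the down-sampling in claim 6 is a voxel-grid filter: bucket points into cubic cells and keep only the centroid of each occupied cell. The claim does not name a specific filter, so this is one plausible choice; the cell size is illustrative.

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.5):
    # Bucket each 3-D point into a cubic voxel of edge length `voxel`
    # (value illustrative), then replace each voxel's points by their centroid.
    cells = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel) for c in p)   # integer voxel index per axis
        cells[key].append(p)
    return [tuple(sum(axis) / len(pts) for axis in zip(*pts))
            for pts in cells.values()]

pts = [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (1.1, 0.0, 0.0)]
down = voxel_downsample(pts)   # first two points collapse into one centroid
```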
7. The method of claim 6, wherein the calculating current lidar pose information based on the point cloud data comprises:
registering the processed point cloud data of adjacent frames by using the Normal Distributions Transform (NDT) algorithm;
and obtaining the laser radar pose information by integrating the pose transformations obtained from the registration.
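The "integrating the pose transformations" step of claim 7 amounts to chaining each frame-to-frame relative motion from scan registration onto the running absolute pose. A planar (x, y, yaw) sketch is below; a real pipeline composes full SE(3) transforms, and the relative motions here are made-up values standing in for NDT results.

```python
import math

def compose(pose, rel):
    # Chain a relative motion (dx, dy, dyaw), expressed in the current sensor
    # frame, onto an absolute planar pose: rotate the relative translation
    # into the world frame, then add the headings.
    x, y, th = pose
    dx, dy, dth = rel
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

pose = (0.0, 0.0, 0.0)
# two registration results: forward 1 m while turning 90 degrees, then forward 1 m
for rel in [(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0)]:
    pose = compose(pose, rel)
```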
8. The method of claim 1, wherein said calculating current IMU pose information based on said inertial measurement data comprises:
pre-integrating the inertial measurement data to obtain the IMU pose information.
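The pre-integration of claim 8 can be reduced to one axis: accumulate accelerometer samples between two lidar frames into a relative velocity and position change. Real IMU pre-integration also tracks rotation, gravity, and sensor biases, all omitted in this toy sketch; the sample values are illustrative.

```python
def preintegrate(accels, dt):
    # Integrate 1-D accelerometer samples (spacing dt) into the relative
    # position and velocity change accumulated between two lidar frames.
    v, p = 0.0, 0.0
    for a in accels:
        p += v * dt + 0.5 * a * dt * dt   # constant-acceleration position step
        v += a * dt                       # velocity step
    return p, v

# 1 m/s^2 for 1 s sampled at 10 Hz: expect about 0.5 m travelled, 1 m/s reached
p, v = preintegrate([1.0] * 10, 0.1)
```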
9. A laser radar and IMU external parameter calibration device, characterized by comprising:
the data acquisition module is used for acquiring point cloud data acquired by the laser radar and inertial measurement data acquired by the IMU;
the first calculation module is used for calculating the current laser radar pose information based on the point cloud data;
the second calculation module is used for calculating the current IMU pose information based on the inertial measurement data;
the data alignment module is used for aligning the laser radar pose information and the IMU pose information based on a timestamp to obtain an aligned pose information group;
and the external parameter calibration module is used for determining an external parameter calibration value based on the aligned pose information group by using the RANSAC algorithm.
CN202011136461.3A 2020-10-22 2020-10-22 Laser radar and IMU external parameter calibration method and device Active CN112285676B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011136461.3A CN112285676B (en) 2020-10-22 2020-10-22 Laser radar and IMU external parameter calibration method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011136461.3A CN112285676B (en) 2020-10-22 2020-10-22 Laser radar and IMU external parameter calibration method and device

Publications (2)

Publication Number Publication Date
CN112285676A (en) 2021-01-29
CN112285676B (en) 2024-02-09

Family ID: 74423502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011136461.3A Active CN112285676B (en) 2020-10-22 2020-10-22 Laser radar and IMU external parameter calibration method and device

Country Status (1)

Country Link
CN (1) CN112285676B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109143205A (en) * 2018-08-27 2019-01-04 深圳清创新科技有限公司 Integrated transducer external parameters calibration method, apparatus
CN109901139A (en) * 2018-12-28 2019-06-18 文远知行有限公司 Laser radar scaling method, device, equipment and storage medium
CN109949371A (en) * 2019-03-18 2019-06-28 北京智行者科技有限公司 A kind of scaling method for laser radar and camera data
CN110579754A (en) * 2019-10-15 2019-12-17 戴姆勒股份公司 Method for determining external parameters of a lidar and other sensors of a vehicle
CN111207774A (en) * 2020-01-17 2020-05-29 山东大学 Method and system for laser-IMU external reference calibration
CN111443337A (en) * 2020-03-27 2020-07-24 北京航空航天大学 Radar-IMU calibration method based on hand-eye calibration


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WU Xiuzhen; LIU Gang; YU Fengquan; ZHANG Yuanyuan: "GPS-assisted camera external parameter calibration based on monocular vision", Optics and Precision Engineering, no. 08 *
WU Han; LI Wei; DONG Mingli: "RANSAC-based hand-eye calibration method for a portable laser scanning measuring arm", Computer Engineering and Applications, no. 23 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112946681A (en) * 2021-05-17 2021-06-11 知行汽车科技(苏州)有限公司 Laser radar positioning method fusing combined navigation information
CN113484843A (en) * 2021-06-02 2021-10-08 福瑞泰克智能系统有限公司 Method and device for determining external parameters between laser radar and integrated navigation
CN113495281A (en) * 2021-06-21 2021-10-12 杭州飞步科技有限公司 Real-time positioning method and device for movable platform
CN113495281B (en) * 2021-06-21 2023-08-22 杭州飞步科技有限公司 Real-time positioning method and device for movable platform
CN113759349A (en) * 2021-09-22 2021-12-07 阿波罗智能技术(北京)有限公司 Calibration method and device for laser radar and positioning device and automatic driving vehicle
CN113759349B (en) * 2021-09-22 2022-10-04 阿波罗智能技术(北京)有限公司 Calibration method of laser radar and positioning equipment, and autonomous driving vehicle
CN114166219A (en) * 2021-12-01 2022-03-11 珠海一微半导体股份有限公司 Method, chip and robot for correcting inertial navigation error
CN114413887A (en) * 2021-12-24 2022-04-29 北京理工大学前沿技术研究院 Method, equipment and medium for calibrating external parameters of sensor
CN114413887B (en) * 2021-12-24 2024-04-02 北京理工大学前沿技术研究院 Sensor external parameter calibration method, device and medium
WO2023131123A1 (en) * 2022-01-05 2023-07-13 上海三一重机股份有限公司 External parameter calibration method and apparatus for combined navigation device and laser radar
CN116380935A (en) * 2023-06-02 2023-07-04 中南大学 High-speed railway box girder damage detection robot car and damage detection method

Also Published As

Publication number Publication date
CN112285676B (en) 2024-02-09

Similar Documents

Publication Publication Date Title
CN112285676A (en) Laser radar and IMU external reference calibration method and device
CN108332758B (en) Corridor identification method and device for mobile robot
CN109507706B (en) GPS signal loss prediction positioning method
JP7173471B2 (en) 3D position estimation device and program
CN110986968A (en) Method and device for real-time global optimization and error loop judgment in three-dimensional reconstruction
CN108829996B (en) Method and device for obtaining vehicle positioning information
CN111915675A (en) Particle filter point cloud positioning method based on particle drift, and device and system thereof
CN114323033A (en) Positioning method and device based on lane lines and feature points and automatic driving vehicle
CN113310505B (en) External parameter calibration method and device of sensor system and electronic equipment
CN117392241B (en) Sensor calibration method and device in automatic driving and electronic equipment
WO2017141469A1 (en) Position estimation device
CN113759347B (en) Coordinate relation calibration method, device, equipment and medium
CN114061592B (en) Adaptive robust AUV navigation method based on multiple models
CN115727871A (en) Track quality detection method and device, electronic equipment and storage medium
CN115309630A (en) Method, device and equipment for generating automatic driving simulation data and storage medium
CN115655305A (en) External parameter calibration method and device, computing equipment, storage medium and vehicle
CN115560744A (en) Robot, multi-sensor-based three-dimensional mapping method and storage medium
CN112991445B (en) Model training method, gesture prediction method, device, equipment and storage medium
CN111812668B (en) Winding inspection device, positioning method thereof and storage medium
CN113538699A (en) Positioning method, device and equipment based on three-dimensional point cloud and storage medium
CN113465616A (en) Track abnormal point detection method and device, electronic equipment, computer program product and computer readable storage medium
CN114119885A (en) Image feature point matching method, device and system and map construction method and system
CN113670300A (en) Loop detection method and device of SLAM system
CN111982108A (en) Mobile robot positioning method, device, equipment and storage medium
CN114842084B (en) Map construction method and device and mobile detection equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 215123 g2-1901 / 1902 / 2002, No. 88, Jinjihu Avenue, Suzhou Industrial Park, Suzhou City, Jiangsu Province
Applicant after: Zhixing Automotive Technology (Suzhou) Co.,Ltd.
Address before: 215123 g2-1901 / 1902 / 2002, No. 88, Jinjihu Avenue, Suzhou Industrial Park, Suzhou City, Jiangsu Province
Applicant before: IMOTION AUTOMOTIVE TECHNOLOGY (SUZHOU) Co.,Ltd.
GR01 Patent grant