CN112577517A - Multi-element positioning sensor combined calibration method and system - Google Patents

Publication number
CN112577517A
Authority
CN
China
Prior art keywords
camera
point cloud
laser radar
data
positioning sensor
Legal status
Pending
Application number
CN202011267228.9A
Other languages
Chinese (zh)
Inventor
蒋亚妮
郭晋峰
姚明江
涂曙光
Current Assignee
SAIC Volkswagen Automotive Co Ltd
Original Assignee
SAIC Volkswagen Automotive Co Ltd
Application filed by SAIC Volkswagen Automotive Co Ltd filed Critical SAIC Volkswagen Automotive Co Ltd
Priority to CN202011267228.9A
Publication of CN112577517A

Classifications

    • G01C25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 — Initial alignment, calibration or starting-up of inertial devices
    • G01S7/497 — Means for monitoring or calibrating (details of lidar systems according to group G01S17/00)
    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/10028 — Range image; Depth image; 3D point clouds
    • G06T2207/10032 — Satellite or aerial image; Remote sensing
    • G06T2207/10044 — Radar image

Abstract

The invention discloses a joint calibration method for multiple positioning sensors, comprising the following steps. 100: synchronize the time of several positioning sensors installed on a vehicle, the sensors comprising at least a camera, a lidar and an inertial navigation unit. 200: collect first point cloud data from the lidar and inertial measurement unit data from the inertial navigation unit in a first state, in which the vehicle drives at a set constant speed; collect second point cloud data from the lidar and image data from the camera in a second state, in which the vehicle is stationary. 300: calibrate the lidar and the inertial measurement unit from the collected first point cloud data and inertial measurement unit data to obtain the external parameter matrix of the lidar and the inertial measurement unit; calibrate the lidar and the camera from the collected second point cloud data and image data to obtain the transformation matrix from the lidar point cloud data to the camera image data. 400: calibrate the camera and the inertial navigation unit from the external parameter matrix and the transformation matrix.

Description

Multi-element positioning sensor combined calibration method and system
Technical Field
The present invention relates to a calibration method and system, and more particularly to a method and system for calibrating the sensors of an autonomous vehicle.
Background
In recent years, with the rapid development of automatic driving technology and the gradual maturing of automatic driving systems, autonomous vehicles are increasingly likely to enter daily use.
Autonomous driving is a very large system in which positioning is one of the most important functions, and it involves many positioning sensors, such as lidar, cameras and inertial navigation. Each sensor outputs its own information or results, but different sensors generally output information of different formats or types, and users assign them different tasks according to each sensor's applicable scenarios and hardware cost. The camera has outstanding detection and recognition capabilities, but is easily limited by ambient light and cannot acquire accurate three-dimensional information about a target. The lidar offers long detection range, accurate three-dimensional distance information, strong robustness and little sensitivity to the environment, but it is costly and detects environments with rich textures and varied colours poorly.
To obtain better results, researchers typically fuse information from multiple sensors, combining the advantages of each to obtain more accurate and consistent results. Each sensor has an independent coordinate system, and the information or results it outputs are expressed in that coordinate system. Before information from different sensors can be fused, everything must be converted into a single common coordinate system. However, because different sensors have different mounting requirements, they are typically mounted at different locations on the vehicle: the lidar usually on the roof, the camera behind the front windshield, and the IMU at the center of the rear axle. Therefore, before multi-sensor information fusion, the relative positions of the sensors must be obtained; that is, the external parameters between the sensors must be jointly calibrated.
In addition, each sensor delivers positioning results of different accuracy and places different demands on the environment. To compensate for the weaknesses of individual sensors in specific environments and obtain relatively accurate positioning, a fusion positioning algorithm is usually applied to the data acquired by the lidar, the camera and the inertial navigation unit. The first step in implementing such an algorithm is to calibrate the sensors against one another to obtain their relative positions. Various calibration methods for the sensors in fusion algorithms exist in academia and industry, but the results obtained with existing methods are not ideal; the methods proposed so far suit the unmanned-aerial-vehicle field better, and no good scheme yet exists for joint calibration of vehicle sensors.
In view of the above deficiencies in the prior art, the invention aims to provide a novel joint calibration method for multiple positioning sensors. The method can quickly and accurately calibrate the external parameters of the lidar and the inertial measurement unit, its processing speed and calibration accuracy are superior to those of existing methods, and it can be effectively used to calibrate the on-board sensors of current autonomous vehicles.
Disclosure of Invention
One purpose of the invention is to provide a joint calibration method for multiple positioning sensors that quickly and accurately calibrates the external parameters of a lidar and an inertial measurement unit, with processing speed and calibration accuracy superior to existing methods. It can be effectively used to calibrate the on-board sensors of current autonomous vehicles and has good prospects for adoption and application.
In order to achieve the above object, the present invention provides a method for jointly calibrating a multi-element positioning sensor, which comprises the following steps:
100: synchronizing the time of a plurality of positioning sensors installed on a vehicle, wherein the plurality of positioning sensors at least comprise a camera, a laser radar and inertial navigation;
200: acquiring first point cloud data of the laser radar and data of an inertial measurement unit of inertial navigation in a first state that the vehicle runs at a set constant speed; collecting second point cloud data of the laser radar and image data of the camera in a second static state of the vehicle;
300: calibrating the laser radar and the inertial measurement unit based on the collected first point cloud data and the inertial measurement unit data to obtain the external parameter matrix of the laser radar and the inertial measurement unit; calibrating the laser radar and the camera based on the collected second point cloud data and the collected image data to obtain a transformation matrix from the point cloud data of the laser radar to the image data of the camera;
400: and calibrating the camera and the inertial navigation based on the external parameter matrix and the transformation matrix.
Further, in the multi-element positioning sensor joint calibration method of the present invention, in step 100, a GPS module is used to synchronize the time of the positioning sensors.
Further, in the multi-element positioning sensor joint calibration method of the present invention, in step 100, a GPS module outputs GPGGA messages carrying a GPS timestamp together with PPS pulses, which serve as the inputs of the lidar and the camera; when the lidar and the camera start up and receive this time input from the GPS module, their built-in timing modules set their clocks so that both output data stamped with GPS time. A GPS module is also built into the inertial navigation unit so that it outputs data with a GPS timestamp.
Furthermore, in the multi-element positioning sensor joint calibration method, the GPGGA messages with GPS timestamps output by the GPS module inside the inertial navigation unit are used as the input of the lidar, while the PPS pulses with GPS timestamps output by another, external GPS module are used as the input of the camera.
Further, in the multi-element positioning sensor joint calibration method of the present invention, in step 300, the external parameter matrix of the lidar and the inertial measurement unit is obtained from the following equation:
AX=XB
where X is the external parameter matrix to be calibrated and A and B are observation matrices: A is the pose increment of the lidar at time T relative to its initial pose, and B is the pose increment of the inertial measurement unit at time T relative to its initial pose.
Further, in the multi-element positioning sensor joint calibration method of the present invention, in step 300, the external parameter matrix X of the lidar and the inertial measurement unit is solved with a hand-eye calibration method.
Further, in the multi-element positioning sensor joint calibration method of the present invention, in step 200, data are acquired with the lidar and the camera from a rectangular checkerboard serving as the calibration board.
Further, in the multi-element positioning sensor joint calibration method of the present invention, in step 300, the transformation matrix from the lidar point cloud data to the camera image data is obtained by the following steps:
obtaining, from the camera image data, the pixel coordinates of the four vertices of the calibration board and of the corner points where the black and white squares on the board meet;
obtaining, from the second point cloud data, the point cloud plane on which the calibration board lies, together with the laser point cloud coordinates of the plane's four vertices and of all corner points of the board;
solving the pixel coordinates and the laser point cloud coordinates with the EPnP algorithm to obtain the transformation matrix between them;
projecting further image data and point cloud data with the transformation matrix to check whether the point cloud projects correctly into the calibration board region of the image data, and fine-tuning the transformation matrix according to the deviation until the laser point cloud coordinates of all corner points project onto the corresponding image data;
and (3) solving the projection error according to the following formula, and performing optimization solution on the error function to obtain a transformation matrix which minimizes the projection error:
argmin‖KTP_L - P_uv‖²
where K is the internal parameter matrix of the camera, T is the transformation matrix from the lidar to the camera, KTP_L is the projection of the point cloud data onto the pixel coordinates of the image data, and P_uv denotes the pixel coordinates of the calibration board in the image data.
Accordingly, another object of the present invention is to provide a multi-element positioning sensor joint calibration system that can be used to implement the multi-element positioning sensor joint calibration method of the present invention.
To achieve this object, the present invention provides a multi-element positioning sensor joint calibration system comprising a camera, a lidar, an inertial navigation unit and a time synchronization module, wherein the system executes the multi-element positioning sensor joint calibration method of the present invention.
Compared with the prior art, the multi-element positioning sensor combined calibration method and the system have the following advantages and beneficial effects:
the multi-element positioning sensor combined calibration method is different from the prior open source method, and can realize the calibration of the laser radar and the camera by only using a checkerboard which is simply used for calibrating the camera. The method does not need to modify the checkerboard, and can carry out automatic online calibration through an algorithm by collecting a small amount of camera image data and laser point cloud data to calculate a calibration result. And finally, calculating to obtain the position relation between the camera and the inertial navigation.
In addition, in some embodiments, the multi-element positioning sensor joint calibration method of the invention borrows the idea of robotic hand-eye calibration to quickly and accurately calibrate the external parameters between the lidar and the inertial measurement unit.
The multi-element positioning sensor joint calibration method is suitable for calibrating the on-board sensors of current autonomous vehicles, and its processing speed and calibration accuracy are superior to those of existing methods. The three accurate joint calibration results lay a solid foundation for the fusion positioning algorithm.
Accordingly, the multi-element positioning sensor combined calibration system of the present invention can be used for implementing the multi-element positioning sensor combined calibration method of the present invention, which also has the advantages and benefits described above.
Drawings
Fig. 1 schematically shows a flowchart of the steps of the multi-element positioning sensor joint calibration method according to an embodiment of the present invention.
Fig. 2 schematically shows a schematic view of a vehicle movement trajectory.
Fig. 3 schematically shows an observation model of the lidar and the IMU at different times according to the multi-element positioning sensor joint calibration method of the present invention.
Fig. 4 schematically shows a flowchart of a joint calibration calculation of a laser radar and a camera according to an embodiment of the joint calibration method for a multi-element positioning sensor of the present invention.
Detailed Description
The multi-element positioning sensor joint calibration method and system according to the present invention will now be further explained and illustrated with reference to the drawings and specific embodiments, which, however, should not be construed to unduly limit the technical solutions of the present invention.
The invention discloses a multi-element positioning sensor joint calibration system which can be used to implement the multi-element positioning sensor joint calibration method described herein.
The multi-element positioning sensor joint calibration system of the present invention may include: a camera, a lidar, an inertial navigation unit and a time synchronization module. The system executes the multi-element positioning sensor joint calibration method; the specific steps are shown in fig. 1.
Fig. 1 schematically shows a flowchart of the steps of the multi-element positioning sensor joint calibration method according to an embodiment of the present invention.
As shown in fig. 1, in the present embodiment, because each positioning sensor on the vehicle stamps its output data with its own time, the three positioning sensors, namely the camera, the laser radar (lidar) and the inertial navigation unit, must first be synchronized so that they output data under the same timestamps, thereby aligning the data.
It should be noted that the inertial navigation unit contains an inertial measurement unit (IMU). Once time synchronization of the positioning sensors is complete, the lidar-IMU calibration and the lidar-camera calibration can be carried out. The external parameters between the camera and the lidar, and between the IMU and the lidar, are calibrated separately by their respective algorithms; from these two results the relative position between the camera and the inertial navigation unit is finally computed, completing the camera-inertial navigation calibration.
Therefore, in this embodiment, the multi-element positioning sensor joint calibration method according to the present invention may include the following steps:
100: synchronizing the time of a plurality of positioning sensors installed on a vehicle, wherein the plurality of positioning sensors at least comprise a camera, a laser radar and inertial navigation;
200: acquiring first point cloud data of the laser radar and data of an inertial measurement unit of inertial navigation in a first state that the vehicle runs at a set constant speed; collecting second point cloud data of the laser radar and image data of the camera in a second static state of the vehicle;
300: calibrating the laser radar and the inertia measurement unit based on the collected first point cloud data and the data of the inertia measurement unit to obtain an external parameter matrix of the laser radar and the inertia measurement unit; calibrating the laser radar and the camera based on the collected second point cloud data and the collected image data to obtain a transformation matrix from the point cloud data of the laser radar to the image data of the camera;
400: and calibrating the camera and the inertial navigation based on the external parameter matrix and the transformation matrix.
To keep the timestamps of the positioning sensors consistent, in the present embodiment, in step 100 of the method of the present invention, a GPS module may be used to synchronize the time of the positioning sensors.
When a GPS module is used to synchronize the time of the positioning sensors, it outputs GPGGA messages carrying GPS timestamps together with PPS pulses, which are fed to the lidar and the camera respectively. When the lidar and the camera start up and receive this time input from the GPS module, their built-in timing modules set their clocks, so that both output data stamped with GPS time.
Note that, in the present embodiment, another GPS module is built into the inertial navigation unit so that it also outputs data with GPS timestamps. In the method, the GPGGA messages with GPS timestamps output by the GPS module inside the inertial navigation unit can be used as the input of the lidar, achieving time synchronization through a hardware connection, while the PPS pulses with GPS timestamps output by another, external GPS module are used as the input of the camera.
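Hardware time service only stamps each message; downstream, the three streams still arrive at different rates and are typically paired by nearest GPS timestamp before calibration. A minimal sketch of such pairing (the function name and the 20 ms tolerance are illustrative assumptions, not taken from the patent):

```python
import bisect

def align_by_timestamp(ref_stamps, other_stamps, tol=0.02):
    """Pair each reference timestamp with the nearest timestamp from the
    other sensor, keeping only pairs closer than `tol` seconds.
    Both input lists must be sorted ascending."""
    pairs = []
    for t in ref_stamps:
        i = bisect.bisect_left(other_stamps, t)
        # candidates: the neighbour on each side of the insertion point
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(other_stamps)),
            key=lambda j: abs(other_stamps[j] - t),
        )
        if abs(other_stamps[best] - t) <= tol:
            pairs.append((t, other_stamps[best]))
    return pairs
```

Messages whose nearest counterpart is farther than the tolerance are simply dropped, which is usually acceptable given the sensors' high output rates.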
With continued reference to fig. 1, it can be seen that, in the multi-element positioning sensor joint calibration method of the present invention, before calibrating the lidar and the inertial measurement unit and before calibrating the lidar and the camera, the data meeting the requirements need to be collected, that is, step 200 of the multi-element positioning sensor joint calibration method of the present invention needs to be completed first.
It should be noted that, because the inertial navigation unit contains both the inertial measurement unit and a GPS module, stable inertial measurement unit data can only be obtained in an open outdoor environment once the GPS signal is stable. In addition, the environment should have rich texture so that the lidar's positioning and attitude estimation remain accurate.
In this embodiment, when implementing the multi-element positioning sensor joint calibration method, after a suitable site is selected and the outputs of the sensors to be calibrated are stable, collection of the lidar's first point cloud data and the inertial navigation unit's inertial measurement unit data begins. During collection the vehicle moves forward smoothly at a constant speed of 10 km/h; the trajectory should include both straight segments and turns, avoid repeating the route as far as possible, and contain no reversing.
Fig. 2 schematically shows a schematic view of a vehicle movement trajectory.
Fig. 3 schematically shows an observation model of the lidar and the IMU at different times according to the multi-element positioning sensor joint calibration method of the present invention.
As shown in fig. 2, suppose the vehicle moves from time 0 along the dashed trajectory; the pose of the vehicle at time 0 is recorded as the initial pose, and the pose at time T is compared against it. Referring also to fig. 3, in this embodiment, let A be the pose increment of the lidar at time T relative to its initial pose, B the pose increment of the inertial measurement unit at time T relative to its initial pose, and X the external parameter matrix of the inertial measurement unit relative to the lidar. A can be obtained from the laser point cloud by a known lidar SLAM algorithm, and B by a known integrated navigation algorithm of the inertial navigation unit.
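The increments A and B above are simply each sensor's pose at time T re-expressed in its own time-0 frame. A small numpy sketch (helper names are illustrative):

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def pose_increment(T_0, T_t):
    """A (or B) in the AX = XB model: the pose at time T expressed
    relative to the pose at time 0, i.e. T_0^{-1} @ T_t."""
    return np.linalg.inv(T_0) @ T_t
```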
That is, in the present embodiment, several pairs of A and B measurements may be collected, and a stable external parameter matrix X of the lidar and the inertial measurement unit obtained by solving the classic hand-eye calibration problem. In the multi-element positioning sensor joint calibration method of the present invention, in step 300, the external parameter matrix of the lidar and the inertial measurement unit is obtained from the following formula (1):
AX=XB (1)
where X is the external parameter matrix to be calibrated and A and B are observation matrices: A is the pose increment of the lidar at time T relative to its initial pose, and B is the pose increment of the inertial measurement unit at time T relative to its initial pose.
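One classical way to solve formula (1) from several (A, B) pairs — the hand-eye approach the text refers to — is the Park-Martin construction: recover the rotation by orthogonal Procrustes on the rotation-log vectors, then the translation by linear least squares. The sketch below illustrates that idea under the AX = XB model; it is not the patent's own implementation:

```python
import numpy as np

def rot_log(R):
    """Axis-angle vector of a rotation matrix (angle assumed in (0, pi))."""
    theta = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * axis / (2 * np.sin(theta))

def solve_hand_eye(As, Bs):
    """Solve AX = XB for X from paired 4x4 pose increments.
    Needs at least two motions with non-parallel rotation axes."""
    # Rotation: alpha_i = R_X beta_i  ->  orthogonal Procrustes.
    M = sum(np.outer(rot_log(A[:3, :3]), rot_log(B[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    Rx = U @ np.diag([1, 1, np.linalg.det(U @ Vt)]) @ Vt
    # Translation: (R_A - I) t_X = R_X t_B - t_A, stacked over all pairs.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

Collecting many (A, B) pairs over a trajectory with both straight segments and turns, as the text prescribes, is exactly what gives the stacked system enough non-parallel rotation axes to be well conditioned.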
Fig. 4 schematically shows a flowchart of a joint calibration calculation of a laser radar and a camera according to an embodiment of the joint calibration method for a multi-element positioning sensor of the present invention.
In the present invention, when the second point cloud data of the lidar and the image data of the camera are collected in step 200, the lighting must be adequate and the vehicle must remain stationary.
In this embodiment, a rectangular checkerboard calibration board about 50 cm in size may be selected. An experimenter holds the board about 5 m directly in front of the camera; the camera and lidar are then started and several groups of data with clear texture are collected from the checkerboard for the later calibration computation. During collection of the second point cloud data and image data, the camera coordinate system may be denoted XYZ and the lidar coordinate system X'Y'Z'.
It should be noted that, in step 300 of the multi-element positioning sensor joint calibration method according to the present invention, the main purpose of the lidar and camera calibration process is to obtain a transformation matrix T of the camera and the lidar, and a specific calculation process in this embodiment may be shown in fig. 4.
As shown in fig. 4, with reference to fig. 1, in the present embodiment, the present invention may obtain a transformation matrix from point cloud data of the laser radar to image data of the camera based on the following steps:
step (1): and acquiring pixel coordinates of four vertexes of the rectangular chessboard pattern calibration plate and pixel coordinates of corner points of the intersection of all black and white blocks on the rectangular chessboard pattern calibration plate based on image data of a camera.
In this embodiment, the outer edge of the calibration plate in the image data may be first obtained by an eight-domain edge tracking method, so as to obtain the pixel coordinates of the four vertices of the rectangular checkerboard calibration plate. And detecting the pixel coordinates of the corner points of the intersection of all the black and white blocks on the rectangular chessboard calibration plate by utilizing corner point detection.
Step (2): and acquiring a point cloud plane where the rectangular chessboard pattern calibration plate is located based on the second point cloud data, and acquiring laser point cloud coordinates of four vertexes of the point cloud plane and laser point cloud coordinates of all corner points of the rectangular chessboard pattern calibration plate.
In the present embodiment, the RANSAC algorithm is used to eliminate the influence of noise and outliers and to obtain the point cloud plane on which the rectangular checkerboard calibration board lies. The point at the upper-left corner of the point cloud plane is taken, and from its laser point cloud coordinates and the actual physical size of the board, the laser point cloud coordinates of the remaining three vertices and of all corner points of the board are computed.
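The RANSAC plane extraction described above can be sketched in a few lines of numpy (the iteration count and inlier tolerance are illustrative assumptions):

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.02, seed=0):
    """Fit a plane n.x + d = 0 to an (N, 3) point cloud with RANSAC,
    returning (unit normal, d, inlier mask). tol is the inlier distance."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model[0], best_model[1], best_inliers
```

In practice the winning model would be refined by a least-squares fit over its inliers before the board vertices are extrapolated from it.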
Step (3): solve the pixel coordinates and the laser point cloud coordinates with the EPnP (Efficient Perspective-n-Point) algorithm to obtain the transformation matrix between them.
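The patent specifies EPnP for this step. As a simpler illustration of the same 3D-2D pose problem, the sketch below uses a direct linear transform with known intrinsics; note that, unlike EPnP, plain DLT degenerates when all points are coplanar (as checkerboard corners are), so it is only a conceptual stand-in, shown on non-coplanar data:

```python
import numpy as np

def pnp_dlt(P_world, p_pix, K):
    """Recover the pose [R|t] from n >= 6 3D-2D correspondences with known
    intrinsic matrix K, via the direct linear transform."""
    Kinv = np.linalg.inv(K)
    rows = []
    for P, p in zip(P_world, p_pix):
        x, y, _ = Kinv @ np.array([p[0], p[1], 1.0])  # normalised image coords
        X = np.append(P, 1.0)
        # x = (m1.X)/(m3.X), y = (m2.X)/(m3.X) rewritten as linear constraints
        rows.append(np.concatenate([X, np.zeros(4), -x * X]))
        rows.append(np.concatenate([np.zeros(4), X, -y * X]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    M = Vt[-1].reshape(3, 4)                 # = lambda * [R|t], scale unknown
    lam = np.cbrt(np.linalg.det(M[:, :3]))   # det(lambda * R) = lambda^3
    U, _, Vt2 = np.linalg.svd(M[:, :3] / lam)
    R = U @ Vt2                              # nearest rotation matrix
    t = M[:, 3] / lam
    return R, t
```

EPnP avoids the planar degeneracy and is far more robust to noise, which is presumably why the method adopts it for checkerboard data.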
Step (4): project further image data and point cloud data with the transformation matrix to check whether the point cloud projects correctly into the calibration board region of the image data, and fine-tune the transformation matrix according to the deviation until the laser point cloud coordinates of all corner points project onto the corresponding image data.
And (5): and (3) solving the projection error according to the following formula, and performing optimization solution on the error function to obtain a transformation matrix which minimizes the projection error:
argmin‖KTP_L - P_uv‖² (2)
where K is the internal parameter matrix of the camera, T is the transformation matrix from the lidar to the camera, KTP_L is the projection of the point cloud data onto the pixel coordinates of the image data, and P_uv denotes the pixel coordinates of the calibration board in the image data.
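The residual of formula (2) can be evaluated directly; below is a numpy sketch of the projection and mean pixel error (an illustration only — in practice this residual would be minimized over T with a nonlinear least-squares routine):

```python
import numpy as np

def reprojection_error(K, T, P_lidar, p_uv):
    """Mean pixel distance between lidar points projected through K and T
    and their measured pixel coordinates: the residual of formula (2)."""
    P_h = np.hstack([P_lidar, np.ones((len(P_lidar), 1))])  # homogeneous points
    proj = (K @ (T @ P_h.T)[:3]).T                          # pinhole projection
    uv = proj[:, :2] / proj[:, 2:3]                         # perspective divide
    return np.linalg.norm(uv - p_uv, axis=1).mean()
```

With X from the hand-eye step and the T that minimizes this error, the inertial-navigation-to-camera extrinsic follows by the single matrix product Q = X × T given in the text.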
Correspondingly, in step 300 of the multi-element positioning sensor joint calibration method of the present invention, with the external parameter matrix X and the transformation matrix T computed from formula (1) and formula (2) above, the camera and the inertial navigation unit can be calibrated: the conversion from the inertial navigation unit to the camera is computed as Q = X × T.
To better illustrate the superiority of the multi-element positioning sensor joint calibration method, a comparison experiment was carried out against a currently common lidar-camera joint calibration method.
In the experiment, a traffic cone was placed in the calibration field of view, calibration was performed with each of the two methods, and the final calibration results were compared by the overlap between the reprojected point cloud and the cone.
In terms of final calibration accuracy, among the reprojected images produced by the two methods, the point cloud reprojected with the multi-element positioning sensor joint calibration method overlaps the cone more closely, whereas the overlap achieved by the existing lidar-camera joint calibration method is relatively low, the cone's reprojected point cloud showing an obvious lateral offset from its image.
In conclusion, unlike conventional open-source methods, the multi-element positioning sensor combined calibration method can calibrate the laser radar and the camera using only an ordinary checkerboard of the kind used for camera calibration. The checkerboard requires no modification; by collecting a small amount of camera image data and laser point cloud data, the calibration result can be computed and the calibration performed automatically online by the algorithm. Finally, the positional relationship between the camera and the inertial navigation is obtained by calculation.
In addition, in some embodiments, the multi-element positioning sensor combined calibration method provided by the present invention can quickly and accurately calibrate the external parameters of the laser radar and the inertial measurement unit by drawing on the concept of robotic hand-eye calibration.
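The hand-eye formulation referenced here solves A·X = X·B for the unknown extrinsic X from paired motion increments. The numpy sketch below recovers the rotation part of X by aligning the rotation axes of the A and B increments with a Kabsch (SVD) fit; the solver and the synthetic data are an illustrative assumption, not the patent's implementation:

```python
import numpy as np

def hand_eye_rotation(As, Bs):
    """Recover the rotation of X in A X = X B from motion pairs (A_i, B_i).

    Uses the property that the rotation axes satisfy axis(A_i) = R_X * axis(B_i),
    then aligns the two axis sets with the Kabsch (SVD) method.
    """
    def axis(R):
        # rotation axis from the skew-symmetric part of a rotation matrix
        v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
        return v / np.linalg.norm(v)

    M = sum(np.outer(axis(A), axis(B)) for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # guard against reflections
    return U @ D @ Vt

def rot(axis_v, angle):
    """Rodrigues rotation matrix about a unit axis."""
    a = np.asarray(axis_v, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

# Synthetic check: choose a ground-truth X, build B_i and A_i = X B_i X^T,
# and verify the solver recovers X.
X_true = rot([0, 0, 1], 0.3)
Bs = [rot([1, 0, 0], 0.5), rot([0, 1, 0], 0.7), rot([0, 0, 1], 0.4)]
As = [X_true @ B @ X_true.T for B in Bs]
X_est = hand_eye_rotation(As, Bs)
print(np.allclose(X_est, X_true, atol=1e-6))  # -> True
```

In a full solver the translation of X is then obtained from the linear system (R_A − I)·t_X = R_X·t_B − t_A; OpenCV also offers ready-made implementations such as cv2.calibrateHandEye.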
The multi-element positioning sensor combined calibration method is suitable for calibrating the on-board sensors of current autonomous vehicles, and both its processing speed and the precision of its calibration results are superior to those of existing methods. The three accurate combined calibration results lay a solid foundation for the fusion positioning algorithm.
Accordingly, the multi-element positioning sensor combined calibration system of the present invention can be used to implement the multi-element positioning sensor combined calibration method of the present invention, and it likewise has the advantages and benefits described above.
It should be noted that the prior art within the protection scope of the present invention is not limited to the examples given in this application; all prior art that is not inconsistent with the technical solution of the present invention, including but not limited to prior patent documents and prior publications, may be incorporated within the protection scope of the present invention.
In addition, the combinations of features in this application are not limited to the combinations described in the claims or in the specific embodiments; all features described in this application may be freely combined with one another in any manner, unless contradictory to each other.
It should also be noted that the above-listed embodiments are only specific examples of the present invention. Obviously, the present invention is not limited to the above embodiments; similar changes or modifications that can be derived directly or easily conceived by those skilled in the art from the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (9)

1. A multi-element positioning sensor combined calibration method is characterized by comprising the following steps:
100: synchronizing the time of a plurality of positioning sensors installed on a vehicle, wherein the plurality of positioning sensors at least comprise a camera, a laser radar and inertial navigation;
200: acquiring first point cloud data of the laser radar and data of an inertial measurement unit of inertial navigation in a first state that the vehicle runs at a set constant speed; collecting second point cloud data of the laser radar and image data of the camera in a second static state of the vehicle;
300: calibrating the laser radar and the inertia measurement unit based on the collected first point cloud data and the data of the inertia measurement unit to obtain an external parameter matrix of the laser radar and the inertia measurement unit; calibrating the laser radar and the camera based on the collected second point cloud data and the collected image data to obtain a transformation matrix from the point cloud data of the laser radar to the image data of the camera;
400: and calibrating the camera and the inertial navigation based on the external parameter matrix and the transformation matrix.
2. The multi-element positioning sensor joint calibration method according to claim 1, wherein in step 100, a GPS module is used to synchronize the times of the several positioning sensors.
3. The multi-element positioning sensor joint calibration method according to claim 2, wherein in step 100, a GPS module is adopted to output GPGGA-format information carrying a GPS timestamp, together with PPS pulses, as the input of the laser radar and the camera; when the laser radar and the camera start to operate and have received the time input from the GPS module, their built-in time service modules perform time service, so that the laser radar and the camera output data carrying GPS timestamps; and a GPS module is built into the inertial navigation, so that the inertial navigation outputs data carrying GPS timestamps.
4. The multi-element positioning sensor joint calibration method according to claim 3, wherein the GPGGA-format information with GPS timestamp output by the GPS module built into the inertial navigation is used as the input of the laser radar, and the PPS pulse with GPS timestamp output by another, external GPS module is used as the input of the camera.
5. The multi-element positioning sensor joint calibration method according to claim 1, wherein in step 300, the external reference matrix of the lidar and the inertial measurement unit is obtained based on the following formula:
AX=XB
wherein X is the external parameter matrix to be solved, A is the pose increment of the laser radar relative to its initial pose at time T, and B is the pose increment of the inertial measurement unit relative to its initial pose at time T.
6. The multi-element positioning sensor joint calibration method according to claim 5, wherein in step 300, the external parameter matrix X of the lidar and the inertial measurement unit is obtained by solving with a hand-eye calibration method.
7. The multi-element positioning sensor joint calibration method according to claim 1, wherein in step 200, data acquisition is performed on a rectangular checkerboard serving as the calibration board by using the laser radar and the camera.
8. The multi-element positioning sensor joint calibration method according to claim 7, wherein in step 300, a transformation matrix from the point cloud data of the lidar to the image data of the camera is obtained based on the following steps:
acquiring pixel coordinates of four vertexes of the calibration plate and pixel coordinates of corner points of the connection of all black blocks and white blocks on the calibration plate based on image data of a camera;
acquiring a point cloud plane where the calibration plate is located based on the second point cloud data, and acquiring laser point cloud coordinates of four vertexes of the point cloud plane and laser point cloud coordinates of all corner points of the calibration plate;
solving the pixel coordinate and the laser point cloud coordinate based on an EPnP algorithm to obtain a transformation matrix between the pixel coordinate and the laser point cloud coordinate;
projecting other image data and point cloud data by using the transformation matrix to check whether the point cloud data can be correctly projected into the range of the calibration board in the image data, and finely adjusting the transformation matrix according to the deviation until the laser point cloud coordinates of all the corner points are projected onto the corresponding image data;
solving the projection error according to the following formula, and performing an optimization solution on the error function to obtain the transformation matrix that minimizes the projection error:
argmin ‖K·T·P_L − P_uv‖²
wherein K is the internal reference matrix of the camera, T represents the transformation matrix from the laser radar to the camera, K·T·P_L is the result of projecting the point cloud data onto the pixel coordinates of the image data, and P_uv represents the pixel coordinates of the calibration board in the image data.
9. A multi-element positioning sensor joint calibration system, comprising a camera, a laser radar, an inertial navigation and time synchronization module, wherein the multi-element positioning sensor joint calibration system performs the multi-element positioning sensor joint calibration method according to any one of claims 1 to 8.
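For reference, the GPGGA message cited in claims 2 to 4 is a standard NMEA 0183 sentence whose second field carries the UTC time used for time service. A minimal sketch of extracting that timestamp follows; the function name and the example sentence are illustrative, and a real receiver would also validate the trailing checksum:

```python
def gpgga_utc(sentence: str) -> str:
    """Extract the UTC time field (hhmmss.ss) from an NMEA GPGGA sentence.

    Minimal sketch: checksum validation and fix-quality checks are omitted.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")
    hhmmss = fields[1]                       # second field is the UTC time
    return f"{hhmmss[0:2]}:{hhmmss[2:4]}:{hhmmss[4:]}"

# Illustrative sentence (not taken from the patent):
s = "$GPGGA,123519.00,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print(gpgga_utc(s))  # -> 12:35:19.00
```

Combined with the PPS pulse, which marks the exact second boundary, this timestamp lets each sensor stamp its output on a common GPS time base.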
Application CN202011267228.9A, filed 2020-11-13: Multi-element positioning sensor combined calibration method and system; published as CN112577517A on 2021-03-30 (status at publication: pending).

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2021-03-30)