CN112233184B - Laser radar and camera calibration parameter correction method and device based on image registration - Google Patents

Laser radar and camera calibration parameter correction method and device based on image registration

Info

Publication number
CN112233184B
CN112233184B (application CN202010936777.4A)
Authority
CN
China
Prior art keywords
point cloud
image
camera
cloud projection
calibration
Prior art date
Legal status
Active
Application number
CN202010936777.4A
Other languages
Chinese (zh)
Other versions
CN112233184A (en)
Inventor
殷国栋
彭湃
徐利伟
庄伟超
耿可可
王金湘
张宁
陈建松
祝小元
Current Assignee
Changcheng Automobile Industry Changzhou Co ltd
Changcheng Automobile Jiangsu Co ltd
Original Assignee
Southeast University
Priority date
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN202010936777.4A priority Critical patent/CN112233184B/en
Publication of CN112233184A publication Critical patent/CN112233184A/en
Application granted granted Critical
Publication of CN112233184B publication Critical patent/CN112233184B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • G06T 2207/10044: Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The embodiment of the application discloses a laser radar and camera calibration parameter correction method and device based on image registration, belonging to the fields of intelligent driving environment perception and robot environment perception. The method comprises the following steps: after calibration of the calibration parameters between a camera and a laser radar is completed, acquiring a reference camera image and a reference point cloud projection drawing according to a predetermined scene, wherein the reference point cloud projection drawing is obtained according to the reference camera image and the calibration parameters; detecting whether the relative attitude between the camera and the laser radar has changed; and if the relative attitude has changed, correcting the calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection drawing. The calibration parameters can thus be corrected by combining image registration with the original calibration parameters, without arranging a calibration test environment, which simplifies the correction process and improves correction efficiency. In addition, the error of the corrected calibration parameters is small, so that calibration precision is improved.

Description

Laser radar and camera calibration parameter correction method and device based on image registration
Technical Field
The embodiment of the application relates to the field of intelligent driving environment perception and the field of robot environment perception, in particular to a laser radar and camera calibration parameter correction method and device based on image registration.
Background
Because the environmental information that a single type of sensor can perceive is limited, multi-sensor fusion is now widely applied to obtain detailed environmental information in the fields of intelligent driving environment perception and robot environment perception, and obtaining accurate calibration parameters among the different sensors is a prerequisite for realizing multi-sensor fusion. For calibration of the camera and the lidar, the camera internal parameters and the lidar-to-camera-coordinate-system external parameters are generally calibrated separately. Current camera internal parameter calibration is mostly based on Zhang Zhengyou's single-plane checkerboard method, while calibration methods for the lidar-to-camera external parameters mainly fall into two categories.
The first category is calibration methods based on feature point matching. For example, the corresponding corner points of a calibration plate are manually selected in the camera coordinate system and the lidar coordinate system, and the external parameter matrix is then solved using constraint equations. Due to the sparsity of the lidar point cloud, the corner points of the calibration plate cannot be scanned accurately, so the calibration plate must be adjusted manually until a laser beam scans the corner positions, making both calibration precision and efficiency low. To reduce the difficulty of extracting feature points in the lidar coordinate system and improve extraction precision, researchers fit edge lines from the edge points of the calibration plate in the lidar coordinate system and then compute the intersections of these edge lines as feature points to be matched with the corner points in the camera coordinate system. To acquire multiple sets of feature points, this method generally requires changing the position of the calibration plate or using several calibration plates; moreover, the edge points must be selected manually, which adds workload during calibration.
The other category is calibration methods based on coordinate transformation. For example, the multi-angle poses of the calibration plate are calculated in the camera coordinate system and the lidar coordinate system respectively, and the external parameters are solved by coordinate conversion. However, this method requires moving the calibration plate repeatedly to obtain its pose at different angles before accurate calibration parameters can be obtained, which is cumbersome.
During the running of an intelligent driving vehicle, vibration and impact are inevitable, so the relative attitude of the camera and the lidar changes and the original lidar-camera external parameter matrix is no longer valid. Re-calibrating the external parameter matrix requires arranging the calibration plate, selecting feature points and changing the position of the calibration plate over multiple angles, which is very cumbersome.
Disclosure of Invention
The embodiment of the application provides a laser radar and camera calibration parameter correction method and device based on image registration, which solve the problem that when the relative attitude of the camera and the laser radar changes, the external parameter matrix has to be re-calibrated through a cumbersome process. The technical scheme is as follows:
in one aspect, a method for calibrating parameters of a laser radar and a camera based on image registration is provided, and the method includes:
after calibration of calibration parameters between a camera and a laser radar is completed, acquiring a reference camera image and a reference point cloud projection drawing according to a preset scene, wherein the reference point cloud projection drawing is obtained according to the reference camera image and the calibration parameters;
detecting whether a relative attitude between the camera and the laser radar is changed;
and if the relative posture is changed, correcting the calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection graph.
In one possible implementation, the acquiring a reference camera image and a reference point cloud projection map according to a predetermined scene includes:
shooting the preset scene through the camera to obtain the reference camera image;
generating, by the lidar, reference point cloud data for the predetermined scene;
and generating the reference point cloud projection drawing according to the reference camera image, the reference point cloud data and the calibration parameters.
In one possible implementation, when the calibration parameters include an internal reference matrix and an external reference matrix, the generating the reference point cloud projection drawing according to the reference camera image, the reference point cloud data, and the calibration parameters includes:
inputting the reference camera image, the reference point cloud data and the calibration parameters into a first formula and a second formula to obtain the reference point cloud projection diagram;
the first formula is
Figure BDA0002672231410000031
The second formula is
Figure BDA0002672231410000032
Wherein (X Y Z) is the coordinate of a data point in the reference point cloud data, I is the reflectivity of the data point, (X Y Z) is the coordinate of the corresponding data point in the camera coordinate system, (u v) is the pixel coordinate of the corresponding pixel point in the pixel coordinate system,
Figure BDA0002672231410000033
is the reference matrix of the said device,
Figure BDA0002672231410000034
is the external reference matrix and z is the depth of the data point in the camera coordinate system.
In one possible implementation, the method further includes:
filling pixel values of pixel points corresponding to the data points according to the depth of each data point, and mapping the filled pixel values by using a color space to obtain the reference point cloud projection graph; or,
inputting the coordinates of each data point in the camera coordinate system into a third formula, filling the pixel values of the pixel points corresponding to the data points according to the obtained calculation result, and mapping the filled pixel values by using a color space to obtain the reference point cloud projection drawing, wherein the third formula is

    r = √(x² + y² + z²)

(r being the distance of the data point from the camera origin);
Or,
and acquiring a reflectivity value of each data point, filling a pixel value of a pixel point corresponding to the data point according to the reflectivity value of each data point, and mapping the filled pixel value by using a color space to obtain the reference point cloud projection diagram.
In one possible implementation, the predetermined scene includes a predetermined number of objects.
In a possible implementation manner, the correcting the calibration parameter according to the homography matrix obtained by registering the reference camera image and the reference point cloud projection diagram includes:
shooting the preset scene through the camera to obtain an image to be corrected;
generating a transformed point cloud projection diagram according to the reference camera image, the image to be corrected and the reference point cloud projection diagram;
and correcting the calibration parameters according to the point cloud projection image to be corrected and the transformed point cloud projection image.
In one possible implementation, the method further includes:
and adjusting the position of the camera, wherein the similarity between the image to be corrected, which is obtained by shooting the preset scene by the adjusted camera, and the reference camera image exceeds a preset threshold value.
In one possible implementation, the generating a transformed point cloud projection diagram according to the reference camera image, the image to be corrected, and the reference point cloud projection diagram includes:
registering the reference camera image and the image to be corrected to obtain a first homography matrix from the reference camera image to the image to be corrected;
and carrying out perspective transformation on the reference point cloud projection drawing according to the first homography matrix to obtain the transformed point cloud projection drawing.
In a possible implementation manner, the correcting the calibration parameter according to the point cloud projection image to be corrected and the transformed point cloud projection image includes:
registering the point cloud projection image to be corrected and the transformed point cloud projection image to obtain a second homography matrix from the point cloud projection image to be corrected to the transformed point cloud projection image;
and multiplying the second homography matrix, the internal reference matrix and the external reference matrix to obtain corrected calibration parameters.
In one aspect, an apparatus for calibrating parameters of a lidar and a camera based on image registration is provided, the apparatus comprising:
the acquisition module is used for acquiring a reference camera image and a reference point cloud projection image according to a predetermined scene after calibration of calibration parameters between the camera and the laser radar is completed, wherein the reference point cloud projection image is obtained by calculation according to the reference camera image and the calibration parameters;
the detection module is used for detecting whether the relative attitude between the camera and the laser radar changes;
and the correction module is used for correcting the calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection image if the relative posture changes.
The technical scheme provided by the embodiment of the application has the beneficial effects that at least:
after calibration of the calibration parameters between the camera and the laser radar is completed, a reference camera image and a reference point cloud projection image can be obtained according to a predetermined scene, wherein the reference point cloud projection image is obtained according to the reference camera image and the calibration parameters; whether the relative attitude between the camera and the laser radar has changed is detected; and if the relative attitude has changed, the calibration parameters are corrected according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection image. A homography matrix is obtained by combining image registration with the original calibration parameters, and the calibration parameters are corrected according to this homography matrix; the original calibration parameters are fully utilized without re-calibration, feature points are found automatically to complete the correction, and no calibration test environment needs to be arranged, which simplifies the correction process and improves calibration efficiency. In addition, registering the reference camera image against the reference point cloud projection image keeps the error of the corrected calibration parameters small, which improves correction precision.
In addition, a predetermined number of objects are contained in the predetermined scene, so that a certain number of key points can be detected using feature descriptors during image registration, which improves the accuracy of image registration.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a flowchart of a method for calibrating a parameter of a lidar and a camera based on image registration according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart illustrating calibration parameter calibration according to an embodiment of the present application;
FIG. 3 is a schematic view of the images produced at each step of calibration parameter correction provided in an embodiment of the present application;
FIG. 4 is a diagram illustrating matching results of key points of registration provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a result of superimposing a camera image and a point cloud projection map provided by an embodiment of the present application;
fig. 6 is a block diagram of a structure of a lidar and camera calibration parameter correction device based on image registration according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application more clear, the embodiments of the present application will be further described in detail with reference to the accompanying drawings.
Please refer to fig. 1, which illustrates a flowchart of the laser radar and camera calibration parameter correction method based on image registration provided by an embodiment of the present application. The method may be applied to intelligent driving systems and robot systems, and may comprise the following steps:
step 101, after calibration of calibration parameters between a camera and a laser radar is completed, a reference camera image and a reference point cloud projection image are obtained according to a preset scene, wherein the reference point cloud projection image is obtained according to the reference camera image and the calibration parameters.
In this embodiment, any calibration method may be used to calibrate the calibration parameters between the camera and the laser radar to obtain the original calibration parameters; this embodiment does not limit the calibration method. To ensure that enough feature points are detected, the laser radar may be one with 32 lines or more.
After calibration is completed, the relative attitude between the camera and the laser radar changes due to vibration and impact generated during the running process of the vehicle, so that the original calibration parameters are not applicable any more, and therefore the calibration parameters need to be corrected subsequently.
In this embodiment, after the original calibration parameters are obtained, a reference image needs to be acquired according to a predetermined scene, and the original calibration parameters can later be corrected according to this reference image. The predetermined scene may be a scene at a fixed location, such as a parking lot or a fixed position on the road, or a scene in which a calibration board is arranged; this embodiment does not limit the scene. It should be noted that the predetermined scene needs to contain a predetermined number of objects, which ensures that a certain number of key points can be detected with the feature descriptors during image registration and thereby improves the accuracy of image registration.
The reference image in this embodiment includes a reference camera image and a reference point cloud projection map, and the reference point cloud projection map is calculated according to the reference camera image and calibration parameters, and the calculation process is described below.
In one possible implementation, acquiring the reference camera image and the reference point cloud projection map according to a predetermined scene may include: shooting the predetermined scene with the camera to obtain the reference camera image; generating reference point cloud data of the predetermined scene with the laser radar; and generating the reference point cloud projection map according to the reference camera image, the reference point cloud data and the calibration parameters.
When the reference camera image and the reference point cloud data are acquired and the calibration parameters include an internal reference matrix and an external reference matrix, generating a reference point cloud projection drawing according to the reference camera image, the reference point cloud data and the calibration parameters, which may include: inputting the reference camera image, the reference point cloud data and the calibration parameters into a first formula and a second formula to obtain a reference point cloud projection diagram;
the first formula is
Figure BDA0002672231410000071
The second formula is
Figure BDA0002672231410000072
Wherein (X Y Z) is the coordinate of a data point in the reference point cloud data, I is the reflectivity of the data point, (xy Z) is the coordinate of the data point corresponding to (X Y Z) in the camera coordinate system, (u v) is the pixel coordinate of the corresponding pixel point in the pixel coordinate system,
Figure BDA0002672231410000073
is an internal reference matrix, and the reference matrix is,
Figure BDA0002672231410000074
is the external reference matrix and z is the depth of the data point in the camera coordinate system.
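The projection described by the first and second formulas can be sketched in a few lines of NumPy. The function name, matrix values and point coordinates below are purely illustrative and not taken from the patent; the sketch assumes M_i is a 3×3 internal reference matrix and M_e a 3×4 external reference matrix.

```python
import numpy as np

def project_point_cloud(points_xyz, Mi, Me):
    """Project N lidar points (N, 3) into pixel coordinates.

    Mi: 3x3 internal reference matrix, Me: 3x4 external reference
    matrix [R | t] (matrix shapes are an assumption).
    """
    n = points_xyz.shape[0]
    homog = np.hstack([points_xyz, np.ones((n, 1))])   # (N, 4) homogeneous coords
    cam = (Me @ homog.T).T                             # first formula: camera coords (x, y, z)
    uvz = (Mi @ cam.T).T                               # second formula: z * (u, v, 1)
    uv = uvz[:, :2] / uvz[:, 2:3]                      # divide out the depth z
    return uv, cam

# Example: identity extrinsics and a simple pinhole intrinsic matrix
Mi = np.array([[500.0, 0.0, 320.0],
               [0.0, 500.0, 240.0],
               [0.0, 0.0, 1.0]])
Me = np.hstack([np.eye(3), np.zeros((3, 1))])
pts = np.array([[1.0, 0.5, 5.0]])
uv, cam = project_point_cloud(pts, Mi, Me)
# u = 500 * 1 / 5 + 320 = 420, v = 500 * 0.5 / 5 + 240 = 290
```

With real data, each projected point would then be filled into the image plane using one of the three conversion modes described below.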
The reference point cloud data can be projected onto the camera plane through the first formula and the second formula to obtain the reference point cloud projection diagram. In the present embodiment, three conversion modes are provided for filling the pixel values; the three modes are described below.
In the first conversion mode, the pixel values of the pixel points corresponding to the data points are filled according to the depth of each data point, and the filled pixel values are mapped using a color space to obtain the reference point cloud projection diagram.
The second conversion mode is that the coordinates of each data point in the camera coordinate system are input into a third formula, the pixel values of the pixel points corresponding to the data points are filled according to the obtained calculation result, and the filled pixel values are mapped by using a color space to obtain the reference point cloud projection drawing, wherein the third formula is

    r = √(x² + y² + z²)

(r being the distance of the data point from the camera origin).
In the third conversion mode, the reflectivity value of each data point is acquired, the pixel value of the pixel point corresponding to the data point is filled according to that reflectivity value, and the filled pixel values are mapped using a color space to obtain the reference point cloud projection image. The manner of acquiring the reflectivity values is not limited in this embodiment.
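As a rough illustration of the three conversion modes, the sketch below rasterizes projected points into a grayscale buffer from normalized fill values; the image size, the normalization to [0, 1] and the commented-out OpenCV colormap call are assumptions, not specified by the embodiment.

```python
import numpy as np

def fill_projection_image(uv, values, shape=(480, 640)):
    """Fill pixels at projected (u, v) locations with normalized values.

    `values` may be the depth z (first mode), the distance r from the
    third formula (second mode), or the reflectivity I (third mode).
    """
    img = np.zeros(shape, dtype=np.float64)
    vmin, vmax = values.min(), values.max()
    norm = (values - vmin) / (vmax - vmin + 1e-9)      # scale fill values to [0, 1]
    for (u, v), g in zip(uv.astype(int), norm):
        if 0 <= v < shape[0] and 0 <= u < shape[1]:
            img[v, u] = g                              # row index is v, column is u
    return img

# The grayscale buffer would then be mapped through a color space, e.g.
# cv2.applyColorMap((img * 255).astype(np.uint8), cv2.COLORMAP_JET)
uv = np.array([[10.0, 20.0], [30.0, 40.0]])
depths = np.array([2.0, 4.0])
img = fill_projection_image(uv, depths)
```

Whichever fill quantity is chosen, the same choice has to be used for the reference projection and the projection to be corrected so that the two images can be registered.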
Step 102, detecting whether the relative attitude between the camera and the laser radar is changed.
When a significant offset is detected or observed in the data points projected by the lidar onto the camera image, it is determined that the original calibration parameters need to be corrected.
And 103, if the relative posture is changed, correcting calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection image.
If it is determined that the relative posture between the camera and the lidar is changed, the camera and the lidar need to be placed in a predetermined scene, and the original calibration parameters are corrected in the predetermined scene, and a correction process is described below.
Specifically, correcting the calibration parameter according to the homography matrix obtained by registering the reference camera image and the reference point cloud projection drawing may include the following substeps.
And a substep 1031 of capturing a predetermined scene by a camera to obtain an image to be corrected.
Before the image to be corrected is acquired, the position of the camera can be adjusted so that the image to be corrected captured by the camera matches the reference camera image as closely as possible. That is, the method further comprises: adjusting the position of the camera such that the similarity between the image to be corrected, obtained by shooting the predetermined scene with the adjusted camera, and the reference camera image exceeds a preset threshold.
It should be noted that when the vehicle is placed in the predetermined scene again, the objects in the scene do not need to be identical to those in the initial predetermined scene, as long as some of the objects are the same.
And a sub-step 1032 of generating a transformed point cloud projection diagram according to the reference camera image, the image to be corrected and the reference point cloud projection diagram.
The generating of the transformed point cloud projection map according to the reference camera image, the image to be corrected and the reference point cloud projection map may include: registering the reference camera image and the image to be corrected to obtain a first homography matrix H1 from the reference camera image to the image to be corrected; and carrying out perspective transformation on the reference point cloud projection image according to the first homography matrix H1 to obtain the transformed point cloud projection image.
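In practice, H1 would typically be estimated by matching feature descriptors and calling a robust solver such as OpenCV's findHomography with RANSAC, then applied with warpPerspective. As a self-contained illustration of what is being estimated, the sketch below recovers a homography from already-matched point pairs with the standard DLT algorithm; all names and values are illustrative, not from the patent.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate H (3x3) with dst ~ H · src from N >= 4 matched points (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # each correspondence contributes two linear constraints on h
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows))
    H = vt[-1].reshape(3, 3)          # null-space vector of the constraint matrix
    return H / H[2, 2]                # normalize so that H[2, 2] = 1

def warp_points(H, pts):
    """Apply H to (N, 2) points: the perspective transformation."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Recover a known homography from five clean correspondences
H_true = np.array([[1.1, 0.02, 5.0],
                   [0.01, 0.9, -3.0],
                   [0.0005, 0.0, 1.0]])
src = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0], [40.0, 70.0]])
dst = warp_points(H_true, src)
H_est = homography_dlt(src, dst)
```

The same estimation applies to the second registration step, between the point cloud projection image to be corrected and the transformed point cloud projection image, which yields H2.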
And a substep 1033 of correcting the calibration parameters according to the point cloud projection image to be corrected and the transformed point cloud projection image.
Wherein, correcting the calibration parameters according to the point cloud projection image to be corrected and the transformed point cloud projection image may include: registering the point cloud projection image to be corrected and the transformed point cloud projection image to obtain a second homography matrix H2 from the point cloud projection image to be corrected to the transformed point cloud projection image; and multiplying the second homography matrix by the internal reference matrix and the external reference matrix to obtain the corrected calibration parameters.
Taking the second homography matrix as H2, the internal reference matrix as M_i and the external reference matrix as M_e, the corrected calibration parameter is H2 · M_i · M_e. Subsequently, a fourth formula should be satisfied between the camera image and the point cloud projection diagram:

    z · (u, v, 1)^T = H2 · M_i · M_e · (X, Y, Z, 1)^T
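Numerically, the corrected projection of the fourth formula is just a matrix product. The matrices below are illustrative only (H2 here is a pure pixel translation of (+4, -2), and M_i, M_e a simple pinhole setup); they show how left-multiplying the original M_i · M_e projection by H2 shifts the projected pixel accordingly.

```python
import numpy as np

# Illustrative matrices, not from the patent
H2 = np.array([[1.0, 0.0, 4.0],
               [0.0, 1.0, -2.0],
               [0.0, 0.0, 1.0]])
Mi = np.array([[500.0, 0.0, 320.0],
               [0.0, 500.0, 240.0],
               [0.0, 0.0, 1.0]])
Me = np.hstack([np.eye(3), np.zeros((3, 1))])

P_corrected = H2 @ Mi @ Me               # corrected 3x4 projection matrix

X = np.array([1.0, 0.5, 5.0, 1.0])       # homogeneous lidar point
u0, v0, z0 = Mi @ Me @ X                 # original projection: z * (u, v, 1)
u1, v1, z1 = P_corrected @ X             # corrected projection (fourth formula)
# original pixel: (420, 290); corrected pixel: (424, 288)
```

Note that H2 absorbs the attitude change without re-estimating M_i and M_e separately, which is why no new calibration test environment is required.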
Referring to fig. 2, fig. 2 shows a conversion relationship between images during calibration of calibration parameters, that is, a first homography matrix from a reference camera image to a camera image to be calibrated is calculated, perspective transformation is performed on a reference point cloud projection image according to the first homography matrix to obtain a transformed point cloud projection image, and finally a second homography matrix from the point cloud projection image to be calibrated to the transformed point cloud projection image is calculated to obtain calibrated calibration parameters.
Fig. 3 shows an intermediate image of each step in fig. 2, taking a real-scene image containing a vehicle as an example.
FIG. 4 is a matching result of key points when the point cloud projection image to be corrected is aligned with the transformed point cloud projection image.
The lower diagram in fig. 5 shows the effect of superimposing the reference camera image and the point cloud projection view, the middle diagram shows the effect of superimposing the image to be corrected and the point cloud projection view, and the upper diagram shows the effect of superimposing the corrected camera image and the point cloud projection view.
In summary, with the laser radar and camera calibration parameter correction method based on image registration provided by the embodiment of the present application, after calibration of the calibration parameters between the camera and the laser radar is completed, a reference camera image and a reference point cloud projection drawing are obtained according to a predetermined scene, the reference point cloud projection drawing being obtained according to the reference camera image and the calibration parameters; whether the relative attitude between the camera and the laser radar has changed is detected; and if the relative attitude has changed, the calibration parameters are corrected according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection image. A homography matrix is obtained by combining image registration with the original calibration parameters, and the calibration parameters are corrected according to this homography matrix; the original calibration parameters are fully utilized without re-calibration, feature points are found automatically to complete the correction, and no calibration test environment needs to be arranged, which simplifies the correction process and improves calibration efficiency. In addition, registering the camera image against the point cloud projection image keeps the error of the corrected calibration parameters small, which improves correction precision.
In addition, a predetermined number of objects are contained in the predetermined scene, so that a certain number of key points can be detected using feature descriptors during image registration, which improves the accuracy of image registration.
Please refer to fig. 6, which illustrates a block diagram of a lidar and camera calibration parameter calibration apparatus based on image registration according to an embodiment of the present disclosure, wherein the lidar and camera calibration parameter calibration apparatus based on image registration may be applied to an intelligent driving system and a robot system. The laser radar and camera calibration parameter correction device based on image registration can comprise:
the acquiring module 610 is configured to acquire a reference camera image and a reference point cloud projection map according to a predetermined scene after calibration of calibration parameters between the camera and the laser radar is completed, where the reference point cloud projection map is calculated according to the reference camera image;
a detection module 620, configured to detect whether a relative posture between the camera and the laser radar changes;
and a correction module 630, configured to correct the calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection map if the relative pose changes.
In an optional embodiment, the obtaining module 610 is further configured to:
shooting a preset scene through a camera to obtain a reference camera image;
generating reference point cloud data for a predetermined scene by a laser radar;
and generating a reference point cloud projection diagram according to the reference camera image, the reference point cloud data and the calibration parameters.
In an alternative embodiment, when the calibration parameters include an internal reference matrix and an external reference matrix, the obtaining module 610 is further configured to:
inputting the reference camera image, the reference point cloud data and the calibration parameters into a first formula and a second formula to obtain a reference point cloud projection diagram;
the first formula is
Figure BDA0002672231410000111
The second formula is
Figure BDA0002672231410000112
Wherein (X Y Z) is the coordinate of a data point in the reference point cloud data, I is the reflectivity of the data point, (xy Z) is the coordinate of the data point corresponding to (X Y Z) in the camera coordinate system, (u v) is the pixel coordinate of the corresponding pixel point in the pixel coordinate system,
Figure BDA0002672231410000121
is an internal reference matrix, and the reference matrix is,
Figure BDA0002672231410000122
is the external reference matrix and z is the depth of the data point in the camera coordinate system.
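Taken together, the two formulas implement the standard pinhole projection chain from lidar points to pixels. The following numpy sketch illustrates that chain; the intrinsic values, identity rotation, and zero translation are illustrative placeholders, not values from the patent:

```python
import numpy as np

def project_points(points_xyz, K, R, T):
    """First formula: [x y z]^T = R [X Y Z]^T + T  (lidar -> camera frame).
    Second formula:  z [u v 1]^T = K [x y z]^T     (camera frame -> pixels)."""
    cam = points_xyz @ R.T + T          # (N, 3) camera-frame coordinates
    z = cam[:, 2:3]                     # depth of each point
    uv1 = (cam @ K.T) / z               # perspective division by the depth z
    return uv1[:, :2], z.ravel()

# Illustrative intrinsic matrix and identity extrinsics (assumed values)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, T = np.eye(3), np.zeros(3)
uv, depth = project_points(np.array([[1.0, 0.5, 5.0]]), K, R, T)
# u = 500*1/5 + 320 = 420, v = 500*0.5/5 + 240 = 290
```

Each projected point carries its depth z (and reflectivity I) forward, which is what the filling modules below turn into pixel values.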
In an optional embodiment, the apparatus further comprises:
the first filling module is used for filling pixel values of pixel points corresponding to the data points according to the depth of each data point, and mapping the filled pixel values by using a color space to obtain a reference point cloud projection graph; or,
the second filling module is used for inputting the coordinates of each data point in the camera coordinate system into a third formula, filling the pixel values of the pixel points corresponding to the data points according to the obtained calculation result, and mapping the filled pixel values by using a color space to obtain a reference point cloud projection map, wherein the third formula is

    d = sqrt(x^2 + y^2 + z^2)
Or,
and the third filling module is used for acquiring the reflectivity value of each data point, filling the pixel value of the pixel point corresponding to the data point according to the reflectivity value of each data point, and mapping the filled pixel value by using a color space to obtain a reference point cloud projection diagram.
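All three filling modules share the same final step: normalize a per-point scalar (the depth z, the third-formula result, or the reflectivity I) and map it through a color space. A sketch of that shared step, where the Euclidean range sqrt(x^2 + y^2 + z^2) stands in for the third formula (an assumption, since the original formula image is not legible here):

```python
import numpy as np

def colorize(values):
    """Normalize per-point scalars to 0..255 so they can be mapped
    through a color space (e.g. an HSV or jet colormap)."""
    v = np.asarray(values, dtype=float)
    span = v.max() - v.min()
    if span == 0:
        return np.zeros(v.shape, dtype=np.uint8)
    return ((v - v.min()) / span * 255).astype(np.uint8)

cam_xyz = np.array([[1.0, 0.5, 5.0],
                    [2.0, 1.0, 10.0]])
depth_fill = colorize(cam_xyz[:, 2])                    # first module: depth z
range_fill = colorize(np.linalg.norm(cam_xyz, axis=1))  # second module (assumed range formula)
```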
In an alternative embodiment, the predetermined scene contains a predetermined number of objects.
In an alternative embodiment, the correction module 630 is further configured to:
shooting a preset scene through a camera to obtain an image to be corrected;
generating a transformed point cloud projection diagram according to the reference camera image, the image to be corrected and the reference point cloud projection diagram;
and correcting calibration parameters according to the point cloud projection image to be corrected and the transformed point cloud projection image.
In an optional embodiment, the apparatus further comprises:
and the adjusting module is used for adjusting the position of the camera, and the similarity between the image to be corrected, which is obtained by shooting a preset scene by the adjusted camera, and the reference camera image exceeds a preset threshold value.
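The patent does not name the similarity measure compared against the preset threshold; zero-mean normalized cross-correlation is one plausible, illumination-tolerant choice, sketched here under that assumption:

```python
import numpy as np

def image_similarity(img_a, img_b):
    """Zero-mean normalized cross-correlation in [-1, 1]."""
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    a = a - a.mean()                    # remove per-image brightness offset
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 1.0

img = np.array([[1.0, 2.0],
                [3.0, 4.0]])
sim_same = image_similarity(img, img)   # 1.0 for identical views
sim_flip = image_similarity(img, -img)  # -1.0 for inverted content
```

The camera position would be adjusted until the score for the new shot of the predetermined scene exceeds the preset threshold.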
In an alternative embodiment, the correction module 630 is further configured to:
registering the reference camera image and the image to be corrected to obtain a first homography matrix from the reference camera image to the image to be corrected;
and carrying out perspective transformation on the reference point cloud projection image according to the first homography matrix to obtain a transformed point cloud projection image.
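In practice the whole reference point cloud projection map would be resampled with the first homography (for instance with OpenCV's warpPerspective; the patent names only the perspective transformation, not a library). Per point, the transformation is a matrix product followed by a perspective division, shown here with a made-up translation-only homography:

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography H to N pixel coordinates (u, v)."""
    uv1 = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    mapped = uv1 @ H.T
    return mapped[:, :2] / mapped[:, 2:3]           # perspective division

# Illustrative first homography: shift every pixel by (+3, -2)
H1 = np.array([[1.0, 0.0,  3.0],
               [0.0, 1.0, -2.0],
               [0.0, 0.0,  1.0]])
warped = warp_points(H1, np.array([[10.0, 10.0]]))
```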
In an alternative embodiment, the correction module 630 is further configured to:
registering the point cloud projection image to be corrected and the transformed point cloud projection image to obtain a second homography matrix from the point cloud projection image to be corrected to the transformed point cloud projection image;
and multiplying the second homography matrix, the internal reference matrix and the external reference matrix to obtain the corrected calibration parameters.
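The multiplication step amounts to a single matrix product, P_corrected = H2 · K · [R | T]; a sketch with placeholder matrices follows (an identity H2 corresponds to an unchanged pose, so the product reduces to the original projection):

```python
import numpy as np

K = np.array([[500.0,   0.0, 320.0],   # illustrative internal reference matrix
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
Rt = np.hstack([np.eye(3), np.zeros((3, 1))])  # 3x4 external reference matrix [R | T]
H2 = np.eye(3)                                 # second homography from registration

P_corrected = H2 @ K @ Rt   # 3x4 corrected projection matrix
```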
In summary, the image-registration-based lidar and camera calibration parameter correction apparatus provided in the embodiments of the present application may, after calibration of the calibration parameters between the camera and the lidar is completed, acquire a reference camera image and a reference point cloud projection map for a predetermined scene, the reference point cloud projection map being obtained from the reference camera image and the calibration parameters; detect whether the relative pose between the camera and the lidar has changed; and, if the relative pose has changed, correct the calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection map. Because the homography matrix is obtained by combining image registration with the original calibration parameters, the original calibration parameters are fully reused and re-calibration is unnecessary; feature points are located automatically to complete the correction, so no calibration test environment needs to be arranged. This simplifies the correction process and improves calibration efficiency. In addition, because the correction combines registration of the camera image and the point cloud projection map, the error of the corrected calibration parameters is small, which improves correction accuracy.
In addition, the predetermined scene contains a predetermined number of objects, so that a sufficient number of key points can be detected with feature descriptors during image registration, improving the accuracy of the registration.
An embodiment of the present application provides a computer-readable storage medium, having at least one instruction, at least one program, a set of codes, or a set of instructions stored therein, which are loaded and executed by a processor to implement the method for image registration based lidar and camera calibration parameter correction as described above.
One embodiment of the present application provides an intelligent driving system or a robot system, which includes a processor and a memory, where the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the laser radar and camera calibration parameter correction method based on image registration.
It should be noted that when the image-registration-based lidar and camera calibration parameter correction device provided in the above embodiment corrects calibration parameters, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the device embodiment and the method embodiment of image-registration-based lidar and camera calibration parameter correction provided above belong to the same concept; their specific implementation processes are detailed in the method embodiment and are not repeated here.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description should not be taken as limiting the embodiments of the present application, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the embodiments of the present application should be included in the scope of the embodiments of the present application.

Claims (6)

1. A laser radar and camera calibration parameter correction method based on image registration is characterized by comprising the following steps:
after calibration of calibration parameters between a camera and a laser radar is completed, acquiring a reference camera image and a reference point cloud projection drawing according to a preset scene, wherein the reference point cloud projection drawing is obtained according to the reference camera image and the calibration parameters, and the calibration parameters comprise an internal parameter matrix and an external parameter matrix;
detecting whether the relative pose between the camera and the laser radar has changed;
if the relative pose has changed, correcting the calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection graph;
the method for acquiring the reference camera image and the reference point cloud projection map according to the preset scene comprises the following steps: shooting the preset scene through the camera to obtain the reference camera image; generating, by the lidar, reference point cloud data for the predetermined scene; generating the reference point cloud projection drawing according to the reference camera image, the reference point cloud data and the calibration parameters;
the correcting the calibration parameters according to the homography matrix obtained by registering the reference camera image and the reference point cloud projection drawing comprises the following steps: shooting the preset scene through the camera to obtain an image to be corrected; generating a transformed point cloud projection diagram according to the reference camera image, the image to be corrected and the reference point cloud projection diagram; correcting the calibration parameters according to the point cloud projection image to be corrected and the transformed point cloud projection image;
the generating of the transformed point cloud projection diagram according to the reference camera image, the image to be corrected and the reference point cloud projection diagram comprises: registering the reference camera image and the image to be corrected to obtain a first homography matrix from the reference camera image to the image to be corrected; performing perspective transformation on the reference point cloud projection drawing according to the first homography matrix to obtain the transformed point cloud projection drawing;
the correcting the calibration parameters according to the point cloud projection image to be corrected and the transformed point cloud projection image comprises: registering the point cloud projection image to be corrected and the transformed point cloud projection image to obtain a second homography matrix from the point cloud projection image to be corrected to the transformed point cloud projection image; and multiplying the second homography matrix, the internal reference matrix and the external reference matrix to obtain corrected calibration parameters.
2. The method of claim 1, wherein said generating the reference point cloud projection view from the reference camera image, the reference point cloud data, and the calibration parameters comprises:
inputting the reference camera image, the reference point cloud data and the calibration parameters into a first formula and a second formula to obtain the reference point cloud projection diagram;
the first formula is
Figure FDA0003038594850000021
The second formula is
Figure FDA0003038594850000022
Wherein (X Y Z) is said groupThe coordinates of a data point in the quasi-point cloud data, I is the reflectivity of the data point, (X Y Z) is the coordinates of the corresponding data point in the camera coordinate system, (u v) is the pixel coordinates of the corresponding pixel point in the pixel coordinate system,
Figure FDA0003038594850000023
is the reference matrix of the said device,
Figure FDA0003038594850000024
is the external reference matrix and z is the depth of the data point in the camera coordinate system.
3. The method of claim 2, further comprising:
filling pixel values of pixel points corresponding to the data points according to the depth of each data point, and mapping the filled pixel values by using a color space to obtain the reference point cloud projection graph; or,
inputting the coordinates of each data point in a camera coordinate system into a third formula, filling the pixel values of the pixel points corresponding to the data points according to the obtained calculation result, and mapping the filled pixel values by using a color space to obtain the reference point cloud projection graph, wherein the third formula is

    d = sqrt(x^2 + y^2 + z^2);
Or,
and acquiring a reflectivity value of each data point, filling a pixel value of a pixel point corresponding to the data point according to the reflectivity value of each data point, and mapping the filled pixel value by using a color space to obtain the reference point cloud projection diagram.
4. The method of claim 1, wherein the predetermined scene contains a predetermined number of objects.
5. The method of claim 1, further comprising:
and adjusting the position of the camera, wherein the similarity between the image to be corrected, which is obtained by shooting the preset scene by the adjusted camera, and the reference camera image exceeds a preset threshold value.
6. A lidar and camera calibration parameter correction device based on image registration is characterized in that the device comprises:
the system comprises an acquisition module, configured to acquire a reference camera image and a reference point cloud projection graph according to a predetermined scene after calibration of calibration parameters between a camera and a laser radar is completed, wherein the reference point cloud projection graph is obtained by calculation according to the reference camera image and the calibration parameters, and the calibration parameters comprise an internal parameter matrix and an external parameter matrix;
the detection module is used for detecting whether the relative attitude between the camera and the laser radar changes;
the correction module is configured to correct the calibration parameters according to a homography matrix obtained by registering the reference camera image and the reference point cloud projection graph if the relative pose changes;
the obtaining module is further configured to: shooting the preset scene through the camera to obtain the reference camera image; generating, by the lidar, reference point cloud data for the predetermined scene; generating the reference point cloud projection drawing according to the reference camera image, the reference point cloud data and the calibration parameters;
the correction module is further configured to: shooting the preset scene through the camera to obtain an image to be corrected; generating a transformed point cloud projection diagram according to the reference camera image, the image to be corrected and the reference point cloud projection diagram; correcting the calibration parameters according to the point cloud projection image to be corrected and the transformed point cloud projection image;
the correction module is further configured to: registering the reference camera image and the image to be corrected to obtain a first homography matrix from the reference camera image to the image to be corrected; performing perspective transformation on the reference point cloud projection drawing according to the first homography matrix to obtain the transformed point cloud projection drawing;
the correction module is further configured to: registering the point cloud projection image to be corrected and the transformed point cloud projection image to obtain a second homography matrix from the point cloud projection image to be corrected to the transformed point cloud projection image; and multiplying the second homography matrix, the internal reference matrix and the external reference matrix to obtain corrected calibration parameters.
CN202010936777.4A 2020-09-08 2020-09-08 Laser radar and camera calibration parameter correction method and device based on image registration Active CN112233184B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010936777.4A CN112233184B (en) 2020-09-08 2020-09-08 Laser radar and camera calibration parameter correction method and device based on image registration


Publications (2)

Publication Number Publication Date
CN112233184A CN112233184A (en) 2021-01-15
CN112233184B true CN112233184B (en) 2021-06-22

Family

ID=74116109

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010936777.4A Active CN112233184B (en) 2020-09-08 2020-09-08 Laser radar and camera calibration parameter correction method and device based on image registration

Country Status (1)

Country Link
CN (1) CN112233184B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706634B (en) * 2021-09-23 2024-02-23 福建汇川物联网技术科技股份有限公司 Visual calibration method and device, electronic equipment and storage medium
CN114152935B (en) * 2021-11-19 2023-02-03 苏州一径科技有限公司 Method, device and equipment for evaluating radar external parameter calibration precision
CN115267746B (en) * 2022-06-13 2024-06-28 广州文远知行科技有限公司 Positioning method for laser radar point cloud projection errors and related equipment

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103559791A (en) * 2013-10-31 2014-02-05 北京联合大学 Vehicle detection method fusing radar and CCD camera signals
CN109360230A (en) * 2018-11-08 2019-02-19 武汉库柏特科技有限公司 A kind of method for registering images and system based on 2D camera Yu 3D camera
CN109947097A (en) * 2019-03-06 2019-06-28 东南大学 A kind of the robot localization method and navigation application of view-based access control model and laser fusion
CN109949371A (en) * 2019-03-18 2019-06-28 北京智行者科技有限公司 A kind of scaling method for laser radar and camera data

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US9210417B2 (en) * 2013-07-17 2015-12-08 Microsoft Technology Licensing, Llc Real-time registration of a stereo depth camera array
CN108198223B (en) * 2018-01-29 2020-04-07 清华大学 Method for quickly and accurately calibrating mapping relation between laser point cloud and visual image
DE102018215320A1 (en) * 2018-09-10 2020-03-12 Robert Bosch Gmbh Calibration system and calibration method for a vehicle detection device
CN111427026B (en) * 2020-02-21 2023-03-21 深圳市镭神智能系统有限公司 Laser radar calibration method and device, storage medium and self-moving equipment


Also Published As

Publication number Publication date
CN112233184A (en) 2021-01-15

Similar Documents

Publication Publication Date Title
CN112233184B (en) Laser radar and camera calibration parameter correction method and device based on image registration
CN112270713B (en) Calibration method and device, storage medium and electronic device
CN101582165B (en) Camera array calibration algorithm based on gray level image and spatial depth data
US9787960B2 (en) Image processing apparatus, image processing system, image processing method, and computer program
WO2018196391A1 (en) Method and device for calibrating external parameters of vehicle-mounted camera
EP2194725A1 (en) Method and apparatus for correcting a depth image
CN106548489A (en) The method for registering of a kind of depth image and coloured image, three-dimensional image acquisition apparatus
KR20210116507A (en) Calibration method, positioning method, apparatus, electronic device and storage medium
CN113409397A (en) Storage tray detecting and positioning method based on RGBD camera
CN112837383A (en) Camera and laser radar recalibration method and device and computer readable storage medium
CN112270719A (en) Camera calibration method, device and system
CN113793270A (en) Aerial image geometric correction method based on unmanned aerial vehicle attitude information
CN114299156A (en) Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area
KR20230003803A (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN113870364B (en) Self-adaptive binocular camera calibration method
CN115797467A (en) Method, device and equipment for detecting calibration result of vehicle camera and storage medium
CN114119682A (en) Laser point cloud and image registration method and registration system
CN113436267A (en) Visual inertial navigation calibration method and device, computer equipment and storage medium
CN112419427A (en) Method for improving time-of-flight camera accuracy
CN115546216B (en) Tray detection method, device, equipment and storage medium
CN116912417A (en) Texture mapping method, device, equipment and storage medium based on three-dimensional reconstruction of human face
CN111563936A (en) Camera external parameter automatic calibration method and automobile data recorder
CN111738035A (en) Method, device and equipment for calculating yaw angle of vehicle
CN112734857B (en) Calibration method for camera internal reference and camera relative laser radar external reference and electronic equipment
CN113740816B (en) Distance measurement error correction method and device for camera module

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240229

Address after: No. 001, Xinqiao Foreign Capital Industrial Park, danbei Town, Danyang City, Zhenjiang City, Jiangsu Province

Patentee after: CHANGCHENG AUTOMOBILE JIANGSU Co.,Ltd.

Country or region after: China

Address before: 211100 No. 2 Southeast University Road, Jiangning District, Nanjing, Jiangsu

Patentee before: SOUTHEAST University

Country or region before: China

TR01 Transfer of patent right

Effective date of registration: 20240507

Address after: No. 001, Xinqiao Foreign Capital Industrial Park, danbei Town, Danyang City, Zhenjiang City, Jiangsu Province

Patentee after: CHANGCHENG AUTOMOBILE JIANGSU Co.,Ltd.

Country or region after: China

Patentee after: Changcheng Automobile Industry (Changzhou) Co.,Ltd.

Address before: No. 001, Xinqiao Foreign Capital Industrial Park, danbei Town, Danyang City, Zhenjiang City, Jiangsu Province

Patentee before: CHANGCHENG AUTOMOBILE JIANGSU Co.,Ltd.

Country or region before: China
