CN117572392A - Camera-laser radar joint calibration method and device based on 3D-3D matching point pair - Google Patents
- Publication number
- CN117572392A (application number CN202310840622.4A)
- Authority
- CN
- China
- Prior art keywords
- camera
- reference mark
- square reference
- radar
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
Abstract
The invention provides a camera-laser radar joint calibration method and device based on 3D-3D matching point pairs, comprising the following steps: acquiring two calibration plates of specific shape and size, each provided with a binary square reference mark used for camera pose estimation; identifying the square reference marks and their corner points to obtain the transformation matrix between the square reference mark center and the camera optical center, thereby obtaining the coordinates of the square reference mark corner points in the camera coordinate system; acquiring the initial point cloud of the square reference mark through the radar, and filtering it by radar reflection intensity to obtain the square reference mark edge point cloud; obtaining the coordinates of the square reference mark corner points in the radar coordinate system from the edge point cloud; and obtaining the rotation translation matrix between the two coordinate systems from the corner coordinates of the reference marks in both coordinate systems. The invention solves problems of general calibration methods such as susceptibility to noise interference and poor robustness.
Description
Technical Field
The invention relates to the fields of sensor space synchronization and robot perception, in particular to a camera-laser radar combined calibration method and device based on a 3D-3D matching point pair.
Background
Calibration technology is the basis of the multi-sensor information fusion widely adopted in the current robot and unmanned-vehicle fields. Multi-sensor information fusion uses several sensors to provide redundant information, reducing measurement error and improving the real-time performance, robustness and accuracy of automatic or remote control. Camera and laser radar information fusion is one kind of multi-sensor information fusion; camera-laser radar joint calibration fuses camera and laser radar data to obtain the external parameters between the camera and the laser radar.
The main purpose of camera-laser radar joint calibration is to compute the external parameters between the camera and the laser radar, namely the rotation translation matrix between the two sensor coordinate systems. In real-time scenarios such as autonomous driving, high-precision calibration is required to minimize extrinsic error, since an erroneous extrinsic calibration produces erroneous fusion data, which can be fatal both to the vehicle itself and to surrounding vehicles or pedestrians. Efficient, high-precision camera-laser radar joint calibration, and the assessment of calibration quality, remain active research topics in complex environments such as indoor and outdoor scenes with strong wind, strong light and other unknown factors.
At present, aiming at the problem of joint calibration of a high-precision camera and a laser radar, the existing method comprises the following steps:
[1] The literature (Dhall, Ankit, et al. "LiDAR-camera calibration using 3D-3D point correspondences." arXiv preprint arXiv:1705.09785 (2017)) proposes a 3D pairing method for a laser radar and a vision camera, which innovatively introduces a method of obtaining the 3D pose of the vision camera relative to the calibration plate through ArUco codes. However, the back-end processing does not account for spurious points generated by sensor noise and other interference, so the calibration result has poor reliability and low accuracy. The method also provides no effective way to evaluate the calibration result.
[2] The literature (Martin Velas, Michal Spanel, Zdenek Materna, Adam Herout. Calibration of RGB Camera With Velodyne LiDAR. https://www.github.com/robofit/but_velodyne/) proposes a laser radar and vision camera external parameter calibration method using only one frame of data. It has high accuracy but poor generalization, shown by its high requirement on input data density: it is only applicable to 64-line laser radars with dense laser point clouds, not to the widely used 32-line and 16-line laser radars.
As can be seen from the related prior art, the existing methods for high-precision camera-laser radar joint calibration have the following problems: 1. spurious points generated by sensor noise and other interference are not considered, the reliability and accuracy of the calibration result are poor, and the accuracy of the calibration result cannot be effectively evaluated; 2. poor generalization.
Disclosure of Invention
In view of the above, the invention provides a camera-laser radar joint calibration method and device based on 3D-3D matching point pairs, which solve problems of conventional camera-laser radar joint calibration such as susceptibility to noise interference, poor robustness, long calibration time, low accuracy and complex manual operation.
In order to achieve the above object, the following solutions have been proposed:
a camera-laser radar joint calibration method based on a 3D-3D matching point pair comprises the following steps:
acquiring two calibration plates with specific shapes and sizes, wherein each calibration plate is provided with a binary square reference mark for estimating the posture of the camera;
identifying binary square reference marks and corner points thereof on a calibration plate, and obtaining a transformation matrix of a square reference mark center and a camera optical center based on the binary square reference marks and the corner points thereof to obtain coordinate information of the square reference mark corner points under a camera coordinate system;
acquiring the square reference mark initial point cloud information through a radar, and filtering radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information;
acquiring coordinate information of a square reference mark corner point under a radar coordinate system based on the square reference mark edge point cloud information;
and obtaining a rotation translation matrix between the camera coordinate system and the radar coordinate system based on the coordinate information of the reference mark corner points under the camera coordinate system and the radar coordinate system.
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair provided by the invention, the binary square reference mark used for camera pose estimation on the calibration plate is an ArUco mark. Identifying the binary square reference marks and their corner points on the calibration plate, obtaining the transformation matrix between the square reference mark center and the camera optical center based on them, and obtaining the coordinates of the square reference mark corner points in the camera coordinate system comprises the following steps:
designating a dictionary to store size information and coding information of the ArUco mark, and creating the ArUco mark;
identifying an ArUco-marked corner list and ArUco-marked IDs by a camera;
extracting the edge marked by ArUco to obtain a transformation matrix of a coordinate system taking the center marked by ArUco as an origin and a coordinate system taking the optical center of a camera as the origin;
and obtaining coordinate information of a reference mark, namely an ArUco mark corner point, under the camera coordinate system based on the transformation matrix.
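As a non-limiting illustration of the last step (not part of the claims; the helper name and array conventions are assumptions), once the [R|t] transform of the marker-center frame relative to the camera optical center is known, the corner coordinates in the camera frame follow by applying [R|t] to the four corner offsets of a marker of known side length:

```python
import numpy as np

def marker_corners_in_camera(R, t, side):
    """Given rotation R (3x3) and translation t (3,) of the marker-center
    frame in the camera frame, and the marker side length, return the four
    marker corner coordinates in the camera coordinate system."""
    h = side / 2.0
    # Corner offsets in the marker-centered frame (marker lies in z = 0).
    local = np.array([[-h,  h, 0.0],
                      [ h,  h, 0.0],
                      [ h, -h, 0.0],
                      [-h, -h, 0.0]])
    return (R @ local.T).T + t  # apply [R|t] to each corner
```

For a 0.6 m marker facing the camera at 1 m, the corners land at x, y = ±0.3 m in the camera frame.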
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair, the method for obtaining the square reference mark initial point cloud information through the radar, filtering the radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information comprises the following steps:
filtering the square reference mark initial point cloud information acquired by the radar according to reflection intensity, and judging a point to lie on a square reference mark edge when the reflection intensity changes obviously, so as to obtain the filtered square reference mark edge point cloud data set C_filtered.
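A minimal sketch of this intensity-based edge filtering, assuming the point cloud is given as NumPy arrays ordered along each scan line; the jump threshold and function name are illustrative, not values from the patent:

```python
import numpy as np

def intensity_edge_filter(points, intensities, jump=30.0):
    """Keep points where the reflection intensity changes sharply between
    consecutive returns along a scan line; such jumps mark the transition
    at the reference mark edges."""
    d = np.abs(np.diff(intensities))
    edge_idx = np.nonzero(d > jump)[0]   # index of the point before each jump
    keep = np.unique(np.concatenate([edge_idx, edge_idx + 1]))
    return points[keep]
```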
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair, which is provided by the invention, the coordinate information of the square reference mark corner point under the radar coordinate system is obtained based on the square reference mark edge point cloud information, and the method comprises the following steps:
removal of C filtered Overlapping point clouds at the middle position to obtain a point cloud data set C with the overlapping point clouds removed delet ;
The RANSAC plane fitting method is adopted to remove the point cloud data set C after overlapping point clouds delet Denoising to obtain a denoised point cloud data set C RANSAC ;
The denoising processed point cloud data set C RANSAC By projection matrix Mat proj Projecting onto a camera 2D plane;
marking line segments on a 2D plane, selecting radar edge points, marking the line segments by drawing quadrangles around each line segment, and obtaining a point cloud data set C near the square reference mark edge selected ;
Based on point cloud data set C selected Obtaining radar coordinates by adopting RANSAC line fittingCoordinate information of the corner points of the square fiducial marks is tied.
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair provided by the invention, obtaining the rotation translation matrix between the camera and the radar from the coordinates of the reference mark corner points in the camera coordinate system and the radar coordinate system comprises the following steps:
selecting minimum sample points from the reference mark corner points in the camera coordinate system and the radar coordinate system respectively, and obtaining a minimum point set S_m of the minimum sample points by random selection;

estimating a model of the minimum point set S_m using the iterative closest point method;

extending the model to the total point set, and counting the points that fit the model within a preset threshold, namely the inliers;

when the actual proportion of inliers to total input points is larger than a set proportionality constant, comparing the number of runs with the run upper limit; when the upper limit k is reached, averaging the stored run results of the minimum sample point sets in the two coordinate systems and outputting the proportion of inliers to total input points, namely the inlier rate;

obtaining a preliminary rotation translation matrix Rt_0 between the camera and the radar system based on the average of the run results of the minimum sample point sets in the two coordinate systems.
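The model estimation on a minimum point set of matched 3D-3D corners can be illustrated by the closed-form rigid-transform step used inside iterative closest point. The sketch below is a generic Kabsch/SVD solution under assumed array conventions (rows are points), not the patent's exact implementation:

```python
import numpy as np

def rigid_transform_3d(src, dst):
    """Closed-form least-squares estimate of R, t with dst_i ~ R @ src_i + t
    (Kabsch via SVD) -- the core model-estimation step on a minimum set of
    matched 3D-3D corner points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                         # guard against reflections
    t = cd - R @ cs
    return R, t
```

Given exact correspondences the recovered R and t reproduce the ground-truth transform; with noisy corners the result is the least-squares best fit, which is then refined over the full point set.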
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair provided by the invention, after the coordinates of the reference marks, namely the ArUco mark corner points, in the camera coordinate system are obtained, the method further comprises: performing pose estimation on the identified ArUco mark and computing the translation, i.e. the last column of the transformation matrix, so as to judge whether a large error exists with respect to the true distance between the square reference mark center and the camera optical center.
According to the camera-laser radar combined calibration method based on the 3D-3D matching point pair, the accuracy of a radar coordinate system is judged by comparing the edge length of the square reference mark identified by the radar with the actual length of the square reference mark.
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair provided by the invention, marking line segments on the 2D plane to select the radar edge points, drawing a quadrangle around each line segment, and obtaining the point cloud data set C_selected near the square reference mark edges comprises:
after obtaining the point cloud data set C_RANSAC, frame-selecting the eight edges of the square reference marks, the four edges of each square reference mark being frame-selected clockwise starting from the upper left edge;

when the error of an edge is larger than a preset threshold, the accuracy of the 3D pose is regarded as not reaching the standard and the frame selection is performed again; when the error of every edge is smaller than the preset threshold, the accuracy of the 3D pose is regarded as reaching the standard, and the point cloud data set C_selected containing the square reference mark edges is obtained.
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair, the method further comprises the following steps of:
fitting with LO-RANSAC (Locally Optimized Random Sample Consensus) to remove irrelevant outliers, and obtaining the rotation translation matrix Rt_1 between the camera coordinate system and the radar coordinate system after outlier removal, together with the calibration error value.
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair provided by the invention, after the rotation translation matrix between the camera coordinate system and the radar coordinate system and the calibration error value are obtained with irrelevant outliers removed, the accuracy of the calibration result is judged through reprojection.
According to the camera-laser radar combined calibration method based on the 3D-3D matching point pair, which is provided by the invention, the accuracy of the calibration result is judged through reprojection, and the method comprises the following steps:
using the obtained rotation translation matrix Rt_1 as the projection matrix, projecting the 3D point cloud obtained by the radar at the current moment onto the camera image at the current moment;

if ghosting or another poor reprojection effect is found in the projected camera image, fine-tuning the values in the rotation translation matrix Rt_0.
The invention also provides a camera-laser radar combined calibration device based on the 3D-3D matching point pair, which comprises the following components:
the reference mark acquisition module is used for acquiring binary square reference marks and angular points thereof, which are used for estimating the posture of the camera, on the calibration plate;
the filtering module is used for filtering the reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information;
and the coordinate information acquisition module is used for acquiring the coordinate information of the specific point of the square reference mark based on the square reference mark edge point cloud information.
The invention also provides electronic equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the camera-laser radar joint calibration method based on the 3D-3D matching point pair when executing the program.
The invention also provides a non-transitory computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements a camera-lidar joint calibration method based on a 3D-3D matching point pair as described in any one of the above.
According to the camera-laser radar joint calibration method based on the 3D-3D matching point pair provided by the invention, the camera system and the radar system are matched as 3D-3D point pairs, and the joint calibration of the camera and the laser radar is completed with the associated point cloud filtering methods. Compared with 2D-3D calibration methods, the 3D-3D point pair matching method reduces the noise caused by manually marking 2D pixels or executing a PnP (Perspective-n-Point) algorithm, gives a more accurate calibration result, and is better suited to the calibration of low-density laser radars. For the three main steps of solving the calibration matrix, reprojecting and manually fine-tuning the matrix, the invention designs a visual start-up program developed on QT-ROS and adds a one-key automatic calibration function, reducing operational complexity while accelerating calibration. To adapt to radar point clouds with different degrees of noise in different scenes, point cloud filtering methods such as plane filtering, duplicate point removal and LO-RANSAC outlier removal are added, giving better robustness and accuracy in practical tests. Finally, the accuracy of the method is further improved by averaging the results obtained over multiple radar scans.
Drawings
In order to more clearly illustrate the invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a camera-laser radar joint calibration method based on a 3D-3D matching point pair;
FIG. 2 is a schematic implementation diagram of the camera-lidar joint calibration method based on 3D-3D matching point pairs;
FIG. 3 is a schematic diagram of a camera-lidar combined calibration device based on a 3D-3D matching point pair;
fig. 4 is a schematic structural diagram of an electronic device provided by the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Specific embodiments of the present invention are described below in conjunction with fig. 1-4.
Fig. 1 is a flowchart of a camera-lidar joint calibration method based on a 3D-3D matching point pair, where as shown in fig. 1, the camera-lidar joint calibration method based on the 3D-3D matching point pair includes:
s100, acquiring two calibration plates with specific shapes and sizes, wherein each calibration plate is provided with a binary square reference mark for estimating the posture of a camera;
s110, identifying a binary square reference mark and a corner point thereof on the calibration plate, and obtaining a transformation matrix of a square reference mark center and a camera optical center based on the binary square reference mark and the corner point thereof to obtain coordinate information of a specific point of the calibration plate under a camera coordinate system;
s120, acquiring the square reference mark initial point cloud information through a radar, and filtering radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information;
s130, obtaining coordinate information of a square reference mark corner point under a radar coordinate system based on square reference mark edge point cloud information;
and S140, obtaining a rotation translation matrix between the camera coordinate system and the radar coordinate system based on the coordinate information of the reference mark corner points below the camera coordinate system and the radar coordinate system.
The following describes the above steps in detail:
s100, acquiring two calibration plates with specific shapes and sizes, wherein each calibration plate is provided with a binary square reference mark for estimating the posture of a camera;
specifically, in the embodiment of the invention, a binary square reference mark for estimating the posture of the camera is an ArUco mark, and in the embodiment of the invention, two square calibration plates with 60cm x 60cm are adopted, and calibration paper with ArUco codes is adhered on the calibration plates. The size of the binary square reference mark can be equal to or smaller than that of the calibration plate, and the difference is that when the binary square reference mark is smaller than that of the calibration plate, the edge of the binary square reference mark is not completely coincident with the edge of the calibration plate, and the edges of the binary square reference mark identified by the camera and the laser radar are not the edges of the calibration plate; when the binary square reference mark is equal to the calibration plate, the edge of the binary square reference mark coincides with the edge of the calibration plate, and the edges of the calibration plate identified by the camera and the laser radar are also the edges of the binary square reference mark. For ease of description, the embodiments of the present invention are described with reference to binary square fiducial marks that are equal in size to the calibration plate. For facilitating laser radar scanning, two calibration plates are placed in an inclined mode during calibration, the inclination angle between one edge of each calibration plate and the ground plane is about 45 degrees, the placement distance of each calibration plate is judged according to radar performance, four rings are guaranteed to be arranged on the calibration plate, and meanwhile the two calibration plates are required to be located in the areas on two sides of a camera visual field.
S110, identifying binary square reference marks and corner points thereof on a calibration plate, and obtaining a transformation matrix of a square reference mark center and a camera optical center based on the binary square reference marks and the corner points thereof to obtain coordinate information of the square reference mark corner points under a camera coordinate system;
specifically, an 'Extrinsic' option in a visual panel is started, arUco codes and corner points thereof on a calibration plate are identified through OpenCV, edges of the ArUco codes are extracted, and a transformation [ R|t ] matrix of the center of a code disc and the optical center of a camera is given. The method comprises the following specific steps:
step one: the dictionary is specified to store the information such as the size and the code of the ArUco code, thereby creating an ArUco mark of a selected dictionary, and the ArUco mark can be drawn after the dictionary is determined.
Step two: the ArUco-marked corner list and ArUco-marked IDs were detected by a camera, with each ArUco-marked, its four corners all returning in their original order starting from the upper right corner and rotating clockwise.
Step three: and performing pose estimation on the identified ArUco mark, visually identifying, generating a camera coordinate system to judge whether the calibration plate is stable, outputting a coordinate system with the ArUco mark center as an origin and a transformation matrix [ R|t ] matrix of the coordinate system with the camera optical center as the origin, and judging whether a large error exists in the real distance between the ArUco mark center and the camera optical center by calculating the translation of the last column of the [ R|t ] matrix.
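The distance check of step three can be sketched as follows, assuming a 3x4 [R|t] matrix and a hand-measured camera-to-marker distance (function name and tolerance are illustrative):

```python
import numpy as np

def check_marker_distance(Rt, measured_dist, tol=0.05):
    """Rt is the 3x4 [R|t] matrix of the marker-center frame in the camera
    frame; its last column is the translation.  Compare its norm with a
    tape-measured camera-to-marker distance to flag gross pose errors."""
    est = np.linalg.norm(Rt[:, 3])
    return abs(est - measured_dist) <= tol, est
```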
Through the steps, the coordinates of the reference mark corner points under the camera coordinate system can be obtained.
S120, acquiring the square reference mark initial point cloud information through a radar, and filtering radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information;
specifically, in this embodiment, the calibration plate edge is detected according to the radar reflection variation intensity filtering, that is, the obtained radar initial point cloud is filtered according to the reflection intensity thereof, so as to obtain the filtered calibration plate edge point cloud data set C filtered Namely, when the reflection intensity of the light source changes obviously, the light source is judged to be the edge of the calibration plate.
S130, obtaining coordinate information of a square reference mark corner point under a radar coordinate system based on square reference mark edge point cloud information;
specifically, a calibration plate edge point cloud data set C is obtained filtered And then, carrying out RANSAC (Random Sample Consensus, random sampling coincidence algorithm) line fitting on the frame selection boundary to obtain 3D information of the edge of the calibration plate, adopting a plane RANSAC to remove the impurity points which do not belong to the calibration plate, and comparing the displayed edge length of the plate with the actual length of the plate to judge the accuracy of the radar 3D pose, wherein the method comprises the following specific steps:
step one: passing the filtered 3D Lei Dadian cloud through a projection matrix Mat proj Projection onto a camera 2D plane, i.e. 3D radar point cloud coordinate matrix poiht= [ x, y, z]And converting the Pixel coordinate matrix pixel= (u, v) into a 2D Pixel coordinate matrix, and conveniently visually selecting the boundary of the calibration plate. The transformation relationship in the 3D coordinate space is as follows:
Pixel=Mat proj *Point
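A sketch of this projection, assuming Mat_proj is a 3x4 matrix applied to homogeneous radar points followed by the perspective division (the intrinsic values used in the usage note are made up for illustration):

```python
import numpy as np

def project_points(Mat_proj, points):
    """Apply Pixel = Mat_proj * Point for a 3x4 projection matrix and an
    (N, 3) array of radar points, returning (N, 2) pixel coordinates."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # homogeneous
    uvw = (Mat_proj @ pts_h.T).T
    return uvw[:, :2] / uvw[:, 2:3]                         # divide by depth
```

For example, with focal length 500 and principal point (320, 240), a point on the optical axis at 2 m projects to the image center (320, 240).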
step two: marking line segments in the radar 3D point cloud, and selecting radar edge points. ROS nodes allow for manual marking of line segments by drawing polygons around each line segment. The method comprises the following specific steps:
after a complete image is obtained, the eight edges of the two calibration plates are frame-selected with mouse clicks and key presses, the four edges of each calibration plate being frame-selected clockwise starting from the upper left edge;
in the embodiment of the invention, the true length of the calibration plate is known to be 60cm, and the maximum error is set to be 0.03cm. Each frame is covered with a plate, a visual window appears on the interface, at this time, the numerical value on each side is observed, if the numerical value is larger than the range of 0.6+/-0.03 cm, and the accuracy of the 3D pose is not up to standard, the frame selection is invalid, the frame selection is repeated until the frame selection is qualified, and finally, a point cloud data set C which comprises two calibration plates and is near eight sides can be obtained selected 。
Step three: to ensure the accuracy of the following RANSAC boundary fitting and to improve the iteration rate, overlapping point clouds in the initially selected point cloud set are removed. The distance dist between each point of the point cloud data set C_selected and every other point is computed; if dist ≤ 0.000001, the two points are considered duplicates and one of them is deleted. Finally, the point cloud data set C_delet with duplicate points deleted is obtained.
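The duplicate-removal test above can be sketched as follows (a quadratic-time illustration; the default eps mirrors the 0.000001 figure from the embodiment):

```python
import numpy as np

def remove_duplicates(points, eps=1e-6):
    """Drop points whose distance to an already-kept point is <= eps,
    mirroring the pairwise-distance test used to build C_delet."""
    kept = []
    for p in points:
        if all(np.linalg.norm(p - q) > eps for q in kept):
            kept.append(p)
    return np.asarray(kept)
```

For large clouds a KD-tree or voxel hash would replace the pairwise loop, but the acceptance test is the same.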
Step four: considering that radar points passing beyond the calibration plate and striking objects behind it interfere with the boundary selection, after the point cloud data selected in the previous step are acquired, the point cloud is denoised with a RANSAC plane fitting method to remove radar points that do not belong to the calibration plate. The specific steps are as follows:
From the point cloud data set C_delet, 3 points are randomly selected each time and the corresponding plane equation ax + by + cz + d = 0 is computed. The Euclidean distance d_i of each point of C_delet to the plane is then calculated; if d_i is less than the set threshold d_threshold, i.e. d_i < d_threshold, the point is considered an inlier, otherwise an outlier. In this example the threshold is set to 0.015. These steps are repeated, the number of inliers is recorded each time, and a judgment factor computed after each iteration determines whether the iteration ends. After the iteration ends, the coefficients of the plane with the largest number of inliers are the final model parameters. The outliers of this plane are removed, and the set C_RANSAC consisting of the in-plane points is used as the point cloud for RANSAC line fitting.
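A minimal sketch of the RANSAC plane fit described above (the function name and the fixed iteration count are assumptions; the patent instead uses a judgment factor to decide when iteration ends):

```python
import numpy as np

def ransac_plane(points, d_threshold=0.015, iterations=200, seed=0):
    """RANSAC plane fit: returns (a, b, c, d) of ax+by+cz+d=0 and an inlier mask.

    Mirrors step four: sample 3 points, build the plane, count points whose
    point-to-plane distance is below d_threshold, and keep the best model.
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(iterations):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                      # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p0
        dist = np.abs(points @ normal + d)    # Euclidean point-to-plane distance
        inliers = dist < d_threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (*normal, d)
    return best_model, best_inliers
```

Points outside the winning plane (e.g. hits on objects behind the plate) are then discarded, leaving C_RANSAC.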
Step five: RANSAC line fitting is applied to obtain the four box-selected boundaries, and the intersection point between each pair of adjacent lines is computed to obtain the four corner points of the calibration plate.
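Since two RANSAC-fitted 3D boundary lines rarely intersect exactly, each corner of step five can be estimated as the midpoint of the mutually closest points on the two lines; a sketch (function and parameter names are illustrative):

```python
import numpy as np

def line_intersection_3d(p1, d1, p2, d2):
    """Approximate intersection of two fitted 3D boundary lines.

    Each line is given as (point, direction) from a RANSAC line fit.
    Returns the midpoint of the two mutually closest points as the
    corner estimate.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                     # zero only if lines are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```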
Through the steps above, the coordinates of the calibration plate corner points, namely the square reference mark corner points, under the radar coordinate system are obtained.
S140, obtaining a rotation translation matrix between the camera coordinate system and the radar coordinate system based on the coordinate information of the reference mark corner points under the camera coordinate system and under the radar coordinate system.
After the [R|t] matrices of the matched point pairs between the camera system and the calibration plate system (hereinafter the camera group) and between the radar system and the calibration plate system (hereinafter the radar group) are obtained, LO-RANSAC (Locally Optimized Random Sample Consensus) fitting is added to remove irrelevant outliers, yielding the rotation translation matrix [R|t] between the radar and the camera together with the calibration error value. The specific steps are as follows:
Step one: the minimum number of sample points is selected from the total point set of calibration plate corner points, namely the square reference mark corner points, and a minimum point set S_m of that size is obtained by a random selection method;
Step two: the rigid-body model fitting the minimum point set S_m is estimated using ICP (Iterative Closest Point). The matching objective of the ICP method is as follows:
E(R, t) = Σ_i ||q_i − (R·p_i + t)||²
wherein q_i is the coordinate information of the i-th point of the minimum point set under the camera coordinate system, p_i is the coordinate information of the i-th point under the laser radar coordinate system, R is the rotation matrix, and t is the translation vector;
Step three: the model is extended to the total point set, and the number of points that fit the model within a preset threshold, namely the number of inliers, is counted;
Step four: the actual proportion of inliers to the total number of points is compared with the set proportionality constant to judge whether the model is good;
In this embodiment, whether the model is good is judged as follows: if the proportion of inliers is larger than the preset proportion, accurate inliers and an accurate model are obtained through a local optimization algorithm, i.e. the current inlier set is taken as the input point set, steps one to three are repeated a designated number of times, and the upper limit k of the number of runs is updated at the same time for optimization; if the conditions are not satisfied, steps one to three are simply repeated. The update rule of the run-count upper limit k is as follows:
k = log(1 − p) / log(1 − w^m)
wherein p is the preset confidence, w is the proportion of inliers, and m is the minimum sample size;
Step five: the number of runs is compared with the run-count upper limit; when the upper limit k is reached, the stored run results are averaged and the proportion of inliers among the elements of the input point set, namely the inlier rate, is output;
Through the steps above, a preliminary rotation translation matrix Rt_0 between the camera coordinate system and the radar coordinate system after automatic calibration is obtained.
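The model each iteration estimates for its minimal 3D-3D sample has a closed-form solution (the Kabsch/SVD method); a sketch under the notation above, with illustrative names (the patent itself only names ICP, so treating each sample as already matched is an assumption):

```python
import numpy as np

def fit_rigid_transform(p, q):
    """Closed-form [R|t] between matched 3D-3D point sets (Kabsch/SVD).

    p holds radar-frame corner points, q the corresponding camera-frame
    points; solves argmin sum ||q_i - (R p_i + t)||^2.
    """
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    H = (p - pc).T @ (q - qc)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = qc - R @ pc
    return R, t
```

LO-RANSAC then scores this [R|t] on the full corner set and locally re-optimizes on the inliers, producing Rt_0.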
In the embodiment of the invention, after the preliminary rotation translation matrix Rt_0 between the camera coordinate system and the radar coordinate system is obtained by automatic calibration, the accuracy of the calibration result is judged through re-projection; if a certain deviation exists, the Rt_0 matrix can be adjusted by fine tuning. The specific steps are as follows:
Click the "ExRT Transport" option of the visualization panel, take the automatically calibrated rotation translation matrix Rt_0 as the projection matrix, project the 3D point cloud obtained by the radar at the current moment onto the camera image at the current moment, and observe and judge the re-projection effect;
If the re-projection effect is found to be poor, the values in the rotation translation matrix Rt_0 can be fine-tuned. For fine tuning, click the "Project Test" option of the visualization panel; a progress-bar drag-type parameter tuning interface appears. The adjustable parameter matrix is as follows:
param = [x, y, z, roll, pitch, yaw]^T
wherein, for the 2D point cloud in the camera view, x represents a left-right translation, y an up-down translation, z the depth position, and roll, pitch and yaw the rotations about the three axes.
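A sketch of how the parameter vector could be composed into a 4x4 transform for re-projection (the ZYX rotation order and radian units are assumptions; the patent does not state the convention):

```python
import numpy as np

def params_to_matrix(x, y, z, roll, pitch, yaw):
    """Compose the adjustable parameter vector into a 4x4 [R|t] matrix.

    Angles in radians; rotation applied in ZYX (yaw-pitch-roll) order.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx                  # combined rotation
    T[:3, 3] = [x, y, z]                      # translation
    return T
```

Dragging a slider would update one entry of param, rebuild the matrix, and re-project the point cloud for visual inspection.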
Through fine tuning, the final rotation translation matrix Rt_1 between the camera and the radar system is obtained.
The camera-laser radar joint calibration device based on the 3D-3D matching point pair provided by the invention is described below; the device described below and the camera-laser radar joint calibration method based on the 3D-3D matching point pair described above may be referred to correspondingly with each other.
Fig. 3 is a schematic structural diagram of a camera-lidar combined calibration device based on a 3D-3D matching point pair, where, as shown in fig. 3, the camera-lidar combined calibration device based on a 3D-3D matching point pair provided by the invention includes:
a reference mark acquisition module 310, configured to acquire a binary square reference mark and its angular point for camera pose estimation on a calibration board;
the filtering module 320 is configured to filter the reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information;
the coordinate information obtaining module 330 is configured to obtain coordinate information of a specific point of the square reference mark based on the square reference mark edge point cloud information.
Fig. 4 illustrates a physical schematic diagram of an electronic device, as shown in fig. 4, which may include: processor 410, communication interface (Communications Interface) 420, memory 430 and communication bus 440, wherein processor 410, communication interface 420 and memory 430 communicate with each other via communication bus 440. The processor 410 may invoke logic instructions in the memory 430 to perform a camera-lidar joint calibration method based on 3D-3D matching point pairs, the method comprising: acquiring two calibration plates with specific shapes and sizes, wherein each calibration plate is provided with a binary square reference mark for estimating the posture of the camera; identifying binary square reference marks and corner points thereof on a calibration plate, and obtaining a transformation matrix of a square reference mark center and a camera optical center based on the binary square reference marks and the corner points thereof to obtain coordinate information of the square reference mark corner points under a camera coordinate system; acquiring the square reference mark initial point cloud information through a radar, and filtering radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information; acquiring coordinate information of a square reference mark corner point under a radar coordinate system based on the square reference mark edge point cloud information; and obtaining a rotation translation matrix between the camera coordinate system and the radar coordinate system based on the coordinate information of the reference mark corner points under the camera coordinate system and the radar coordinate system.
Further, the logic instructions in the memory 430 described above may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In still another aspect, the present invention further provides a non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor, is implemented to perform the method for joint calibration of a camera and a lidar based on a 3D-3D matching point pair provided by the above methods, the method comprising: acquiring two calibration plates with specific shapes and sizes, wherein each calibration plate is provided with a binary square reference mark for estimating the posture of the camera; identifying binary square reference marks and corner points thereof on a calibration plate, and obtaining a transformation matrix of a square reference mark center and a camera optical center based on the binary square reference marks and the corner points thereof to obtain coordinate information of the square reference mark corner points under a camera coordinate system; acquiring the square reference mark initial point cloud information through a radar, and filtering radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information; acquiring coordinate information of a square reference mark corner point under a radar coordinate system based on the square reference mark edge point cloud information; and obtaining a rotation translation matrix between the camera coordinate system and the radar coordinate system based on the coordinate information of the reference mark corner points under the camera coordinate system and the radar coordinate system.
The invention aims to provide a camera-laser radar joint calibration method, device, electronic equipment and storage medium based on 3D-3D matching point pairs, to solve the problems of susceptibility to noise interference, poor robustness, long calibration time, low accuracy and complex manual operation in general calibration methods. The joint calibration of the camera and the laser radar is completed by matching the 3D-3D point pairs of the camera system with the calibration plate system and of the radar system with the calibration plate system, with relevant point cloud filtering methods added. Compared with a 2D-3D calibration method, 3D-3D point pair matching reduces the noise caused by manually marking 2D pixels or executing a PnP (Perspective-n-Point) algorithm, so that the calibration result is more accurate and better suited to the calibration of low-density laser radars. For the three main steps of solving the calibration matrix, re-projection, and manual fine tuning of the matrix, the invention designs a visual launcher developed with QT-ROS and adds a one-key automatic calibration function, which reduces operational complexity while accelerating calibration. To adapt to radar point clouds with different degrees of noise in different scenes, the invention adds point cloud filtering methods such as plane filtering, duplicate point removal and LO-RANSAC outlier removal, which showed good robustness and accuracy in practical tests. Finally, the accuracy of the method is further improved by averaging the results obtained from multiple radar scans.
The apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the invention without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (14)
1. A camera-laser radar joint calibration method based on a 3D-3D matching point pair is characterized by comprising the following steps:
acquiring two calibration plates with specific shapes and sizes, wherein each calibration plate is provided with a binary square reference mark for estimating the posture of the camera;
identifying binary square reference marks and corner points thereof on a calibration plate, and obtaining a transformation matrix of a square reference mark center and a camera optical center based on the binary square reference marks and the corner points thereof to obtain coordinate information of the square reference mark corner points under a camera coordinate system;
acquiring the square reference mark initial point cloud information through a radar, and filtering radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information;
acquiring coordinate information of a square reference mark corner point under a radar coordinate system based on the square reference mark edge point cloud information;
and obtaining a rotation translation matrix between the camera coordinate system and the radar coordinate system based on the coordinate information of the reference mark corner points under the camera coordinate system and the radar coordinate system.
2. The camera-laser radar joint calibration method based on the 3D-3D matching point pair according to claim 1, wherein the binary square reference mark used for camera attitude estimation on the calibration board is an ArUco mark, the binary square reference mark and its angular point on the calibration board are identified, a transformation matrix of a square reference mark center and a camera optical center is obtained based on the binary square reference mark and its angular point, and coordinate information of a square reference mark angular point under a camera coordinate system is obtained, and the method comprises the following steps:
designating a dictionary to store size information and coding information of the ArUco mark, and creating the ArUco mark;
identifying an ArUco-marked corner list and ArUco-marked IDs by a camera;
extracting the edge marked by ArUco to obtain a transformation matrix of a coordinate system taking the center marked by ArUco as an origin and a coordinate system taking the optical center of a camera as the origin;
and obtaining coordinate information of a reference mark, namely an ArUco mark corner point, under the camera coordinate system based on the transformation matrix.
3. The camera-laser radar joint calibration method based on the 3D-3D matching point pair according to claim 1, further characterized in that the obtaining the square reference mark initial point cloud information by the radar, filtering radar reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information, includes:
filtering the square reference mark initial point cloud information acquired by the radar according to the reflection intensity, and judging positions where the reflection intensity changes significantly to be square reference mark edges, thereby obtaining a filtered square reference mark edge point cloud data set C_filtered.
4. The camera-laser radar joint calibration method based on the 3D-3D matching point pair of claim 3, wherein the obtaining coordinate information of the square reference mark corner point under the radar coordinate system based on the square reference mark edge point cloud information comprises:
removing overlapping point clouds at the same position in C_filtered to obtain a point cloud data set C_delet with overlapping point clouds removed;
denoising the point cloud data set C_delet with overlapping point clouds removed by a RANSAC plane fitting method to obtain a denoised point cloud data set C_RANSAC;
projecting the denoised point cloud data set C_RANSAC onto the camera 2D plane through a projection matrix Mat_proj;
marking line segments on the 2D plane and selecting radar edge points, the line segments being marked by drawing quadrangles around each segment, to obtain a point cloud data set C_selected near the square reference mark edges;
obtaining, based on the point cloud data set C_selected, the coordinate information of the reference mark corner points under the radar coordinate system by RANSAC line fitting.
5. The camera-laser radar joint calibration method based on the 3D-3D matching point pair according to claim 1, wherein the obtaining a rotation translation matrix between the camera and the radar based on the coordinate information of the reference mark corner points under the camera coordinate system and under the radar coordinate system comprises:
selecting the minimum number of sample points from the reference mark corner points under the camera coordinate system and under the radar coordinate system respectively, and obtaining a minimum point set S_m of that size by a random selection method;
estimating the model of the minimum point set S_m by the iterative closest point method;
extending the model to the total point set and counting the number of points that fit the model within a preset threshold, namely the number of inliers;
when the actual proportion of inliers to the total number of input points is larger than the set proportionality constant, comparing the number of runs with the run-count upper limit; when the upper limit k is reached, averaging the stored run results of the minimum sample point sets under the two coordinate systems and outputting the proportion of inliers among the elements of the input point set, namely the inlier rate;
obtaining, based on the average of the run results of the minimum sample point sets under the two coordinate systems, a preliminary rotation translation matrix Rt_0 between the camera and the radar system.
6. The camera-lidar joint calibration method based on the 3D-3D matching point pair according to claim 2, further comprising, after obtaining the coordinate information of the reference mark corner points under the camera coordinate system: performing pose estimation on the identified ArUco mark and calculating the translation in the last column of the transformation matrix, so as to judge whether a large error exists with respect to the true distance between the square reference mark center and the camera optical center.
7. The camera-lidar joint calibration method based on the 3D-3D matching point pair of claim 4, wherein the camera-lidar joint calibration method is characterized in that,
and judging the accuracy of the radar coordinate system by comparing the edge length of the square reference mark identified by the radar with the actual length of the square reference mark.
8. The camera-lidar joint calibration method based on the 3D-3D matching point pair according to claim 4, wherein the marking of line segments on the 2D plane, selecting radar edge points, and marking line segments by drawing a quadrangle around each segment to obtain a point cloud data set C_selected near the square reference mark edges comprises:
after the point cloud data set C_RANSAC is obtained, box-selecting the eight edges of the square reference marks, four edges of each square reference mark clockwise starting from the upper left edge;
when the error of an edge is larger than a preset threshold, regarding the accuracy of the 3D pose as not up to standard and performing the box selection again; when the error of each edge is smaller than the preset threshold, regarding the accuracy of the 3D pose as up to standard and obtaining the point cloud data set C_selected containing the square reference mark edges.
9. The camera-laser radar joint calibration method based on the 3D-3D matching point pair according to claim 1, further comprising, after obtaining the rotation translation matrix between the camera coordinate system and the radar coordinate system based on the coordinate information of the reference mark corner points under the camera coordinate system and under the radar coordinate system:
fitting and removing irrelevant outliers by a random sample consensus algorithm, to obtain a rotation translation matrix Rt_1 between the camera coordinate system and the radar coordinate system after irrelevant outlier removal, together with a calibration error value.
10. The camera-laser radar joint calibration method based on the 3D-3D matching point pair according to claim 9, wherein after the rotation translation matrix and the calibration error value between the camera coordinate system and the radar coordinate system after outlier removal are obtained, the accuracy of the calibration result is judged through re-projection.
11. The camera-lidar joint calibration method based on the 3D-3D matching point pair of claim 10, wherein the determining the accuracy of the calibration result by the re-projection comprises:
the rotation translation matrix Rt to be obtained 1 As a projection matrix, projecting the 3D point cloud obtained by the radar at the current moment onto a camera image at the current moment;
if the phenomenon of ghost or other bad re-projection effect of the projected camera image is found, the image is processed by a translation matrix Rt 0 And performing fine adjustment rotation on the medium value.
12. A camera-lidar joint calibration device based on a 3D-3D matching point pair, the device comprising:
the reference mark acquisition module is used for acquiring binary square reference marks and angular points thereof, which are used for estimating the posture of the camera, on the calibration plate;
the filtering module is used for filtering the reflection intensity of the square reference mark initial point cloud information to obtain square reference mark edge point cloud information;
and the coordinate information acquisition module is used for acquiring the coordinate information of the specific point of the square reference mark based on the square reference mark edge point cloud information.
13. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the camera-lidar joint calibration method based on 3D-3D matching point pairs according to any of claims 1 to 11 when the program is executed by the processor.
14. A non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor, implements a camera-lidar joint calibration method based on 3D-3D matching point pairs according to any of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310840622.4A CN117572392A (en) | 2023-07-10 | 2023-07-10 | Camera-laser radar joint calibration method and device based on 3D-3D matching point pair |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117572392A true CN117572392A (en) | 2024-02-20 |
Family
ID=89863090
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310840622.4A Pending CN117572392A (en) | 2023-07-10 | 2023-07-10 | Camera-laser radar joint calibration method and device based on 3D-3D matching point pair |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117572392A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||