CN117392237A — Robust laser radar-camera self-calibration method — Google Patents

Robust laser radar-camera self-calibration method

Info

Publication number: CN117392237A
Application number: CN202311381487.8A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Prior art keywords: edge, laser radar, current, reflection intensity, camera
Inventors: 项志宇, 沈雷, 徐维庆, 赵诗雨
Assignees (current and original): Zhejiang University ZJU; Pan Asia Technical Automotive Center Co Ltd
Application filed by Zhejiang University ZJU and Pan Asia Technical Automotive Center Co Ltd
Priority: CN202311381487.8A

Classifications

    • G01S7/497 — Details of lidar systems (G01S17/00): means for monitoring or calibrating
    • G06T7/13 — Image analysis: segmentation; edge detection
    • G06T7/337 — Image registration using feature-based methods involving reference images or patches
    • G06T7/35 — Image registration using statistical methods
    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters (camera calibration)
    • G06T2207/10028 — Image acquisition modality: range image; depth image; 3D point clouds
    • G06T2207/10044 — Image acquisition modality: radar image
    • Y02T10/40 — Climate-change mitigation in road transport: engine management systems
Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a robust laser radar-camera combined self-calibration method. First, the data acquired by the laser radar are preprocessed to obtain a dense point cloud map and a reflection intensity map; geometric edges are extracted from the point cloud map and intensity edges are extracted from the reflection intensity map. Pixel edge lines are then extracted from the camera image. A fast initial-value search over the extrinsic parameters to be calibrated achieves an initial registration of the two kinds of edges within the common field of view of the laser radar and the camera. Points on the laser radar edge lines are then projected into the camera image, and the extrinsic parameters between the laser radar and the camera are further optimized by minimizing the point-to-line distance residuals from the radar edge points to the image edge lines. By fully exploiting the correspondence between the geometric and intensity edges of the laser radar and the edges of the camera image, the invention achieves combined extrinsic self-calibration that does not depend on a calibration board, offers good scene adaptability, high calibration accuracy and good reliability, and lays a foundation for autonomous driving technology based on laser radar-camera fusion.

Description

Robust laser radar-camera self-calibration method
Technical Field
The invention relates to a laser radar-camera self-calibration method in the field of multi-sensor fusion of intelligent vehicles, in particular to a robust laser radar-camera self-calibration method.
Background
With the rapid development of autonomous driving, multi-sensor fusion has become a major trend. The laser radar captures 3D information of the environment in real time and measures the spatial distance between objects and the sensor; with its high accuracy and wide field of view it is one of the mainstream sensors of current autonomous-driving perception modules. The camera captures color information and provides rich, dense 2D environmental information within its field of view, making it an indispensable component of the perception module. A key difficulty of existing perception techniques is improving accuracy and robustness across diverse complex scenes. The 3D perception capability of the laser radar and the strong 2D perception capability of the camera are highly complementary, so autonomous-driving technology that fuses laser radar and camera has become one of the current research hotspots.
Laser radar-camera calibration estimates the translational and rotational extrinsic parameters between the two sensors. As the most basic module of multi-sensor fusion, the calibration method must offer high accuracy and robustness. Current mainstream algorithms require a specific scene in which calibration boards such as checkerboards are manually placed and data are collected; for everyday indoor environments or outdoor road environments without a calibration board they cannot complete the calibration task well, which leads to errors in the subsequent multi-sensor fusion. How to achieve online, high-accuracy self-calibration of the laser radar and the camera in real natural scenes without a calibration board is therefore an urgent problem.
Disclosure of Invention
In order to solve the problems in the background art, the invention aims to provide a robust laser radar-camera combined calibration method based on scene edge characteristics, which is suitable for common indoor and outdoor road environments.
According to the method, the spatial edge features and the reflection intensity edge features of the laser radar point cloud are introduced in the calibration stage. When the spatial edge features of the laser radar are weak, the reflection intensity edge features serve as a supplement, which effectively improves feature acquisition in scenes with weak 3D structure and increases calibration robustness. When extracting the spatial edge features, a strategy of adaptive grid side length is adopted, which preserves the completeness and richness of the fitted planes while speeding up feature extraction. A coarse calibration yields the initial laser radar-camera extrinsic matrix, which effectively reduces the cases where direct optimization takes a long time or fails to converge. In the optimization stage, a 3D distance weight is introduced to reduce the influence of matches far from the sensors, which carry larger errors; in the later iterations, mismatched pairs with large residuals are removed, reducing the probability of optimizing in the wrong direction, so that the optimal solution can be reached in a variety of scenes. The whole method is highly robust and lays a foundation for improving the environment-perception performance of laser radar-camera fusion.
The technical scheme adopted by the invention comprises the following steps:
1) Preprocessing the data acquired by the camera and the laser radar respectively to obtain image edge characteristics and laser radar edge characteristics;
2) Based on the image edge characteristics and the laser radar edge characteristics, performing primary iterative optimization registration on an external reference matrix between a camera and a laser radar by utilizing ICP registration errors of the two edges to obtain a primary calibration external reference matrix with the minimum registration errors;
3) Projecting the laser radar edge features to a camera image coordinate system by using the primary calibration external parameter matrix to obtain 2D laser radar edge features; and performing iterative optimization registration of the external reference matrix again according to the laser radar edge characteristics and the 2D laser radar edge characteristics so as to align the laser radar edge with the image edge, thereby obtaining a final external reference matrix.
The step 1) specifically comprises the following steps:
1.1 Generating a reflection intensity graph according to the point cloud data acquired by the laser radar, extracting edges on the reflection intensity graph, and recovering 3D coordinates of each edge point in the reflection intensity graph, thereby being used as the edge characteristics of the laser radar reflection intensity;
1.2 Accumulating the point clouds acquired by the laser radar to obtain a dense point cloud map;
1.3 Dividing 3D grids of the dense point cloud map, adaptively adjusting the grid side length by a method of fitting planes in each grid so as to obtain different planes, then calculating the boundary line between the different planes and taking the boundary line as a spatial edge, and counting 3D coordinates of the spatial edge and taking the 3D coordinates as laser radar spatial edge characteristics;
1.4 The laser radar reflection intensity edge characteristics and the laser radar space edge characteristics are aggregated to obtain laser radar edge characteristics;
1.5 A 2D edge of the camera image is obtained by fitting and is noted as an image edge feature.
The extraction of edges from the reflection intensity map and the recovery of the 3D coordinates of each edge point, which together yield the laser radar reflection intensity edge features, are specifically as follows:
firstly, generating a mask map covered by a laser radar point cloud according to the field angle range of the laser radar and the image size of a reflection intensity map;
and then, calculating the filling rate of the reflection intensity graph in the mask graph, if the filling rate is smaller than the preset filling rate, reducing the width W and the height H of the reflection intensity graph, filling holes in pixels in the mask graph in the reduced reflection intensity graph, extracting the edges of the reflection intensity graph after filling the holes and taking the edges as reflection intensity edges, and recovering the 3D coordinates of the points on the reflection intensity edges and taking the points as laser radar reflection intensity edge characteristics.
The 1.3) is specifically:
1.3.1 Setting the side length of the initial grid and performing 3D voxel grid division on the dense point cloud map;
1.3.2) Fit a plane to the point cloud in each voxel grid using the random sample consensus RANSAC fitting method. Compute the ratio of the number of points belonging to the current fitted plane to the number of points in the current grid, and record it as the point ratio of the current fitted plane. If the point ratio of the current fitted plane is greater than or equal to a first threshold, execute 1.3.3); otherwise execute 1.3.4);
1.3.3) Mark the points belonging to the current fitted plane as one plane and remove them from the current voxel grid; then continue fitting planes to the remaining point cloud in the current voxel grid with RANSAC, obtaining further planes and removing their points, until the remaining point cloud in the current voxel grid is smaller than a second threshold, thereby obtaining all planes in the current voxel grid;
1.3.4) Mark the points belonging to the current fitted plane as one plane and remove them from the current voxel grid, then subdivide the current voxel grid equally and fit a plane to the point cloud of each sub-grid based on the remaining point cloud in the current voxel grid; if the point ratio of the fitted plane in a sub-grid is greater than or equal to the first threshold, execute 1.3.3), otherwise continue subdividing that sub-grid until the subdivision-count threshold is reached, and execute 1.3.3) on all sub-grids, thereby obtaining all planes in the current voxel grid;
1.3.5 Traversing the remaining voxel grids, repeating 1.3.2) -1.3.4) until all planes in all voxel grids are obtained;
1.3.6 Calculating an included angle of any two planes in each voxel grid, fitting an intersecting line segment of the current two planes and taking the intersecting line segment as a spatial edge when the included angle is in a preset angle interval, traversing and calculating all planes to obtain all the spatial edges, and counting 3D coordinates of all the spatial edges and taking the 3D coordinates as laser radar spatial edge characteristics.
The 1.4) is specifically:
Within the laser radar reflection intensity edge features, if laser radar spatial edge feature points exist in the neighborhood of a reflection intensity edge feature point, that reflection intensity edge feature point is deleted; the remaining laser radar reflection intensity edge features and all laser radar spatial edge features together form the laser radar edge features.
The 2) is specifically:
2.1 Setting an initial external parameter matrix, a search radius and a search step length, and taking one parameter of six parameters of the initial external parameter matrix as an initial search degree of freedom;
2.2 Adding a forward search step length and a reverse search step length to the current search degree of freedom in the current external reference matrix, fixing other five parameters in the current external reference matrix, obtaining a new external reference matrix and taking the new external reference matrix as the external reference matrix for the next search; projecting the edge feature of the laser radar to a 2D pixel coordinate system based on the new external parameter matrix to obtain the edge feature of the 2D laser radar, counting ICP registration errors of the edge feature of the 2D laser radar and the edge feature of the image and recording the ICP registration errors as the ICP registration errors corresponding to the current new external parameter matrix if the pixel coordinate of the edge feature of the 2D laser radar does not exceed the range of the camera image, and executing 2.3); otherwise, directly executing the step 2.3);
2.3 Repeating 2.2) according to the current external reference matrix, searching the current searching degree of freedom until reaching the searching radius, obtaining a plurality of external reference matrixes under the current searching degree of freedom and corresponding ICP registration errors, and taking the searching degree of freedom in the external reference matrix with the minimum ICP registration error as the optimal searching degree of freedom and fixing;
2.4 After changing parameters in the external parameter matrix, obtaining new searching degrees of freedom, repeating 2.2) -2.3), obtaining optimal searching degrees of freedom corresponding to different searching degrees of freedom, thereby obtaining an optimal external parameter matrix under the current searching radius and searching step length and taking the optimal external parameter matrix as an initial external parameter matrix for the next searching;
2.5 Reducing the searching radius and the searching step length, repeating 2.2) -2.4) according to the current initial external reference matrix until the searching radius and the searching step length are reduced to the preset searching radius and the preset searching step length or the preset reduction times are reached, obtaining the final optimal external reference matrix and marking the final optimal external reference matrix as the initial calibrated external reference matrix.
The 3) is specifically as follows:
3.1 Taking the initial calibration external parameter matrix as an initial external parameter matrix;
3.2 Using the current initial external parameter matrix to project the edge features of the laser radar to a 2D pixel coordinate system to obtain the edge features of the 2D laser radar; converting the image edge characteristics into 2D image edge characteristic point clouds;
3.3) For the i-th 2D laser radar edge feature L_edge,i, search the 2D image edge feature point cloud with K nearest neighbors for the k nearest 2D image edge features within a preset pixel distance; these k 2D image edge features form an image edge line. Compute the midpoint q_i and the covariance of the k features, perform an eigenvalue decomposition of the covariance, and take the line eigenvector n_i corresponding to the largest eigenvalue as the direction vector of the image edge line;
3.4) From the direction vector of the image edge line, the i-th 2D laser radar edge feature L_edge,i and the midpoint q_i of the image edge line, construct the point-to-line distance residual r_i; the i-th laser radar edge feature and the midpoint q_i of the corresponding image edge line form the i-th feature matching pair;
3.5 Repeating 3.3) -3.4), traversing and calculating to obtain feature matching pairs corresponding to the edge features of the residual laser radar and point-line distance residual errors, and optimizing a current initial external reference matrix according to all the feature matching pairs and the corresponding point-line distance residual errors, so that residual errors among all the feature matching pairs are minimized, and obtaining a current optimal external reference matrix and taking the current optimal external reference matrix as an initial external reference matrix of the next time;
3.6 Repeating 3.2) -3.6) according to the current initial external parameter matrix, and performing repeated iterative optimization of the external parameter matrix to obtain the final optimal external parameter matrix.
The point-to-line distance residual r_i is the perpendicular pixel offset from the projected laser radar edge point L_edge,i to the image edge line through the midpoint q_i with unit direction n_i, i.e. r_i = (L_edge,i − q_i) − ((L_edge,i − q_i) · n_i) n_i.
In 3.5), the current optimal external parameter matrix x* is obtained by minimizing the weighted residual sum, x* = argmin_x Σ_{i=1..B} s_i · w_i · ||r_i||²,
where B is the number of matching pairs between the laser radar and the camera, s_i is the adjustment coefficient of the i-th matching pair, w_i is the distance weight of the i-th matching pair, d_th is the residual threshold, ||·||² denotes the squared two-norm and |·| denotes the absolute value.
In 3.6), during the repeated iterative optimization of the external parameter matrix, the residual threshold d_th is gradually reduced after every few iterations.
Compared with the prior art, the invention has the following beneficial effects:
(1) Under the condition that a specific calibration object is not needed, the online self-adaptive self-calibration of external parameters between the laser radar and the camera can be realized, and the long-term reliability of the performance of the multi-sensor fusion task in the automatic driving vehicle is improved;
(2) The method extracts the spatial edge and the reflection intensity edge characteristics for calibration, and improves the characteristic extraction stability and the calibration robustness;
(3) The self-adaptive grid size is adopted in the space edge extraction, and the planes in the grids are fitted from large to small, so that the accuracy and the speed of the space edge feature extraction are improved; 2-dimensional reflection intensity images are projected through spherical coordinates in the point cloud, a reflection intensity edge map is extracted, and feature extraction efficiency is improved;
(4) In the iterative optimization stage, the distance weight is introduced, and the point-line distance residual error constructed by the matching pair far away from the laser radar and the camera has larger error, so that the optimization stability is improved by setting smaller weight; meanwhile, the error matching pair with overlarge residual error is removed, the guide optimization is prevented from running to the error direction, the iteration optimization convergence is accelerated, and finally the precision of the calibration method is improved.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a plot of lidar reflection intensity for an embodiment;
FIG. 3 is a laser radar reflection intensity edge map of an embodiment;
FIG. 4 is a lidar spatial edge feature of an embodiment;
FIG. 5 is an example indoor scene lidar edge feature projection result;
FIG. 6 is an example of an outdoor scene lidar edge feature projection result;
FIG. 7 is a projection view of laser radar point clouds before and after indoor scene calibration in an embodiment;
fig. 8 is a projection view of laser radar point clouds before and after outdoor scene calibration according to an embodiment.
Detailed Description
The invention is further described below with reference to the drawings and examples.
The calibration method of the present invention will be described more clearly with reference to one embodiment.
The embodiment of the invention and the implementation process of the invention are as follows:
as shown in fig. 1, the present invention includes the steps of:
1) After preprocessing data collected by a camera and a laser radar respectively, respectively obtaining image edge characteristics and laser radar edge characteristics, wherein the laser radar edge characteristics comprise laser radar space edge characteristics and laser radar reflection intensity edge characteristics;
the step 1) is specifically as follows:
1.1 Generating a reflection intensity graph according to the point cloud data acquired by the laser radar, extracting edges on the reflection intensity graph, and recovering 3D coordinates of each edge point in the reflection intensity graph, thereby being used as the edge characteristics of the laser radar reflection intensity;
1.1 In the step of generating a reflection intensity map according to the point cloud data acquired by the laser radar, the reflection intensity map is shown in fig. 2, and specifically is:
First, the point cloud data acquired by the laser radar are converted from the Cartesian coordinate system to a spherical coordinate system, where x, y and z are the 3D coordinates of each point in the Cartesian coordinate system and r, θ and φ are its coordinates in the spherical coordinate system.
Then, using the spherical coordinates, the points of the laser radar point cloud are projected onto a 2D plane to obtain the reflection intensity map. Each pixel coordinate (u, v) of the reflection intensity map is computed from θ and φ together with the width W and height H of the reflection intensity map and the horizontal and vertical fields of view fov_W and fov_H of the laser radar. The horizontal and vertical fields of view of the laser radar used in this embodiment are fov_W = 70.4° and fov_H = 77.2°.
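As an illustration of this projection, the following sketch (Python/NumPy) builds a reflection intensity image from an (x, y, z, intensity) point cloud. It is a minimal sketch, not the patent's exact formula: the azimuth/elevation convention (sensor x axis pointing forward), the centring of the image on the optical axis and the function name are assumptions for illustration.

```python
import numpy as np

def reflection_intensity_map(points_xyzi, width=1400, height=1400,
                             fov_w_deg=70.4, fov_h_deg=77.2):
    """Project a lidar point cloud (N x 4: x, y, z, intensity) onto a 2D
    reflection intensity image via spherical coordinates (x axis forward)."""
    x, y, z, inten = points_xyzi.T
    r = np.sqrt(x**2 + y**2 + z**2)                                 # range
    theta = np.arctan2(y, x)                                        # azimuth (horizontal angle)
    phi = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))    # elevation (vertical angle)

    fov_w, fov_h = np.deg2rad(fov_w_deg), np.deg2rad(fov_h_deg)
    # map angles to pixel coordinates; the image centre corresponds to the optical axis
    u = ((theta / fov_w) + 0.5) * (width - 1)
    v = ((-phi / fov_h) + 0.5) * (height - 1)

    img = np.zeros((height, width), dtype=np.float32)
    idx_map = -np.ones((height, width), dtype=np.int64)   # which lidar point filled each pixel
    ui, vi = np.round(u).astype(int), np.round(v).astype(int)
    valid = (ui >= 0) & (ui < width) & (vi >= 0) & (vi < height)
    img[vi[valid], ui[valid]] = inten[valid]               # last point to hit a pixel wins
    idx_map[vi[valid], ui[valid]] = np.nonzero(valid)[0]
    return img, idx_map
```

The index map returned alongside the image records which laser radar point produced each pixel, which is what later allows the 3D coordinates of intensity-edge pixels to be recovered.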
The extraction of edges from the reflection intensity map and the recovery of the 3D coordinates of each edge point, which together yield the laser radar reflection intensity edge features, specifically comprise the following steps:
First, a mask map of the area covered by the laser radar point cloud is generated from the known field-of-view range of the laser radar and the preset image size of the reflection intensity map. In this embodiment, the horizontal and vertical fields of view of the Livox Avia radar are 70.4° and 77.2° respectively (its scanning coverage is an ellipse), and the width and height of the reflection intensity image are set to W = 1400 and H = 1400.
Then the fill rate of the reflection intensity map inside the mask is computed. If the fill rate is lower than a preset value (for example 80%), the width W and height H of the reflection intensity map are reduced. Holes at pixels inside the mask of the reduced reflection intensity map are filled, Canny edges are extracted from the hole-filled reflection intensity map and taken as reflection intensity edges, and the 3D coordinates of the points on these edges are recovered as the laser radar reflection intensity edge features. In practice, for areas of the reflection intensity map not covered by laser radar points, i.e. holes, such pixels are not added to the reflection intensity edge features even if they are detected as edges, because hole-filled pixels have no real 3D coordinates (x, y, z) associated with them, as shown in fig. 3.
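A sketch of this step using OpenCV is shown below; the inpainting-based hole filling, the Canny thresholds and the use of a per-pixel index map to recover the 3D coordinates are illustrative choices under the assumptions above, not the patent's exact procedure.

```python
import numpy as np
import cv2

def intensity_edge_features(intensity_img, idx_map, fov_mask, points_xyz,
                            min_fill_rate=0.8):
    """Extract Canny edges from the reflection intensity map and recover the 3D
    coordinates of the edge pixels. 'idx_map' gives, per pixel, the index of the
    lidar point that produced it (-1 for holes); 'fov_mask' marks the pixels
    inside the lidar field of view."""
    covered = (idx_map >= 0) & (fov_mask > 0)
    fill_rate = covered.sum() / max(int((fov_mask > 0).sum()), 1)
    if fill_rate < min_fill_rate:
        # in the full method the map would be regenerated with smaller W and H;
        # here we only report the condition
        print(f"fill rate {fill_rate:.2f} below {min_fill_rate}: shrink W and H")

    img8 = cv2.normalize(intensity_img, None, 0, 255,
                         cv2.NORM_MINMAX).astype(np.uint8)
    holes = ((fov_mask > 0) & (idx_map < 0)).astype(np.uint8)
    filled = cv2.inpaint(img8, holes, 3, cv2.INPAINT_TELEA)   # fill holes inside the mask

    edges = cv2.Canny(filled, 50, 150)
    vs, us = np.nonzero(edges)
    keep = idx_map[vs, us] >= 0        # drop edge pixels that sit on filled holes
    return points_xyz[idx_map[vs[keep], us[keep]]]            # 3D coords of intensity edges
```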
1.2 Accumulating the point clouds collected by the static laser radar to obtain a dense point cloud map; due to the non-repeated scanning characteristic of the Livox Avia laser radar, the point cloud map with dense environment in the scanning range can be obtained by superposing a plurality of frames of laser radar point clouds by standing.
1.3 Dividing 3D grids of the dense point cloud map, adaptively adjusting the grid side length by a method of fitting planes in each grid so as to obtain different planes, then calculating the boundary line between the different planes and taking the boundary line as a spatial edge, and counting 3D coordinates of the spatial edge and taking the 3D coordinates as laser radar spatial edge characteristics;
1.3 Specifically:
1.3.1 Setting an initial larger fixed grid side length (such as 3 m) and performing 3D voxel grid division on the dense point cloud map;
1.3.2) Fit a plane to the point cloud in each voxel grid using the random sample consensus RANSAC (Random Sample Consensus) fitting method. Compute the ratio of the number of points belonging to the current fitted plane to the number of points in the current grid, and record it as the point ratio of the current fitted plane. If the point ratio of the current fitted plane is greater than or equal to a first threshold, execute 1.3.3); otherwise execute 1.3.4);
1.3.3) Mark the points belonging to the current fitted plane as one plane and remove them from the current voxel grid; then continue fitting planes to the remaining point cloud in the current voxel grid with RANSAC, obtaining further planes and removing their points, until the remaining point cloud in the current voxel grid is smaller than a second threshold, thereby obtaining all planes in the current voxel grid;
1.3.4) A point ratio below the first threshold indicates that several planes exist in the current grid. Mark the points belonging to the current fitted plane as one plane and remove them from the current voxel grid, then subdivide the current voxel grid equally, for example by halving the grid side length, i.e. splitting the current grid space into 8 sub-grids. Based on the remaining point cloud in the current voxel grid, fit a plane to the point cloud of each sub-grid with RANSAC; if the point ratio of the fitted plane in a sub-grid is greater than or equal to the first threshold, execute 1.3.3), otherwise continue subdividing that sub-grid until the subdivision-count threshold is reached (in this embodiment a sub-grid is subdivided at most 3 times, i.e. the subdivision process is executed at most 3 times), and execute 1.3.3) on all sub-grids, thereby obtaining all planes in the current voxel grid;
1.3.5 Traversing the remaining voxel grids, repeating 1.3.2) -1.3.4) until all planes in all voxel grids are obtained;
1.3.6 Calculating an included angle of any two planes in each voxel grid, fitting an intersecting line segment of the current two planes and taking the intersecting line segment as a spatial edge when the included angle is in a preset angle interval, traversing and calculating all planes to obtain all the spatial edges, and counting 3D coordinates of all the spatial edges and taking the 3D coordinates as laser radar spatial edge characteristics. In this embodiment, the preset angle interval is set to be greater than 30 °. In the fitted planar structure, the larger planes are fitted in a large grid and the smaller planes are fitted in a small grid, corresponding to the extracted edge features as shown in fig. 4.
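The plane extraction and spatial-edge construction of steps 1.3.2)-1.3.6) can be sketched as follows; the adaptive subdivision of a cell whose inlier ratio falls below the first threshold (step 1.3.4)) is left out for brevity, and thresholds such as the 2 cm RANSAC distance and the 30% inlier ratio are assumed values.

```python
import numpy as np

def ransac_plane(pts, n_iter=200, dist_thr=0.02):
    """Minimal RANSAC plane fit: returns (unit normal n, offset d, inlier mask)
    for the plane model n·p + d = 0."""
    rng = np.random.default_rng()
    best_n, best_d, best_inl = None, None, None
    for _ in range(n_iter):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-9:
            continue
        n = n / np.linalg.norm(n)
        d = -n @ sample[0]
        inl = np.abs(pts @ n + d) < dist_thr
        if best_inl is None or inl.sum() > best_inl.sum():
            best_n, best_d, best_inl = n, d, inl
    return best_n, best_d, best_inl

def planes_in_voxel(pts, min_ratio=0.3, min_remaining=30):
    """Repeatedly fit and remove planes inside one voxel grid cell while the
    inlier ratio of the newest plane stays above the first threshold (1.3.2-1.3.3)."""
    planes = []
    while len(pts) >= min_remaining:
        n, d, inl = ransac_plane(pts)
        if n is None or inl.sum() / len(pts) < min_ratio:
            break            # in the full method the cell would now be subdivided (1.3.4)
        planes.append((n, d))
        pts = pts[~inl]      # remove the fitted plane's points and continue
    return planes

def spatial_edge(plane_a, plane_b, angle_min_deg=30.0):
    """Intersection line of two fitted planes, kept as a spatial edge only if
    their dihedral angle exceeds the preset angle (1.3.6). Returns (point, direction)."""
    (n1, d1), (n2, d2) = plane_a, plane_b
    angle = np.degrees(np.arccos(abs(np.clip(n1 @ n2, -1.0, 1.0))))
    if angle < angle_min_deg:
        return None
    direction = np.cross(n1, n2)
    direction /= np.linalg.norm(direction)
    # a point on the line: solve n1·p = -d1, n2·p = -d2, direction·p = 0
    A = np.stack([n1, n2, direction])
    p0 = np.linalg.solve(A, np.array([-d1, -d2, 0.0]))
    return p0, direction
```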
1.4 The laser radar reflection intensity edge characteristics and the laser radar space edge characteristics are aggregated to obtain laser radar edge characteristics;
1.4 Specifically:
Within the laser radar reflection intensity edge features, if laser radar spatial edge feature points exist in the neighborhood of a reflection intensity edge feature point, that reflection intensity edge feature point is deleted; in other words, all spatial edge features are kept and part of the reflection intensity edge features are kept as a supplement. The remaining laser radar reflection intensity edge features and all laser radar spatial edge features together form the laser radar edge features, and the attribute of each feature point (reflection intensity edge or spatial edge) is retained.
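A minimal sketch of this aggregation is given below, assuming a KD-tree radius test; the 5 cm neighborhood radius and the attribute codes (0 = spatial edge, 1 = reflection intensity edge) are illustrative values, not taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def merge_edge_features(intensity_edge_pts, spatial_edge_pts, radius=0.05):
    """Keep all spatial edge points and only those reflection intensity edge
    points that have no spatial edge point within 'radius' metres (step 1.4)."""
    if len(spatial_edge_pts) == 0:
        kept = intensity_edge_pts
    else:
        tree = cKDTree(spatial_edge_pts)
        # distance to the nearest spatial edge point for every intensity edge point
        dist, _ = tree.query(intensity_edge_pts, k=1)
        kept = intensity_edge_pts[dist > radius]
    merged = np.vstack([spatial_edge_pts.reshape(-1, 3), kept.reshape(-1, 3)])
    # attribute per point: 0 = spatial edge, 1 = reflection intensity edge
    attr = np.concatenate([np.zeros(len(spatial_edge_pts), dtype=int),
                           np.ones(len(kept), dtype=int)])
    return merged, attr
```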
1.5) The 2D edges of the camera image are obtained with the Canny operator and recorded as the image edge features C_edge, together with their pixel coordinates.
2) Based on the image edge characteristics and the laser radar edge characteristics, performing primary iteration optimization registration on an external reference matrix between a camera and the laser radar by using ICP (Iterative Closest Point) registration errors of the two edges to obtain a primary calibration external reference matrix with the minimum registration error;
2) The method comprises the following steps:
2.1) Set an initial external parameter matrix, a search radius R and search step lengths (an angle step δr and a translation step δt), and take one of the six parameters of the initial external parameter matrix as the initial search degree of freedom. In a specific implementation a default external parameter matrix is selected as the initial external parameter matrix, namely the external parameter matrix under the condition that the radar xyz directions are front-up-left and the camera xyz directions are front-down-right; if a coarse external parameter matrix already exists, it may serve as the initial external parameter matrix instead. The initial external parameters have a large error; the projected edge maps are shown in figs. 5 and 6, where the thinnest black solid points are camera edge features, the medium-thickness gray solid points are laser radar reflection intensity edge features, and the thickest black solid points are laser radar spatial edge features;
2.2 Adding a forward search step length and a reverse search step length to the current search degree of freedom in the current external reference matrix, fixing other five parameters in the current external reference matrix, obtaining a new external reference matrix and taking the new external reference matrix as the external reference matrix for the next search;
external parameter matrixIs a 4×4 matrix, divided into rotation matrices +.>And translation matrix->Decomposing maleThe formula is as follows:
in the operation process, the alignment of the coordinates is avoided, the 3D coordinates are respectively rotated and translated to edges in sequence, and the conversion formula is as follows:P c the coordinates of the laser radar 3-dimensional point projected under the camera coordinate system.
Calculating a rotation matrix in the extrinsic matrix, a matrix R rotating around an xyz rotation axis x 、R y 、R z The definition is as follows:
the extrinsic matrix between the lidar and the camera is:
converting the laser radar 3D coordinates to a camera coordinate system:
carrying out image normalization projection on the converted laser radar edge 3D point coordinates:
using camera references (f) x ,f y ,c x ,c y ,k 1 ,k 2 ,k 3 ,p 1 ,p 2 ) And (3) distortion correction:
wherein x is o 、y o For the xy coordinates normalized in the z direction, r o Is the coordinates (x) o ,y o ) Distance from origin, x C ,y C ,z C For the laser radar three-dimensional point projection to the coordinates under the camera coordinate system, tx, ty, tz are the translational components of the external parameters between the camera and the radar, and the internal parameters of the pinhole camera are f x ,f y ,c x ,c y ,k 1 ,k 2 ,k 3 ,p 1 ,p 2 The composition is obtained by calibration in advance, wherein f x ,f y C is the focal length of the camera x ,c y K is the center coordinate of the origin of the camera in the pixel coordinate system 1 ,k 2 ,k 3 ,p 1 ,p 2 The first distortion parameter and the fifth distortion parameter of the pinhole camera are respectively x distorted And y distorted The camera coordinate system coordinates after the distortion removal.
Finally, the coordinate L of the 3D coordinate of the edge point of the laser radar under the camera pixel coordinate system can be obtained edge
Based on the new external parameter matrix, the laser radar edge features are projected to the 2D pixel coordinate system to obtain the 2D laser radar edge features L_edge and their pixel coordinates. If the pixel coordinates do not exceed the range of the camera image, the ICP registration error between the 2D laser radar edge features and the image edge features is computed and recorded as the ICP registration error of the current new external parameter matrix: a 2D kd-tree (k-dimensional tree) is built over the camera edge points C_edge; for each laser radar edge point the nearest camera edge point is found and the pixel distance residual d_ij between the two points is computed; if d_ij < d_th the current match is considered valid and its residual is kept, and the residuals of all laser radar edge points L_edge are accumulated as the ICP registration error. Then execute 2.3); otherwise skip this search, i.e. execute 2.3) directly;
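The projection pipeline and ICP-style registration error described above can be sketched as follows. This is an illustrative sketch rather than the patent's reference implementation: it assumes the standard pinhole model with the Brown distortion terms for the intrinsics named above, and the function names are chosen for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def project_lidar_edges(pts_lidar, R, t, fx, fy, cx, cy, k1, k2, k3, p1, p2):
    """Transform lidar 3D edge points into the camera frame, normalize by z,
    apply radial/tangential distortion and map to pixel coordinates (u, v)."""
    pc = pts_lidar @ R.T + t                       # P_c = R * P_L + t
    front = pc[:, 2] > 1e-6                        # keep points in front of the camera
    pc = pc[front]
    xo, yo = pc[:, 0] / pc[:, 2], pc[:, 1] / pc[:, 2]
    r2 = xo**2 + yo**2
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = xo * radial + 2 * p1 * xo * yo + p2 * (r2 + 2 * xo**2)
    yd = yo * radial + p1 * (r2 + 2 * yo**2) + 2 * p2 * xo * yo
    u, v = fx * xd + cx, fy * yd + cy
    return np.stack([u, v], axis=1), front

def icp_edge_error(lidar_px, camera_edge_px, img_w, img_h, d_th=25.0):
    """Sum of pixel-distance residuals between projected lidar edge points and
    their nearest camera edge points; matches beyond d_th are ignored (step 2.2)."""
    inside = ((lidar_px[:, 0] >= 0) & (lidar_px[:, 0] < img_w) &
              (lidar_px[:, 1] >= 0) & (lidar_px[:, 1] < img_h))
    pts = lidar_px[inside]
    if len(pts) == 0:
        return np.inf
    tree = cKDTree(camera_edge_px)                 # 2D kd-tree over camera edge pixels
    dist, _ = tree.query(pts, k=1)
    valid = dist < d_th
    if not np.any(valid):
        return np.inf
    return float(dist[valid].sum())
```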
2.3 Repeating 2.2) according to the current external reference matrix, searching the current searching degree of freedom until reaching the searching radius, obtaining a plurality of external reference matrixes under the current searching degree of freedom and corresponding ICP registration errors, and taking the searching degree of freedom in the external reference matrix with the minimum ICP registration error as the optimal searching degree of freedom and fixing;
2.4) After changing the parameter searched in the external parameter matrix, a new search degree of freedom is obtained; repeat 2.2)-2.3) to obtain the optimal value of each search degree of freedom, thereby obtaining the optimal external parameter matrix under the current search radius and step lengths, which is taken as the initial external parameter matrix of the next search. In this embodiment the degrees of freedom are searched sequentially in the order [roll, pitch, yaw, tx, ty, tz]. For the rotation angles roll (rotation about the x axis), pitch (rotation about the y axis) and yaw (rotation about the z axis), a forward and a reverse step are added to one angle at a time, for example roll = roll ± δr; similarly for the translations tx (x direction), ty (y direction) and tz (z direction), for example tx = tx ± δt. Forward and reverse steps are added repeatedly in the order [roll, pitch, yaw, tx, ty, tz]; only one degree of freedom is searched at a time while the values of the other 5 degrees of freedom remain fixed, until the search radius is reached, e.g. the last step of tz gives tz = tz ± R·δt.
2.5) Reduce the search radius and step lengths and repeat 2.2)-2.4) with the current initial external parameter matrix until the search radius and step lengths are reduced to the preset values or the preset number of reductions is reached; the final optimal external parameter matrix is recorded as the initial calibration external parameter matrix. In this embodiment three search rounds are executed: in the first round the angle range is 0-360° with a step of 45° and the translation range is 0-2 m with a step of 0.5 m; in the second round the angle range is 0-90° with a step of 15° and the translation range is 0-0.5 m with a step of 0.1 m; in the third round the angle range is 0-15° with a step of 3° and the translation range is 0-0.1 m with a step of 0.02 m;
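A minimal sketch of the coarse search of steps 2.1)-2.5) is given below: coordinate descent over the six degrees of freedom, scanning one degree of freedom at a time with shrinking search radius and step lengths. The callback registration_error stands for the ICP registration error of step 2.2) (for example the icp_edge_error sketch above combined with the projection); the exact scanning scheme is an assumption, while the three rounds mirror the embodiment.

```python
import numpy as np

def coarse_search(x0, registration_error, rounds):
    """Coordinate-descent coarse search over x = [roll, pitch, yaw, tx, ty, tz]
    (steps 2.1-2.5): one degree of freedom is scanned at a time while the other
    five stay fixed, and the best value is kept before moving on.
    'registration_error' maps a 6-vector to the ICP registration error;
    'rounds' is a list of (angle_radius, angle_step, trans_radius, trans_step)."""
    x = np.asarray(x0, dtype=float)
    for ang_r, ang_s, tr_r, tr_s in rounds:
        for dof in range(6):                       # roll, pitch, yaw, tx, ty, tz
            radius, step = (ang_r, ang_s) if dof < 3 else (tr_r, tr_s)
            candidates = x[dof] + np.arange(-radius, radius + 1e-9, step)
            best_val, best_err = x[dof], registration_error(x)
            for c in candidates:
                trial = x.copy()
                trial[dof] = c
                err = registration_error(trial)
                if err < best_err:
                    best_val, best_err = c, err
            x[dof] = best_val                      # fix the optimal value of this DoF
    return x

# search rounds mirroring the embodiment (angles in radians, translations in meters)
rounds = [(np.deg2rad(360), np.deg2rad(45), 2.0, 0.5),
          (np.deg2rad(90),  np.deg2rad(15), 0.5, 0.1),
          (np.deg2rad(15),  np.deg2rad(3),  0.1, 0.02)]
```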
3) Projecting the laser radar edge features to a camera image coordinate system by using the primary calibration external parameter matrix to obtain 2D laser radar edge features; and constructing a point-line pixel distance residual according to the laser radar edge characteristics and the 2D laser radar edge characteristics, and performing iterative optimization registration on the point-line pixel distance residual again to align the laser radar edge with the image edge so as to obtain a final external reference matrix.
3) The method comprises the following steps:
3.1 Taking the initial calibration external parameter matrix as an initial external parameter matrix;
3.2 Using the current initial external parameter matrix to project the edge features of the laser radar to a 2D pixel coordinate system to obtain the edge features of the 2D laser radar; converting the image edge characteristics into 2D image edge characteristic point clouds, specifically constructing a 2D kd-tree for the 2D image edge characteristic point clouds by means of a PCL (Point Cloud Library) point cloud processing library, and searching adjacent points by using K neighbors;
3.3) For the i-th 2D laser radar edge feature L_edge,i, search the 2D image edge feature point cloud with K nearest neighbors for the k nearest 2D image edge features within a preset pixel distance; in this embodiment k = 5, i.e. at least 5 image feature points are required. These k 2D image edge features form an image edge line. Compute their midpoint q_i and covariance, perform an eigenvalue decomposition of the covariance, and take the line eigenvector n_i corresponding to the largest eigenvalue as the direction vector of the image edge line;
3.4) From the direction vector of the image edge line, the i-th 2D laser radar edge feature L_edge,i and the midpoint q_i of the image edge line, construct the point-to-line distance residual r_i; the i-th laser radar edge feature and the midpoint q_i of the corresponding image edge line form the i-th feature matching pair, whose attribute is set to that of the laser radar edge feature.
The point-to-line distance residual r_i is the perpendicular pixel offset from L_edge,i to the image edge line through q_i with unit direction n_i, i.e. r_i = (L_edge,i − q_i) − ((L_edge,i − q_i) · n_i) n_i.
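An illustrative sketch of steps 3.3)-3.4) follows: the local image edge line is fitted from the k nearest image edge pixels (midpoint plus dominant eigenvector of the covariance) and the perpendicular point-to-line residual is formed. The k = 5 neighborhood and the 25-pixel gate mirror the embodiment; the perpendicular-offset form of the residual is an assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def point_line_residual(lidar_px, image_edge_px, k=5, max_px_dist=25.0):
    """For one projected lidar edge point (shape (2,)), fit a local image edge
    line from its k nearest image edge pixels and return the perpendicular
    point-to-line residual vector (steps 3.3-3.4), or None if no valid match."""
    tree = cKDTree(image_edge_px)
    dist, idx = tree.query(lidar_px, k=k)
    if np.max(dist) > max_px_dist:
        return None                                # not enough close image edge points
    neigh = image_edge_px[idx]                     # k x 2 neighborhood
    q = neigh.mean(axis=0)                         # midpoint of the image edge line
    cov = np.cov((neigh - q).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    n = eigvecs[:, np.argmax(eigvals)]             # line direction = dominant eigenvector
    diff = lidar_px - q
    return diff - (diff @ n) * n                   # component perpendicular to the line
```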
3.5 Repeating 3.3) -3.4), traversing and calculating to obtain feature matching pairs corresponding to the edge features of the residual laser radar and point-line distance residual errors, and optimizing a current initial external reference matrix according to all the feature matching pairs and the corresponding point-line distance residual errors, so that residual errors among all the feature matching pairs are minimized, and obtaining a current optimal external reference matrix and taking the current optimal external reference matrix as an initial external reference matrix of the next time;
In 3.5), the current optimal external parameter matrix x* is obtained by minimizing the weighted residual sum, x* = argmin_x Σ_{i=1..N} s_i · w_i · ||r_i||²,
where x = [roll, pitch, yaw, tx, ty, tz], N is the number of matching pairs between the laser radar and the camera, s_i is the adjustment coefficient of the i-th matching pair, w_i is the distance weight of the i-th matching pair, d_th is the residual threshold, ||·||² denotes the squared two-norm and |·| denotes the absolute value.
In image regions far from the sensors each pixel corresponds to a larger physical size: an object of a given size occupies many pixels when placed close to the camera but only a few pixels when placed far away. Matching pairs far from the camera and the laser radar therefore carry larger errors, so the invention computes a distance weight w_i from the laser radar edge feature point and weights the point-to-line distance residual with it, giving more distant matching pairs a smaller residual weight. At the same time, mismatched pairs with excessively large residuals must be removed: when a laser radar edge point projected into the image pixel coordinate system is far from its nearest camera edge point, the laser radar edge feature and the camera edge feature probably correspond to different points in physical space, which indicates that the current matching pair is a mismatch; concretely, its point-to-line distance residual r_i is larger than the threshold d_th, whose initial value is set to d_th = 25. The invention realizes this rejection through the adjustment coefficient s_i.
3.6 Repeating 3.2) -3.6) according to the current initial external parameter matrix, and performing repeated iterative optimization of the external parameter matrix to obtain the final optimal external parameter matrix.
3.6) During the repeated iterative optimization of the external parameter matrix, in order to avoid matches with large errors influencing the optimization direction, the residual threshold d_th is gradually reduced after every few iterations; in this embodiment d_th = d_th − 1 after every 2 optimization iterations.
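The weighted, threshold-gated refinement of steps 3.5)-3.6) can be sketched as follows. The inverse-distance form of the weight w_i and the use of a Levenberg-Marquardt least-squares solver are assumptions for illustration; the patent states only that more distant matches receive smaller weights, that matches with residuals above d_th are rejected through s_i, and that d_th decreases by 1 after every 2 iterations.

```python
import numpy as np
from scipy.optimize import least_squares

def refine_extrinsics(x0, lidar_edge_pts, residual_fn, n_outer=6, d_th0=25.0):
    """Iteratively re-weighted refinement of x = [roll, pitch, yaw, tx, ty, tz]
    (steps 3.2-3.6). 'residual_fn(x, pts)' must return the per-point 2D
    point-to-line residual vectors (N x 2). Matches with residuals above d_th
    are rejected (s_i = 0), the rest are weighted by an assumed inverse-distance
    weight w_i, and d_th shrinks by 1 after every 2 outer iterations."""
    x = np.asarray(x0, dtype=float)
    d_th = d_th0
    ranges = np.linalg.norm(lidar_edge_pts, axis=1)   # 3D distance of each edge point
    w = 1.0 / np.maximum(ranges, 1.0)                 # assumed distance-weight form

    def stacked_residuals(params):
        r = residual_fn(params, lidar_edge_pts)       # N x 2 residual vectors
        norms = np.linalg.norm(r, axis=1)
        s = (norms < d_th).astype(float)              # adjustment coefficient s_i
        # scale by sqrt(s_i * w_i) so the summed squares equal sum s_i * w_i * ||r_i||^2
        return (np.sqrt(s * w)[:, None] * r).ravel()

    for it in range(n_outer):
        x = least_squares(stacked_residuals, x, method="lm").x
        if (it + 1) % 2 == 0:
            d_th = max(d_th - 1.0, 1.0)               # gradually tighten the rejection threshold
    return x
```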
And projecting the laser radar edge into the image by using the calibrated external parameters as shown in the right diagram of fig. 5 and the right diagram of fig. 6, wherein the two edges almost completely coincide, and projecting the point cloud into the image as shown in fig. 7 and 8, wherein the laser radar point cloud and the image are basically coincident.
Fig. 5 and 6 show projection views of the feature edges before and after calibration, corresponding to the visualization of the radar edges projected into the image in the indoor scene (fig. 5) and the outdoor scene (fig. 6), respectively, wherein the thinnest black solid line points are camera edge features, the medium-thick gray solid line points are laser radar reflection intensity edge features, and the thickest black solid line points are laser radar space edge features. The left graph shows the effect before calibration, and a certain superposition error exists between the two edges before calibration; the right graph is an effect graph calibrated by the method, and two edges can be seen to be almost completely overlapped after being calibrated by the method.
Figs. 7 and 8 show projections of the laser radar point cloud before and after calibration: a sequence on which no calibration has yet been performed is selected from the scene, and the calibration results are used to visualize the radar points projected into the image for the indoor scene (fig. 7) and the outdoor scene (fig. 8). The left images show the effect before calibration: the point cloud and the image are only roughly aligned and errors remain in some regions, for example the checkerboard region in fig. 7 and the window region on the right of fig. 8, where the window frames of the point cloud and of the image do not coincide. After calibration with the method (right images), the laser radar points and the image almost completely coincide, which qualitatively demonstrates the accuracy of the calibration effect of the method.
The effect of the method is verified on data collected in complex indoor and outdoor environments. The sensors used are a Livox Avia laser radar and a Hikvision camera, both rigidly fixed to the same platform. The platform is held still by hand and about 1 minute of data is recorded into a rosbag for offline calibration. The laser radar data rate is 10 Hz and the camera data rate is 10 Hz, and the two are acquired with time-synchronized triggering. The whole data set comprises 1 indoor scene and 1 outdoor scene collected with the same handheld platform; the method is evaluated by calibrating the 2 data sets and computing the mean and standard deviation of the calibration results.
Table 1 lists the calibration results of the method on this indoor and outdoor complex-environment data set. The columns of the table are the rotational and translational components of the calibrated external parameter matrix; the calibration results of the method on the 2 complex-environment data sets (indoor and outdoor) are listed as mean ± standard deviation. Since ground-truth extrinsic values between the laser radar and the camera are difficult to obtain, the standard deviation of the calibration results over several scenes is commonly used as an evaluation criterion that indirectly reflects the accuracy and robustness of a calibration method. As the table shows, the translational calibration accuracy of the method reaches the centimeter level and the rotation-angle calibration accuracy is better than 1°.
Table 1 shows the self-calibration results of the method of the present invention in indoor and outdoor environments
Data set | tx (m) | ty (m) | tz (m) | roll (°) | pitch (°) | yaw (°)
Indoor | -0.001±0.006 | 0.069±0.009 | -0.07±0.004 | 90.21±0.03 | -0.52±0.05 | 89.57±0.12
Outdoor | -0.009±0.014 | 0.048±0.024 | -0.09±0.03 | 89.82±0.21 | -0.63±0.11 | 89.06±0.08
Average | -0.007±0.011 | 0.059±0.013 | -0.82±0.04 | 90.01±0.25 | -0.58±0.10 | 89.3±0.31
Finally, it should be noted that the above-mentioned embodiments and descriptions are only illustrative of the technical solution of the present invention and are not limiting. It will be understood by those skilled in the art that various modifications and equivalent substitutions may be made to the present invention without departing from the spirit and scope of the present invention as defined in the appended claims.

Claims (10)

1. A robust laser radar-camera combined self-calibration method is characterized by comprising the following steps:
1) Preprocessing the data acquired by the camera and the laser radar respectively to obtain image edge characteristics and laser radar edge characteristics;
2) Based on the image edge characteristics and the laser radar edge characteristics, performing primary iterative optimization registration on an external reference matrix between a camera and a laser radar by utilizing ICP registration errors of the two edges to obtain a primary calibration external reference matrix with the minimum registration errors;
3) Projecting the laser radar edge features to a camera image coordinate system by using the primary calibration external parameter matrix to obtain 2D laser radar edge features; and performing iterative optimization registration of the external reference matrix again according to the laser radar edge characteristics and the 2D laser radar edge characteristics so as to align the laser radar edge with the image edge, thereby obtaining a final external reference matrix.
2. The method for robust lidar-camera joint self-calibration according to claim 1, wherein the step 1) specifically comprises:
1.1 Generating a reflection intensity graph according to the point cloud data acquired by the laser radar, extracting edges on the reflection intensity graph, and recovering 3D coordinates of each edge point in the reflection intensity graph, thereby being used as the edge characteristics of the laser radar reflection intensity;
1.2 Accumulating the point clouds acquired by the laser radar to obtain a dense point cloud map;
1.3 Dividing 3D grids of the dense point cloud map, adaptively adjusting the grid side length by a method of fitting planes in each grid so as to obtain different planes, then calculating the boundary line between the different planes and taking the boundary line as a spatial edge, and counting 3D coordinates of the spatial edge and taking the 3D coordinates as laser radar spatial edge characteristics;
1.4 The laser radar reflection intensity edge characteristics and the laser radar space edge characteristics are aggregated to obtain laser radar edge characteristics;
1.5 A 2D edge of the camera image is obtained by fitting and is noted as an image edge feature.
3. The method for robust lidar-camera joint self-calibration according to claim 2, wherein after the edges are extracted from the reflection intensity map, the 3D coordinates of each edge point in the reflection intensity map are recovered, so as to be used as the edge characteristics of the lidar reflection intensity, specifically:
firstly, generating a mask map covered by a laser radar point cloud according to the field angle range of the laser radar and the image size of a reflection intensity map;
and then, calculating the filling rate of the reflection intensity graph in the mask graph, if the filling rate is smaller than the preset filling rate, reducing the width W and the height H of the reflection intensity graph, filling holes in pixels in the mask graph in the reduced reflection intensity graph, extracting the edges of the reflection intensity graph after filling the holes and taking the edges as reflection intensity edges, and recovering the 3D coordinates of the points on the reflection intensity edges and taking the points as laser radar reflection intensity edge characteristics.
4. A robust lidar-camera joint self-calibration method according to claim 2, characterized in that 1.3) is specifically:
1.3.1 Setting the side length of the initial grid and performing 3D voxel grid division on the dense point cloud map;
1.3.2 Performing plane fitting on the point clouds in each voxel grid by using a random sampling consistency RANSAC fitting method to obtain a fitting plane, counting the point cloud duty ratio of the point clouds in the current grid of the number of points occupied in the current fitting plane and recording the point cloud duty ratio as the point duty ratio of the current fitting plane, executing 1.3.3) if the point duty ratio of the current fitting plane is larger than or equal to a first threshold value, otherwise executing 1.3.4);
1.3.3 Marking the points occupied by the current fitting plane as the same plane and removing the points from the current voxel grid, and then continuously performing plane fitting on the residual point cloud in the current voxel grid by using a random sampling consistency RANSAC fitting method to obtain different planes and removing the points corresponding to the fitted planes until the residual point cloud in the current voxel grid is smaller than a second threshold value, so as to obtain all the planes in the current voxel grid;
1.3.4 Marking the points occupied by the current fitting plane as the same plane and removing the points from the current voxel grid, equally dividing the current voxel grid, carrying out plane fitting on the point clouds in each sub-grid based on the residual point clouds in the current voxel grid, executing 1.3.3 if the point occupation ratio of the fitting plane in one sub-grid is greater than or equal to a first threshold value, otherwise, continuing equally dividing the current sub-grid until the equal division times threshold value is reached, and executing 1.3.3 on all the sub-grids, thereby obtaining all the planes in the current voxel grid;
1.3.5 Traversing the remaining voxel grids, repeating 1.3.2) -1.3.4) until all planes in all voxel grids are obtained;
1.3.6 Calculating an included angle of any two planes in each voxel grid, fitting an intersecting line segment of the current two planes and taking the intersecting line segment as a spatial edge when the included angle is in a preset angle interval, traversing and calculating all planes to obtain all the spatial edges, and counting 3D coordinates of all the spatial edges and taking the 3D coordinates as laser radar spatial edge characteristics.
5. The robust lidar-camera joint self-calibration method according to claim 2, wherein 1.4) is specifically:
for each lidar reflection intensity edge feature point, if lidar spatial edge feature points exist in its neighbourhood, the reflection intensity edge feature point is deleted; the remaining lidar reflection intensity edge features and all the lidar spatial edge features together form the lidar edge features.
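For illustration only (not part of the claims): a minimal sketch of the edge-feature merging in claim 5, assuming SciPy's KD-tree and an arbitrary neighbourhood radius.

```python
import numpy as np
from scipy.spatial import cKDTree

def merge_lidar_edges(intensity_edge_pts, spatial_edge_pts, radius=0.1):
    """Drop intensity-edge points that already have a spatial-edge point nearby,
    then combine the remaining ones with all spatial-edge points."""
    tree = cKDTree(spatial_edge_pts)
    neighbours = tree.query_ball_point(intensity_edge_pts, r=radius)
    keep = np.array([len(nb) == 0 for nb in neighbours])
    return np.vstack([intensity_edge_pts[keep], spatial_edge_pts])
```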
6. The robust lidar-camera joint self-calibration method according to claim 1, wherein step 2) is specifically:
2.1) setting an initial extrinsic parameter matrix, a search radius and a search step length, and taking one of the six parameters of the initial extrinsic parameter matrix as the initial search degree of freedom;
2.2) adding a forward search step length and a reverse search step length to the current search degree of freedom of the current extrinsic parameter matrix while keeping the other five parameters fixed, obtaining a new extrinsic parameter matrix to be used in the next search; projecting the lidar edge features to the 2D pixel coordinate system with the new extrinsic parameter matrix to obtain 2D lidar edge features; if the pixel coordinates of the 2D lidar edge features do not exceed the range of the camera image, computing the ICP registration error between the 2D lidar edge features and the image edge features, recording it as the ICP registration error corresponding to the current new extrinsic parameter matrix, and executing 2.3); otherwise, directly executing 2.3);
2.3) repeating 2.2) from the current extrinsic parameter matrix, searching along the current search degree of freedom until the search radius is reached, obtaining several extrinsic parameter matrices under the current search degree of freedom together with their ICP registration errors, and fixing the value of the search degree of freedom taken from the extrinsic parameter matrix with the smallest ICP registration error as its optimal value;
2.4) taking another parameter of the extrinsic parameter matrix as the new search degree of freedom and repeating 2.2)-2.3), obtaining the optimal value of each search degree of freedom, thereby obtaining the optimal extrinsic parameter matrix under the current search radius and search step length and using it as the initial extrinsic parameter matrix for the next search;
2.5) reducing the search radius and the search step length, repeating 2.2)-2.4) from the current initial extrinsic parameter matrix until the search radius and the search step length are reduced to the preset search radius and preset search step length or the preset number of reductions is reached, obtaining the final optimal extrinsic parameter matrix and recording it as the initially calibrated extrinsic parameter matrix.
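For illustration only (not part of the claims): a minimal sketch of the coordinate-wise coarse search in claim 6, assuming the extrinsic parameters are handled as a 6-vector x = (rx, ry, rz, tx, ty, tz) and a user-supplied callback `icp_error` that projects the lidar edge features with a candidate x and returns the 2D ICP registration error, or None when the projection leaves the camera image; the radius, step and shrink factor are arbitrary.

```python
import numpy as np

def coarse_search(x0, icp_error, radius=2.0, step=0.5, min_step=0.05, shrink=0.5):
    """Coordinate-wise grid search over the 6 extrinsic parameters,
    minimising the 2D ICP registration error."""
    x = np.asarray(x0, dtype=float)
    while step >= min_step:
        for dof in range(6):                       # one search degree of freedom at a time
            offsets = np.arange(-radius, radius + 1e-9, step)
            candidates = []
            for off in offsets:
                cand = x.copy()
                cand[dof] += off                   # the other five parameters stay fixed
                err = icp_error(cand)              # None if projection leaves the image
                if err is not None:
                    candidates.append((err, cand))
            if candidates:
                x = min(candidates, key=lambda c: c[0])[1]   # fix the best value of this DOF
        radius *= shrink                           # shrink radius and step, then search again
        step *= shrink
    return x
```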
7. The robust lidar-camera joint self-calibration method according to claim 1, wherein step 3) is specifically:
3.1) taking the initially calibrated extrinsic parameter matrix as the initial extrinsic parameter matrix;
3.2) projecting the lidar edge features to the 2D pixel coordinate system using the current initial extrinsic parameter matrix to obtain 2D lidar edge features; converting the image edge features into a 2D image edge feature point cloud;
3.3) for the i-th 2D lidar edge feature, searching the 2D image edge feature point cloud by K nearest neighbour for the k nearest 2D image edge features within a preset pixel distance; forming an image edge line from the k 2D image edge features, computing the midpoint and covariance of the image edge line, decomposing the eigenvalues of the covariance, and taking the eigenvector corresponding to the largest eigenvalue as the direction vector of the image edge line;
3.4) constructing a point-line distance residual from the direction vector of the image edge line, the i-th 2D lidar edge feature and the midpoint of the image edge line, and forming the i-th feature matching pair from the i-th lidar edge feature and the midpoint of the corresponding image edge line;
3.5) repeating 3.3)-3.4), traversing to obtain the feature matching pairs and point-line distance residuals corresponding to the remaining lidar edge features, and optimizing the current initial extrinsic parameter matrix according to all the feature matching pairs and their point-line distance residuals so that the residuals over all feature matching pairs are minimized, obtaining the current optimal extrinsic parameter matrix and using it as the initial extrinsic parameter matrix for the next iteration;
3.6) repeating 3.2)-3.5) from the current initial extrinsic parameter matrix, performing several iterations of optimization of the extrinsic parameter matrix to obtain the final optimal extrinsic parameter matrix.
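For illustration only (not part of the claims): a minimal sketch of steps 3.3)-3.4), assuming the projected lidar edge features and the image edge features are 2D NumPy arrays and that k and the pixel distance are arbitrary; the local image edge line direction is taken from the principal eigenvector of the neighbour covariance.

```python
import numpy as np
from scipy.spatial import cKDTree

def point_line_residuals(lidar_edge_2d, image_edge_2d, k=5, max_px=20.0):
    """For each projected lidar edge point, fit a local image edge line from its
    k nearest image edge points and return the point-to-line distance residual."""
    tree = cKDTree(image_edge_2d)
    residuals, matches = [], []
    for p in lidar_edge_2d:
        dists, idx = tree.query(p, k=k)
        if np.max(dists) > max_px:
            continue                              # no image edge line close enough
        pts = image_edge_2d[idx]
        q = pts.mean(axis=0)                      # midpoint of the image edge line
        cov = np.cov((pts - q).T)
        eigval, eigvec = np.linalg.eigh(cov)
        d = eigvec[:, np.argmax(eigval)]          # line direction = largest eigenvector
        e = p - q
        r = np.linalg.norm(e - (e @ d) * d)       # distance of p to the line through q along d
        residuals.append(r)
        matches.append((p, q, d))
    return np.array(residuals), matches
```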
8. The robust lidar-camera joint self-calibration method according to claim 7, characterized in that the point-line distance residual is the distance from the i-th 2D lidar edge feature to the corresponding image edge line and is computed by the following formula:
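The original claim gives this formula as an image; as an assumed reconstruction consistent with steps 3.3)-3.4), with p_i the i-th 2D lidar edge feature, q_i the midpoint of its image edge line and d_i the unit direction vector of that line, the residual can be written as:

```latex
r_i = \left\| \,(p_i - q_i) - \big((p_i - q_i)\cdot d_i\big)\, d_i \,\right\|_2
```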
9. The robust lidar-camera joint self-calibration method according to claim 7, wherein in 3.5), the current optimal extrinsic matrix x is obtained by the following formula:
wherein n is the number of all lidar-camera feature matching pairs; s_i denotes the adjustment coefficient of the i-th matching pair, w_i denotes the distance weight of the i-th matching pair, d_th denotes the residual threshold, ‖·‖² denotes the squared two-norm of a value, and |·| denotes the absolute value operation.
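The original claim gives this formula as an image; a typical robust weighted least-squares objective consistent with the symbols listed above would be, as an assumed reconstruction:

```latex
x^{*} = \arg\min_{x} \sum_{i=1}^{n} s_i \, w_i \, \left\| r_i(x) \right\|_2^{2},
\qquad
s_i = \begin{cases} 1, & \left| r_i(x) \right| \le d_{th} \\ 0, & \left| r_i(x) \right| > d_{th} \end{cases}
```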
10. The robust lidar-camera joint self-calibration method according to claim 7, wherein in 3.6), the residual threshold d_th can be gradually reduced after each of the several iterative optimizations of the extrinsic matrix.
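Claim 10 does not fix a particular schedule; one simple choice, as an assumption, is a geometric decay of the residual threshold between iterations with a factor 0 < α < 1:

```latex
d_{th}^{(k+1)} = \alpha \, d_{th}^{(k)}, \qquad 0 < \alpha < 1
```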
CN202311381487.8A 2023-10-24 2023-10-24 Robust laser radar-camera self-calibration method Pending CN117392237A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311381487.8A CN117392237A (en) 2023-10-24 2023-10-24 Robust laser radar-camera self-calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311381487.8A CN117392237A (en) 2023-10-24 2023-10-24 Robust laser radar-camera self-calibration method

Publications (1)

Publication Number Publication Date
CN117392237A true CN117392237A (en) 2024-01-12

Family

ID=89466220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311381487.8A Pending CN117392237A (en) 2023-10-24 2023-10-24 Robust laser radar-camera self-calibration method

Country Status (1)

Country Link
CN (1) CN117392237A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117826129A (en) * 2024-03-04 2024-04-05 Nanjing University of Aeronautics and Astronautics On-orbit external parameter calibration method for monocular camera and laser radar

Similar Documents

Publication Publication Date Title
CN105160702B (en) The stereopsis dense Stereo Matching method and system aided in based on LiDAR point cloud
CN110211043B (en) Registration method based on grid optimization for panoramic image stitching
EP2847741B1 (en) Camera scene fitting of real world scenes for camera pose determination
CN107886547B (en) Fisheye camera calibration method and system
CN109118544B (en) Synthetic aperture imaging method based on perspective transformation
CN113192179A (en) Three-dimensional reconstruction method based on binocular stereo vision
CN117392237A (en) Robust laser radar-camera self-calibration method
CN108919319A (en) Sea island reef satellite image Pillarless caving localization method and system
CN112669458A (en) Method, device and program carrier for ground filtering based on laser point cloud
CN104751451B (en) Point off density cloud extracting method based on unmanned plane low latitude high resolution image
CN114529615B (en) Radar calibration method, device and storage medium
CN114463521B (en) Building target point cloud rapid generation method for air-ground image data fusion
CN115359130A (en) Radar and camera combined calibration method and device, electronic equipment and storage medium
Magri et al. Bending the doming effect in structure from motion reconstructions through bundle adjustment
CN114998532B (en) Three-dimensional image visual transmission optimization method based on digital image reconstruction
CN116402713A (en) Electric three-dimensional point cloud completion method based on two-dimensional image and geometric shape
CN115965712A (en) Building two-dimensional vector diagram construction method, system, equipment and storage medium
CN114563000B (en) Indoor and outdoor SLAM method based on improved laser radar odometer
CN113589263B (en) Method and system for jointly calibrating multiple homologous sensors
CN113240755B (en) City scene composition method and system based on street view image and vehicle-mounted laser fusion
CN116524109A (en) WebGL-based three-dimensional bridge visualization method and related equipment
CN102236893A (en) Space-position-forecast-based corresponding image point matching method for lunar surface image
CN111508067B (en) Lightweight indoor modeling method based on vertical plane and vertical line
CN114782357A (en) Self-adaptive segmentation system and method for transformer substation scene
CN114445415A (en) Method for dividing a drivable region and associated device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination