CN114742898A - Combined calibration method and system for laser radar and camera - Google Patents

Combined calibration method and system for laser radar and camera

Info

Publication number
CN114742898A
CN114742898A
Authority
CN
China
Prior art keywords
points
calibration plate
calibration
point
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210379970.1A
Other languages
Chinese (zh)
Inventor
孔阳
朱奕霖
徐世友
成慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat-sen University
Original Assignee
Sun Yat-sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sun Yat-sen University
Priority to CN202210379970.1A
Publication of CN114742898A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 - Means for monitoring or calibrating
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/23 - Clustering techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/70 - Denoising; Smoothing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/13 - Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G06T7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20112 - Image segmentation details
    • G06T2207/20164 - Salient point detection; Corner detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention belongs to the technical field of joint sensor calibration and particularly relates to a method for extracting calibration-plate corner points from laser radar point cloud data, comprising the following steps: extracting edge points of the calibration plate in the point cloud data; clustering the edge points and screening the obtained clusters to obtain initial clusters, marked as potential calibration-plate objects; denoising the potential calibration-plate objects to obtain denoised potential calibration-plate objects; projecting the denoised potential calibration-plate objects onto a plane and further screening the initial clusters against the real area of the calibration plate to obtain target clusters; and fitting straight lines to the edges of the target clusters with a line-fitting algorithm to obtain a plurality of edge lines, then solving the intersection points of these edge lines to obtain the calibration-plate corner points. The higher corner-extraction precision of the invention makes the joint calibration of the laser radar and the camera more reliable.

Description

Combined calibration method and system for laser radar and camera
Technical Field
The invention belongs to the technical field of joint sensor calibration, and particularly relates to a joint calibration method and system for a laser radar and a camera.
Background
An important limiting factor in the development of robots and autonomous driving technology is the ability to perceive the environment, which is usually provided by sensors. At present, the main sensors for perceiving environmental information are the camera and the laser radar.
The camera images the environment in two dimensions, as pictures or a video stream, acquiring continuous color, illumination and semantic information that subsequent algorithms process into a representation of the environment; the technology is mature and low-cost. However, camera imaging places high demands on illumination conditions, so an overly bright or dark environment seriously degrades algorithm results, and because the imaging principle is insensitive to scale information, a camera can hardly measure the environment accurately. The laser radar perceives the environment by emitting laser beams, capturing the signals (echoes) reflected back by objects, and processing the reflected signals to solve for the ranging result. Its measurement accuracy is high, and since the laser radar is an active sensor whose emitted beams lie in a carefully chosen frequency band, it has good resistance to interference and works normally under violent illumination changes, in overly bright environments, or with no light at all. However, the laser radar is insensitive to structurally similar parts of the environment, its point cloud data are generally sparse and discontinuous (for a typical mechanical laser radar), and the cost of both mechanical and solid-state laser radars remains high. For these reasons, the solution adopted at the present stage for autonomous driving and robots is to combine the camera and the laser radar, fusing their complementary data to obtain a relatively accurate environment-perception result. Data fusion requires knowing the accurate positional relationship between the camera and the laser radar, i.e. calibration, which is very important for data fusion and the subsequent perception results.
The calibration of cameras and laser radars can generally be divided into two categories, methods based on calibration objects (targets) and targetless methods; for application scenarios pursuing accuracy, target-based methods are still mainstream at present. However, most current target-based methods cannot complete the calibration process adaptively and fully automatically, and still require manual intervention. In one joint calibration method for a laser radar and a camera provided in the prior art, calibration-scene data are collected by the laser radar and the camera while a target lies in their common field of view, yielding point cloud data and image data, wherein each target has at least three straight edges and at least three target corner points, any two adjacent straight edges intersecting at the corresponding target corner point; point cloud corner points corresponding to each target corner point are extracted from the point cloud data to obtain the three-dimensional coordinates of each target corner point; image corner points corresponding to each target corner point are extracted from the image data to obtain the two-dimensional coordinates of each target corner point; and the extrinsic parameters between the laser radar and the camera are solved by a PnP (Perspective-n-Point) method from the point pairs formed by the three-dimensional and two-dimensional coordinates of each target corner point. However, this scheme extracts corner points with low precision in complex environments, especially for mechanically scanning laser radars, so the final calibration precision is insufficient, which ultimately impairs the environment-perception capability of the sensor system.
Disclosure of Invention
To overcome at least one defect of the prior art, the invention provides a method for extracting calibration-plate corner points from laser radar point cloud data and a joint calibration method for a laser radar and a camera, wherein a noise model is established to remove the ranging error of the point cloud data, the points contained in each cluster are then backfilled and a plane is fitted, and further screening is carried out through two indicators, the plane-fitting degree (inlier ratio) and the area occupied by the inlier points; the method also refines edge extraction based on reflection intensity, thereby achieving automatic screening of calibration objects in complex environments, with better adaptability and higher precision.
In order to solve the technical problems, the invention adopts the technical scheme that:
the method for extracting calibration-plate corner points from laser radar point cloud data comprises the following steps:
extracting edge points of the calibration plate in the point cloud data;
clustering the edge points and screening the obtained clusters to obtain initial clusters, marked as potential calibration-plate objects;
denoising the potential calibration-plate objects to obtain denoised potential calibration-plate objects;
projecting the denoised potential calibration-plate objects onto a plane and further screening the initial clusters against the real area of the calibration plate to obtain target clusters;
and fitting straight lines to the edges of the target clusters with a line-fitting algorithm to obtain a plurality of edge lines, then solving the intersection points of these edge lines to obtain the calibration-plate corner points.
In this scheme, when the calibration-plate corner points are extracted from the point cloud data, the potential calibration-plate objects are denoised and then further screened against the real area of the calibration plate, so that more accurate calibration-plate corner points can be obtained subsequently; the calibration precision of the laser radar and the camera can therefore be further improved.
Preferably, extracting the edge points of the calibration plate in the point cloud data specifically comprises the following steps:
adjusting the point cloud sequence to be row-first;
recording the number of points in each row of the current frame;
and classifying all points based on a single-point smoothness evaluation quantity to obtain all left edge points and right edge points and form an edge point set.
Preferably, the clustering of the edge points is specifically performed in the edge point set with a single-linkage algorithm, and each cluster stores at least the following information: the maximum and minimum values on the x, y and z axes, the index interval of the points on each row, and the coordinates of all points in the cluster in the laser radar Cartesian coordinate system.
Preferably, the laser radar is a mechanical rotary laser radar that scans repeatedly, and the index interval of the points on each row is updated across the abrupt-change (index wrap-around) region when clustering with the single-linkage algorithm.
Preferably, the screening of the obtained clusters to obtain initial clusters specifically comprises the following steps: solving the bounding box of each cluster from the maximum and minimum values on the x, y and z axes of the laser radar Cartesian coordinate system, and performing a primary screening according to the bounding boxes;
filling all original points back into the clusters after the primary screening, according to the index interval of the points on each row;
performing a plane fit on each cluster to obtain the three-dimensional plane normal vector of the fitted calibration plate in the laser radar Cartesian coordinate system and all inlier points contained in the plane;
and keeping the clusters whose inlier ratio is larger than a first set threshold as the initial clusters.
Preferably, the denoising of the potential calibration-plate object specifically means denoising the ranging results of the points in the potential calibration-plate object using the three-dimensional plane normal vector, and then recovering the coordinates of the denoised points.
Preferably, after denoising the potential calibration board object, the following steps are further performed: for each potential calibration plate object, performing binarization processing on all points based on the reflection intensity;
edge points of the potential calibration plate object are extracted based on the result of the binarization.
Preferably, projecting the denoised potential calibration-plate object onto a plane and further screening the initial clusters against the real area of the calibration plate to obtain target clusters specifically comprises the following steps:
solving the axis-angle transformation vector $\vec{\theta}$ that rotates the three-dimensional plane normal vector $\vec{n}$ onto the z-axis;
solving the corresponding three-dimensional rotation matrix $R$ from the axis-angle transformation vector $\vec{\theta}$;
finding the translation vector $\vec{t}$ by averaging all inlier points in the potential calibration-plate object;
forming from the three-dimensional rotation matrix $R$ and the translation vector $\vec{t}$ the projective transformation matrix (in homogeneous form)
$$P = \begin{bmatrix} R^{T} & -R^{T}\vec{t} \\ \mathbf{0}^{T} & 1 \end{bmatrix},$$
where $R^{T}$ is the transpose of $R$, which projects all inlier points within the potential calibration-plate object onto a two-dimensional plane;
solving the convex hull $\mathcal{H}$ of the inlier points with a convex-hull algorithm and finding its area $A_{\mathcal{H}}$;
and comparing the area $A_{\mathcal{H}}$ with the real area of the calibration plate to further screen the initial clusters and obtain the target clusters.
Preferably, the method further comprises optimizing the obtained calibration-plate corner points, specifically comprising the following steps:
processing the obtained calibration-plate corner points with the real size of the calibration plate:
$$T = \begin{bmatrix} a_{00} & a_{01} & b_{0} \\ a_{10} & a_{11} & b_{1} \\ 0 & 0 & 1 \end{bmatrix}, \qquad T^{*} = \arg\min_{T \in SE(2)} \sum_{i} \left\lVert T\,\hat{c}_{i} - c_{i} \right\rVert^{2}, \qquad c_{i}^{*} = T^{*}\,\hat{c}_{i},$$
wherein the matrix $T$ is a two-dimensional transformation matrix satisfying the Lie group SE(2); $a_{00}$, $a_{01}$, $a_{10}$, $a_{11}$, $b_{0}$, $b_{1}$ are the elements of $T$; $T^{*}$ is the optimization result for $T$; $c_{i}^{*}$ are the optimized corner points; $c_{i}$ are the obtained calibration-plate corner points; $\hat{c}_{i}$ is the real corner-point matrix template constructed from the size information of the calibration plate; and $i$ is the index of the corner point currently processed;
and converting the optimized corner points into three-dimensional corner coordinates with the inverse of the projective transformation matrix $P$.
The joint calibration method for the laser radar and the camera comprises the following steps:
respectively extracting the calibration plate from the laser radar point cloud data and the camera image data;
extracting the calibration-plate corner points in the laser radar point cloud data with the above method for extracting calibration-plate corner points from laser radar point cloud data;
extracting the calibration-plate corner points in the camera image data;
and solving with the PnP algorithm, using the calibration-plate corner points in the point cloud data and those in the image data, to obtain the joint calibration of the laser radar and the camera.
Compared with the prior art, the beneficial effects are:
1) all original points are backfilled into the clusters after the primary screening, a plane is fitted, and screening follows the plane-fitting result, so that the calibration plate is extracted automatically in complex environments;
2) the potential calibration-plate objects are denoised, to cope with sparse point cloud data and improve the precision of corner extraction;
3) edge points of the potential calibration-plate objects are extracted with a reflection-intensity binarization method, further improving the reliability of corner extraction;
4) the obtained calibration-plate corner points are optimized with the real size of the calibration plate, improving the overall calibration precision.
Drawings
Fig. 1 is a schematic diagram of a calibration plate extracted from point cloud data of a calibration plate corner extraction method in laser radar point cloud data in embodiment 1 of the present invention;
fig. 2 is a schematic diagram of a plurality of edge lines and corresponding corner points obtained by fitting with a calibration plate corner point extraction method in laser radar point cloud data in embodiment 1 of the present invention;
fig. 3 is a schematic diagram of an edge of a calibration plate and a corresponding corner obtained by optimizing the obtained calibration plate corner in the calibration plate corner extraction method in the laser radar point cloud data in embodiment 1 of the present invention;
fig. 4 is a schematic diagram of extracting edge points of a potential calibration plate object based on a binarization result in a calibration plate corner point extraction method in laser radar point cloud data according to embodiment 2 of the present invention;
fig. 5 is a schematic block diagram of a flow of a joint calibration method for a laser radar and a camera in embodiment 3 of the present invention;
FIG. 6 is a schematic block diagram of the overall connection of the combined calibration system of the laser radar and the camera in embodiment 3 of the present invention;
fig. 7 is a schematic block diagram of connection of image data systems of a combined calibration system for a laser radar and a camera in embodiment 3 of the present invention;
fig. 8 is a schematic block diagram of a connection of a point cloud data system of a combined calibration system of a laser radar and a camera in embodiment 3 of the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the invention; for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted. The positional relationships depicted in the drawings are for illustrative purposes only and are not to be construed as limiting the invention.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if there are terms such as "upper", "lower", "left", "right", "long", "short", etc., indicating orientations or positional relationships based on the orientations or positional relationships shown in the drawings, it is only for convenience of description and simplicity of description, but does not indicate or imply that the device or element referred to must have a specific orientation, be constructed in a specific orientation, and be operated, and therefore, the terms describing the positional relationships in the drawings are only used for illustrative purposes and are not to be construed as limitations of the present patent, and specific meanings of the terms may be understood by those skilled in the art according to specific situations.
The technical scheme of the invention is further described in detail by the following specific embodiments in combination with the attached drawings:
Example 1:
Figures 1 to 3 illustrate a first embodiment of the method for extracting calibration-plate corner points from laser radar point cloud data, which comprises the following steps:
S1: extracting edge points of the calibration plate in the point cloud data;
S2: clustering the edge points and screening the obtained clusters to obtain initial clusters, marked as potential calibration-plate objects;
S3: denoising the potential calibration-plate objects to obtain denoised potential calibration-plate objects;
S4: projecting the denoised potential calibration-plate objects onto a plane and further screening the initial clusters against the real area of the calibration plate to obtain target clusters;
S5: fitting straight lines to the edges of the target clusters with a line-fitting algorithm to obtain a plurality of edge lines, then solving the intersection points of these edge lines to obtain the calibration-plate corner points.
It should be noted that the size of the calibration plate is a known quantity that can be measured before calibration, as is well known to those skilled in the art; the calibration plate is preferably mounted on a transparent structural support, which avoids spurious reflections of the laser radar signal from the support.
In addition, the sequence of each step in this embodiment is only for reference, and is not to be understood as a limitation of this embodiment, and those skilled in the art can reasonably change the sequence to realize the same function as this embodiment.
In addition, before the edges of the calibration plate are extracted from the point cloud data, the calibration plate in the point cloud data should first be identified, which is readily accomplished.
In this embodiment, extracting the edge points of the calibration plate in the point cloud data in step S1 specifically comprises the following steps:
S11: adjusting the point cloud sequence to be row-first;
S12: recording the number of points in each row of the current frame, $S^{[row]}$, where row is the identifier of the row;
S13: classifying all points based on a single-point smoothness evaluation quantity to obtain all left edge points and right edge points and form the edge point set. Specifically, two smoothness evaluation quantities $\sigma_{i}^{\mathrm{left}}$ and $\sigma_{i}^{\mathrm{right}}$ are computed for the point $p_{i}$, characterizing how smooth the point cloud is in a neighborhood of $K$ points on its left side and in a neighborhood of $K$ points on the right side of the point $n$ positions after it, where $K$ and $n$ are point counts whose values are chosen in the specific embodiment. Based on $\sigma_{i}^{\mathrm{left}}$ and $\sigma_{i}^{\mathrm{right}}$, the following judgments are established:
when $\sigma_{i}^{\mathrm{left}}$ and $\sigma_{i}^{\mathrm{right}}$ are both larger than the thickness of the calibration plate, the point $p_{i}$ is considered a salient (outlier) point;
when $\sigma_{i}^{\mathrm{left}}$ is larger than the thickness of the calibration plate but $\sigma_{i}^{\mathrm{right}}$ is smaller than it, the point $p_{i}$ is considered a left edge point;
when $\sigma_{i}^{\mathrm{right}}$ is larger than the thickness of the calibration plate but $\sigma_{i}^{\mathrm{left}}$ is smaller than it, the point $p_{i+n}$ is considered a right edge point;
when both are smaller than the thickness of the calibration plate, the point $p_{i}$ is considered a non-edge point.
These judgments are applied to all points, all left edge points and right edge points are found, and they are put into the edge point set $\mathcal{E}$. The thickness of the calibration plate is a known parameter available at the time of use, as is well known to those skilled in the art.
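For concreteness, the classification rule above can be sketched in Python as follows; the function name, the default neighborhood parameters K and n, and the choice of mean absolute range deviation as the smoothness evaluation quantity are illustrative assumptions, since the embodiment does not fix them.

```python
import numpy as np

def classify_edge_points(ranges, K=4, n=2, plate_thickness=0.02):
    """Classify the points of one scan row into left/right edge points.

    ranges          : 1-D array of per-point range measurements of a row
    K, n            : neighborhood size and gap (assumed values)
    plate_thickness : known thickness of the calibration plate in meters
    """
    left_edges, right_edges = [], []
    for i in range(K, len(ranges) - n - K):
        # One-sided smoothness: mean absolute range deviation in the left
        # neighborhood of p_i and in the right neighborhood of p_{i+n}.
        sigma_left = np.mean(np.abs(ranges[i - K:i] - ranges[i]))
        sigma_right = np.mean(np.abs(ranges[i + n + 1:i + n + 1 + K] - ranges[i + n]))
        jump_l = sigma_left > plate_thickness
        jump_r = sigma_right > plate_thickness
        if jump_l and not jump_r:
            left_edges.append(i)        # p_i lies on the left edge
        elif jump_r and not jump_l:
            right_edges.append(i + n)   # p_{i+n} lies on the right edge
        # both jumps: salient (outlier) point; neither: non-edge point
    return left_edges, right_edges
```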
In this embodiment, the clustering of the edge points in step S2 is specifically performed on the edge point set $\mathcal{E}$ with a single-linkage (single-join) algorithm, and each cluster stores at least the following information: the maximum and minimum values on the x, y and z axes, the index interval of the points on each row, and the coordinates of all points in the cluster in the laser radar Cartesian coordinate system. Among the parameters of the single-linkage algorithm, the proximity threshold τ may be set to one quarter of the diagonal of the calibration plate. Of course, the clustering may also be performed with a multiple-linkage algorithm, DBSCAN (a density-based clustering algorithm) or another clustering algorithm, which is not limited here.
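As a reference realization of this step, single-linkage clustering with the proximity threshold τ can be obtained from SciPy's hierarchical clustering tools; the helper below is a sketch rather than the patent's exact implementation, and stores only the per-cluster point coordinates.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def cluster_edge_points(points_xyz, plate_diagonal):
    """Single-linkage clustering of the edge point set with
    tau = plate_diagonal / 4, as suggested above."""
    tau = plate_diagonal / 4.0
    Z = linkage(points_xyz, method='single')           # single-join hierarchy
    labels = fcluster(Z, t=tau, criterion='distance')  # cut at threshold tau
    clusters = {}
    for label, point in zip(labels, points_xyz):
        clusters.setdefault(label, []).append(point)
    return {k: np.asarray(v) for k, v in clusters.items()}
```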
The point cloud data in this embodiment are obtained by a mechanical rotary laser radar that scans repeatedly: the index of the points on each row is always 0 directly ahead and increases clockwise over one revolution to its maximum value, again near 0, so the index interval of the points on each row must be updated across this abrupt-change region when clustering with the single-linkage algorithm. In a specific embodiment, each cluster additionally stores a variable cross recording whether the current index interval $[v_{\min}, v_{\max}]$ crosses the abrupt-change region directly ahead; it defaults to False, i.e. the region is not crossed and the index interval is still linear. When a new index value $v$ arrives, if the variable cross is True (the index interval crosses the abrupt-change region directly ahead) and the current index value $v$ exceeds one half of the current row point count $S^{[row]}$, i.e. $v > S^{[row]}/2$, the adjusted index value is obtained as $v^{*} = v - S^{[row]}$; otherwise $v^{*} = v$ is kept. The adjusted index value $v^{*}$ may then fall into the following cases:
1) the adjusted index value $v^{*}$ minus the minimum value $v_{\min}$ of the current index interval still exceeds one half of the current row point count, i.e. $v^{*} - v_{\min} > S^{[row]}/2$, meaning that the original $v$ has crossed the abrupt-change region while the original index interval $[v_{\min}, v_{\max}]$ has not; since the original $v$ and the original index interval then contain index 0 between them, cross is marked True, $v^{*}$ is shifted to $v^{*} - S^{[row]}$, and the minimum value of the index interval is updated as $v_{\min} = \min(v^{*}, v_{\min})$;
2) the minimum value $v_{\min}$ of the current index interval minus the adjusted index value $v^{*}$ exceeds one half of the current row point count, meaning that the original $v$ has not crossed the region but the original index interval $[v_{\min}, v_{\max}]$ has; that is, the original $v$ and the original index interval again contain index 0 between them, so cross is marked True, the minimum value of the index interval is updated as $v_{\min} = v_{\min} - S^{[row]}$, and the maximum value as $v_{\max} = \max(v^{*}, v_{\max} - S^{[row]})$.
If the adjusted index value $v^{*}$ falls into neither case, the adjusted index value $v^{*}$ and the original index interval $[v_{\min}, v_{\max}]$ do not straddle the abrupt-change region, and only linear comparisons of sizes are required. Of course, if the point cloud data are obtained not by a mechanical laser radar but by a solid-state laser radar, this step may be unnecessary.
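One possible bookkeeping routine for this wrap-around handling is sketched below; because the embodiment's exact case analysis is only partially recoverable, the thresholds and updates here are a plausible reconstruction rather than the authoritative rule.

```python
def update_index_interval(v, vmin, vmax, cross, S_row):
    """Keep a cluster's index interval [vmin, vmax] linear across the
    abrupt change at index 0 on a row of S_row points (sketch only)."""
    v_star = v - S_row if (cross and v > S_row / 2) else v
    if v_star - vmin > S_row / 2:      # case 1: new point crossed the region
        cross = True
        v_star -= S_row
    elif vmin - v_star > S_row / 2:    # case 2: the stored interval crossed it
        cross = True
        vmin -= S_row
        vmax = max(v_star, vmax - S_row)
    return min(vmin, v_star), max(vmax, v_star), cross
```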
In this embodiment, screening the obtained clusters in step S2 to obtain the initial clusters specifically comprises the following steps:
S21: solving the bounding box of each cluster from the maximum and minimum values on the x, y and z axes of the laser radar Cartesian coordinate system, and performing a primary screening according to the bounding boxes; specifically, clusters whose volume does not fit the expected range are excluded. In this embodiment, clusters whose volume is smaller than the real volume of the calibration plate, or more than 4 times larger than it, are excluded; the exclusion condition may be set according to the actual situation and is not limited here.
S22: filling all original points back into the clusters after the primary screening, according to the index interval of the points on each row;
S23: performing a plane fit on each cluster to obtain the three-dimensional plane normal vector $\vec{n}$ of the fitted calibration plate in the laser radar Cartesian coordinate system and all the inlier points contained in the plane;
S24: keeping the clusters whose inlier ratio exceeds 80% as the initial clusters. The 80% inlier ratio is only a reference threshold; the first set threshold may be changed as needed in a specific implementation.
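The plane-fitting screen of steps S23-S24 can be illustrated with a least-squares plane fit via SVD; the 1 cm inlier distance threshold below is an assumed value, and a RANSAC plane fit could equally be substituted.

```python
import numpy as np

def screen_cluster(points, inlier_dist=0.01, min_inlier_ratio=0.8):
    """Fit a plane to a backfilled cluster and keep the cluster only if
    the inlier ratio exceeds the first set threshold (80% here).
    Returns (keep, unit plane normal, boolean inlier mask)."""
    centroid = points.mean(axis=0)
    # The right-singular vector of the smallest singular value is the
    # least-squares plane normal of the centered points.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    dist = np.abs((points - centroid) @ normal)   # point-to-plane distances
    inliers = dist < inlier_dist
    return inliers.mean() > min_inlier_ratio, normal, inliers
```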
Because the main noise source of the laser radar is ranging error, in this embodiment denoising the potential calibration-plate object specifically means denoising the ranging results of the points in the potential calibration-plate object and then recovering the coordinates of the denoised points. Specifically, the raw measurement of any point $^{C}p = (x, y, z)$ in the point cloud Cartesian coordinate system is usually described in a spherical coordinate system as $^{S}p = (\alpha, \omega, R)$, where $\alpha = \arctan(y/x)$ is the azimuth angle, $\omega = \arcsin\!\big(z/\sqrt{x^{2}+y^{2}+z^{2}}\big)$ is the elevation angle, $R$ is the ranging result, and $x$, $y$, $z$ are the values of the point in the point cloud Cartesian coordinate system. The ranging result is considered to be $R = R^{*} + \delta R$, where $R^{*}$ is the true value and $\delta R$ is the error. Writing the unit ray direction as $\vec{u}(\alpha, \omega) = (\cos\omega\cos\alpha,\ \cos\omega\sin\alpha,\ \sin\omega)$ and the fitted plane as $\vec{n} \cdot p = d$, the true value of the ranging result is determined from the three-dimensional plane normal vector $\vec{n}$ as
$$R^{*} = \frac{d}{\vec{n} \cdot \vec{u}(\alpha, \omega)},$$
so the coordinates of the denoised point $^{C}p^{*} = (x^{*}, y^{*}, z^{*})$ are
$$^{C}p^{*} = R^{*}\,\vec{u}(\alpha, \omega).$$
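In code, this range-only correction amounts to intersecting each measurement ray with the fitted plane; the sketch below assumes the plane is stored as n·p = d with n the unit normal from step S23.

```python
import numpy as np

def denoise_to_plane(points, normal, d):
    """Replace each raw range by the range at which its viewing ray meets
    the fitted plane n.p = d, keeping azimuth and elevation unchanged.
    points : (N, 3) raw Cartesian points of a potential plate object."""
    R = np.linalg.norm(points, axis=1, keepdims=True)
    u = points / R                      # unit ray directions u(alpha, omega)
    R_star = d / (u @ normal)           # true ranges R* on the plane
    return u * R_star[:, None]          # denoised coordinates (x*, y*, z*)
```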
in step S4 in this embodiment, projecting the noise-reduced potential calibration board object onto a plane, and further screening the initial clusters by combining the real area of the calibration board to obtain target clusters specifically includes the following steps:
s41: solving a three-dimensional plane normal vector
Figure BDA00035923950300000810
Axial angle transformation vector to z-axis
Figure BDA00035923950300000811
S42: transforming vectors according to shaft angle
Figure BDA00035923950300000812
Solving a corresponding three-dimensional rotation matrix R;
s43: solving the translation vector by the mean value of the coordinates of all local points in the potential calibration plate object on the x axis, the y axis and the z axis respectively
Figure BDA00035923950300000813
S44: using three-dimensional rotation matrix R and translation vector
Figure BDA0003592395030000091
Forming projective transformation matrices
Figure BDA0003592395030000092
Wherein R isTA transposed matrix denoted as R that projects all intra-office points within the potential calibration plate object onto a two-dimensional plane;
s45: solving convex hull of inner point of local inner point by convex hull algorithm
Figure BDA0003592395030000093
And find the convex hull
Figure BDA0003592395030000094
Area of (2)
Figure BDA0003592395030000095
Figure BDA0003592395030000096
Wherein x, y and z are values of points in a point cloud data Cartesian coordinate system,
Figure BDA0003592395030000097
is a convex hull
Figure BDA0003592395030000098
The number of the points is contained, i is the serial number of the point currently processed;
s46: comparison of areas
Figure BDA0003592395030000099
Further screening the initial clusters with the real area of the calibration plate to obtain target clusters; specifically, initial clusters having an area a that is less than three-quarters or greater than four-fifths of the true area of the calibration plate are removed.
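Steps S41-S45 can be condensed into a few lines of Python; the Rodrigues rotation and the use of SciPy's ConvexHull (whose .volume attribute is the area for two-dimensional input) are standard, while the exact form of the projective transformation remains the reconstruction given above.

```python
import numpy as np
from scipy.spatial import ConvexHull

def cluster_area_on_plane(inliers, normal):
    """Rotate the inlier points so the fitted normal aligns with the
    z-axis, drop z, and measure the convex-hull area of the 2-D points."""
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(normal, z)                       # rotation axis (unnormalized)
    s, c = np.linalg.norm(v), float(normal @ z)   # sin and cos of the angle
    if s < 1e-9:
        Rot = np.eye(3)                           # normal already along z
    else:
        K = np.array([[0, -v[2], v[1]],
                      [v[2], 0, -v[0]],
                      [-v[1], v[0], 0]]) / s      # unit-axis cross matrix
        theta = np.arctan2(s, c)
        Rot = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    t = inliers.mean(axis=0)                      # translation vector
    pts2d = ((inliers - t) @ Rot.T)[:, :2]        # project onto the plane
    return ConvexHull(pts2d).volume, pts2d        # .volume = area in 2-D
```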
Because the straight-line fitting carries a certain error, the corner points also carry a certain error, so this embodiment further comprises step S6:
S6: optimizing the obtained calibration-plate corner points.
Step S6 specifically comprises the following steps:
S61: processing the obtained calibration-plate corner points with the real size of the calibration plate:
$$T = \begin{bmatrix} a_{00} & a_{01} & b_{0} \\ a_{10} & a_{11} & b_{1} \\ 0 & 0 & 1 \end{bmatrix}, \qquad T^{*} = \arg\min_{T \in SE(2)} \sum_{i} \left\lVert T\,\hat{c}_{i} - c_{i} \right\rVert^{2}, \qquad c_{i}^{*} = T^{*}\,\hat{c}_{i},$$
wherein the matrix $T$ is a two-dimensional transformation matrix satisfying the Lie group SE(2); $a_{00}$, $a_{01}$, $a_{10}$, $a_{11}$, $b_{0}$, $b_{1}$ are the elements of $T$; $T^{*}$ is the optimization result for $T$; $c_{i}^{*}$ are the optimized corner points; $c_{i}$ are the obtained calibration-plate corner points; $\hat{c}_{i}$ is the real corner-point matrix template constructed from the size information of the calibration plate; and $i$ is the index of the corner point currently processed;
S62: converting the optimized corner points into three-dimensional corner coordinates with the inverse transformation $P^{-1}$ of the projective transformation matrix $P$.
In this embodiment, a calibration plate with 42 corner points in 7 rows and 6 columns is adopted; each small grid is a square with a side length of 5 cm, and the checkerboard is surrounded in turn by a white border 5 cm wide and a black border 5 cm wide. Taking the lower-left corner as the first corner point and arranging the corner points row by row yields the template matrix $\hat{C}$ containing the coordinates of all corner points, from which the optimized corner points are finally converted into the three-dimensional corner coordinates $^{C}c_{i}^{*}$.
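The SE(2) template alignment of step S61 is, in effect, a two-dimensional Procrustes problem with a closed-form (Kabsch) solution; the sketch below assumes the template and the measured corners are supplied in the same row-major order. Replacing each fitted corner by the transformed template corner enforces the known board geometry exactly.

```python
import numpy as np

def snap_corners_to_template(corners2d, template2d):
    """Best rigid 2-D alignment T* of the true corner template to the
    measured corners (least squares), returning the snapped corners
    c_i* = T* c_hat_i. Inputs are (N, 2) arrays in matching order."""
    mu_c, mu_t = corners2d.mean(axis=0), template2d.mean(axis=0)
    H = (template2d - mu_t).T @ (corners2d - mu_c)   # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # enforce a proper rotation, det = +1
        Vt[-1] *= -1
        R = Vt.T @ U.T
    b = mu_c - R @ mu_t                # translation part of T*
    return template2d @ R.T + b
```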
With the point cloud calibration-plate corner points and the image calibration-plate corner points obtained by the above steps, and their one-to-one correspondence, the joint calibration of the laser radar and the camera can be solved with the PnP (Perspective-n-Point) algorithm.
In this scheme, when the calibration-plate corner points are extracted from the point cloud data, the potential calibration-plate objects are denoised and then further screened against the real area of the calibration plate, so that more accurate calibration-plate corner points can be obtained subsequently; the calibration precision of the laser radar and the camera can thus be further improved, especially for mechanically scanning laser radars whose point cloud data are sparse.
Example 2:
As shown in Fig. 4, a second embodiment differs from embodiment 1 only in that, after denoising the potential calibration-plate objects, an edge extraction based on laser reflection intensity is additionally performed, specifically comprising the following steps:
for each potential calibration-plate object, performing binarization processing on all points based on the reflection intensity;
extracting edge points of the potential calibration-plate object based on the binarization result.
Specifically: for each potential calibration-plate object, the reflection intensities of all its points are collected and sorted from low to high, and the value at the 20th percentile is taken as the segmentation threshold for binarizing all points in the object, i.e. reflection intensities above the threshold are set to the highest intensity and those below it to the lowest intensity. The points of each row are then traversed from left to right; for a point $p_{i}$, if the point $p_{i-1}$ has a higher reflection intensity than both $p_{i}$ and $p_{i+1}$, or the point $p_{i+1}$ has a higher reflection intensity than both $p_{i}$ and $p_{i-1}$, the point $p_{i}$ is determined to be a reflection-intensity edge point.
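A compact sketch of this intensity-based edge rule, with the 20th-percentile threshold read literally from the description, is:

```python
import numpy as np

def intensity_edge_points(row_intensity):
    """Binarize one row of reflection intensities at the 20th percentile
    and mark p_i as an edge point when one neighbor is brighter than both
    p_i and the opposite neighbor, as described above."""
    thr = np.percentile(row_intensity, 20)      # lowest 20% -> low class
    b = (row_intensity > thr).astype(np.int8)   # 1 = high, 0 = low intensity
    edges = []
    for i in range(1, len(b) - 1):
        if (b[i - 1] > b[i] and b[i - 1] > b[i + 1]) or \
           (b[i + 1] > b[i] and b[i + 1] > b[i - 1]):
            edges.append(i)
    return edges
```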
Example 3:
Fig. 5 shows an embodiment of the joint calibration method for a laser radar and a camera, which comprises the following steps:
respectively extracting the calibration plate from the laser radar point cloud data and the camera image data;
extracting the calibration-plate corner points in the laser radar point cloud data with the extraction method of embodiment 1 or embodiment 2;
extracting the calibration-plate corner points in the camera image data;
and solving with the PnP algorithm, using the calibration-plate corner points in the point cloud data and those in the image data, to obtain the joint calibration of the laser radar and the camera.
In a specific embodiment, solving the joint calibration of the laser radar and the camera with the PnP algorithm from the calibration-plate corner points in the point cloud data and in the image data specifically adopts a RANSAC-based iterative algorithm comprising the following steps:
forming 3D-2D corner pairs from the series of calibration-plate corner points in the point cloud data obtained above (three-dimensional corner points) and the calibration-plate corner points in the camera image data (two-dimensional corner points);
selecting 4 of the 3D-2D corner pairs in each iteration and solving with the EPnP algorithm to obtain a rough coordinate transformation;
projecting all three-dimensional corner points into the camera image coordinate system according to the rough coordinate transformation and calculating the reprojection errors;
separating inliers and outliers according to the reprojection errors: 3D-2D corner pairs with an error below a certain threshold are inliers, and those above it are outliers;
solving a better coordinate transformation over all inliers with the EPnP algorithm and updating the reprojection errors; if solving for the better coordinate transformation fails, the previous result is retained; the iteration is repeated until the iteration count reaches the requirement or the reprojection error is small enough.
The optimal coordinate transformation obtained is the transformation from the laser radar coordinate system to the camera coordinate system for the current frame data, i.e. the calibration of the laser radar and the camera is completed.
Of course, the RANSAC-based iterative algorithm is only a reference implementation in this embodiment and cannot be understood as limiting this solution; a person skilled in the art may certainly use other corresponding algorithms to complete this operation. In addition, the number of 3D-2D corner pairs is not to be construed as limiting.
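For reference, OpenCV bundles the RANSAC loop and the EPnP solver into a single call, which reproduces the structure of the iteration above; the reprojection-error threshold and iteration count below are assumed values.

```python
import cv2
import numpy as np

def solve_extrinsics(corners3d, corners2d, K, dist):
    """RANSAC + EPnP solve for the lidar-to-camera transformation from
    matched 3D-2D calibration-plate corner pairs."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        corners3d.astype(np.float64), corners2d.astype(np.float64),
        K, dist, flags=cv2.SOLVEPNP_EPNP,
        reprojectionError=3.0, iterationsCount=100)
    if not ok:
        raise RuntimeError('PnP solve failed')
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix from the rotation vector
    return R, tvec, inliers
```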
In this embodiment, distortion correction is performed on the camera image data before the calibration-plate corner points are extracted from it, so that the extracted corner points correspond to the calibration-plate corner points of the point cloud data and the calibration is not affected.
In this embodiment, the calibration-plate corner points in the camera image data can be extracted by combining the two tools findChessboardCorners() and cornerSubPix() provided in the OpenCV library; of course, the skilled person may also extract them in other ways, which is not limited here.
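A minimal use of those two tools, assuming the 7x6 board of embodiment 1 (42 inner corners) and an already undistorted grayscale image, looks like this:

```python
import cv2

def image_plate_corners(gray, pattern=(6, 7)):
    """Detect the inner chessboard corners and refine them to sub-pixel
    accuracy with findChessboardCorners() and cornerSubPix()."""
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        return None
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    return cv2.cornerSubPix(gray, corners, (5, 5), (-1, -1), criteria)
```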
Fig. 6 to 8 show an embodiment of a combined calibration system for a laser radar and a camera, which is used for implementing the combined calibration method for a laser radar and a camera, and includes a calibration module, a point cloud data system and an image data system, both of which are in communication connection with the calibration module; the point cloud data system comprises a laser radar, an edge point extraction module, a clustering screening module, a noise reduction module, a screening module and a fitting module which are sequentially in communication connection, wherein the edge point extraction module is in communication connection with the laser radar, and the fitting module is in communication connection with a calibration module; the image data system comprises a camera and an image corner extraction module in communication connection with the camera, and the image corner extraction module is also in communication connection with the calibration module;
the laser radar and the camera are used for scanning the environment of the same calibration plate and respectively obtaining point cloud data and image data;
the edge point extraction module is used for extracting edge points of a calibration plate in the point cloud data;
the cluster screening module is used for clustering the edge points and screening the obtained clusters to obtain potential calibration plate objects;
the noise reduction module is used for reducing noise of the potential calibration board object;
the screening module is used for projecting the denoised potential calibration board object to a plane and further screening the initial clusters by combining the real area of the calibration board to obtain target clusters;
the fitting module is used for fitting straight lines to the edges of the target clusters with a line-fitting algorithm to obtain a plurality of edge lines and solving the intersection points of these edge lines to obtain the calibration-plate corner points;
the image corner extraction module is used for extracting the calibration-plate corner points in the camera image data;
and the calibration module is used for solving with the PnP algorithm, from the calibration-plate corner points obtained from the point cloud data and those obtained from the image data, to obtain the joint calibration of the laser radar and the camera.
The lidar in this embodiment may be a solid state lidar or a mechanical rotary lidar, which is not limited herein.
The camera in this embodiment may be a visible light grayscale camera, a color camera, an infrared camera, or the like, and is not limited herein.
The number of the lidar and the cameras in the embodiment is not limited to one, and may be any number as long as the lidar and the cameras have corresponding calibration board views, which is well known to those skilled in the art and is not limited herein.
The present invention has been described with reference to flowchart illustrations or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application, and it is understood that each flow or block of the flowchart illustrations or block diagrams, and combinations of flows or blocks in the flowchart illustrations or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Although embodiments of the present invention have been shown and described above, it should be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; variations, modifications, substitutions and alterations may be made to them by those of ordinary skill in the art within the scope of the present invention.
It should be understood that the above-described embodiments are merely examples given to illustrate the present invention clearly and do not limit its implementations. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments exhaustively. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the claims of the present invention.

Claims (10)

1. A method for extracting calibration-plate corner points from laser radar point cloud data, characterized by comprising the following steps:
extracting edge points of the calibration plate in the point cloud data;
clustering the edge points and screening the obtained clusters to obtain initial clusters, marked as potential calibration-plate objects;
denoising the potential calibration-plate objects to obtain denoised potential calibration-plate objects;
projecting the denoised potential calibration-plate objects onto a plane and further screening the initial clusters against the real area of the calibration plate to obtain target clusters;
and fitting straight lines to the edges of the target clusters with a line-fitting algorithm to obtain a plurality of edge lines, then solving the intersection points of these edge lines to obtain the calibration-plate corner points.
2. The method for extracting calibration-plate corner points from laser radar point cloud data according to claim 1, wherein extracting the edge points of the calibration plate in the point cloud data specifically comprises the following steps:
adjusting the point cloud sequence to be row-first;
recording the number of points in each row of the current frame;
and classifying all points based on a single-point smoothness evaluation quantity to obtain all left edge points and right edge points and form an edge point set.
3. The method for extracting calibration-plate corner points from laser radar point cloud data according to claim 2, wherein the clustering of the edge points is specifically performed in the edge point set with a single-linkage algorithm, a multiple-linkage algorithm or DBSCAN, and each cluster stores at least the following information: the maximum and minimum values on the x, y and z axes of the laser radar Cartesian coordinate system, the index interval of the points on each row, and the coordinates of all points in the cluster.
4. The method for extracting calibration-plate corner points from laser radar point cloud data according to claim 3, wherein screening the obtained clusters to obtain initial clusters specifically comprises the following steps:
solving the bounding box of each cluster from the maximum and minimum values on the x, y and z axes of the laser radar Cartesian coordinate system, and performing a primary screening according to the bounding boxes;
filling all original points back into the clusters after the primary screening, according to the index interval of the points on each row;
performing a plane fit on each cluster to obtain the three-dimensional plane normal vector of the fitted calibration plate in the laser radar Cartesian coordinate system and all inlier points contained in the plane;
and keeping the clusters whose inlier ratio is larger than a first set threshold as the initial clusters.
5. The method for extracting calibration-plate corner points from laser radar point cloud data according to claim 4, wherein the denoising of the potential calibration-plate object specifically means denoising the ranging results of the points in the potential calibration-plate object using the three-dimensional plane normal vector, and then recovering the coordinates of the denoised points.
6. The method according to claim 5, wherein after denoising the potential calibration-plate object, the following steps are further performed:
for each potential calibration-plate object, performing binarization processing on all points based on the reflection intensity;
and extracting edge points of the potential calibration-plate object based on the binarization result.
7. The method for extracting calibration-plate corner points from laser radar point cloud data according to claim 6, wherein projecting the denoised potential calibration-plate object onto a plane and further screening the initial clusters against the real area of the calibration plate to obtain the target clusters specifically comprises the following steps:
calculating the axis-angle transformation vector $\vec{\theta}$ from the normal vector $\vec{n}$ of the three-dimensional plane to the z-axis;
solving the corresponding three-dimensional rotation matrix $R$ according to the axis-angle transformation vector $\vec{\theta}$;
finding the translation vector $\vec{t}$ by averaging all inlier points in the potential calibration-plate object;
forming from the three-dimensional rotation matrix $R$ and the translation vector $\vec{t}$ the projective transformation matrix $P = \begin{bmatrix} R^{T} & -R^{T}\vec{t} \\ \mathbf{0}^{T} & 1 \end{bmatrix}$, wherein $R^{T}$ denotes the transpose of $R$, and projecting all inlier points within the potential calibration-plate object onto a two-dimensional plane;
solving the convex hull $\mathcal{H}$ of the inlier points with a convex-hull algorithm and finding its area $A_{\mathcal{H}}$;
and comparing the area $A_{\mathcal{H}}$ with the real area of the calibration plate to further screen the initial clusters and obtain the target clusters.
8. The method for extracting calibration-plate corner points from laser radar point cloud data according to claim 7, further comprising optimizing the obtained calibration-plate corner points, specifically comprising the steps of:
processing the obtained calibration-plate corner points with the real size of the calibration plate:
$$T = \begin{bmatrix} a_{00} & a_{01} & b_{0} \\ a_{10} & a_{11} & b_{1} \\ 0 & 0 & 1 \end{bmatrix}, \qquad T^{*} = \arg\min_{T \in SE(2)} \sum_{i} \left\lVert T\,\hat{c}_{i} - c_{i} \right\rVert^{2}, \qquad c_{i}^{*} = T^{*}\,\hat{c}_{i},$$
wherein the matrix $T$ is a two-dimensional transformation matrix satisfying the Lie group SE(2); $a_{00}$, $a_{01}$, $a_{10}$, $a_{11}$, $b_{0}$, $b_{1}$ are the elements of $T$; $T^{*}$ is the optimization result for $T$; $c_{i}^{*}$ are the optimized corner points; $c_{i}$ are the obtained calibration-plate corner points; $\hat{c}_{i}$ is the real corner-point matrix template constructed from the size information of the calibration plate; and $i$ is the index of the corner point currently processed;
and converting the optimized corner points into three-dimensional corner coordinates with the inverse of the projective transformation matrix $P$.
9. A joint calibration method for a laser radar and a camera, characterized by comprising the following steps:
respectively extracting the calibration plate from the laser radar point cloud data and the camera image data;
extracting the calibration-plate corner points in the laser radar point cloud data with the method for extracting calibration-plate corner points from laser radar point cloud data according to any one of claims 1 to 8;
extracting the calibration-plate corner points in the camera image data;
and solving with the PnP algorithm, using the calibration-plate corner points in the point cloud data and those in the image data, to obtain the joint calibration of the laser radar and the camera.
10. The joint calibration method for the laser radar and the camera according to claim 9, wherein the solving with the PnP algorithm using the calibration-plate corner points in the point cloud data and those in the image data specifically adopts a RANSAC-based iterative algorithm comprising the steps of:
forming 3D-2D corner pairs from the calibration-plate corner points in the point cloud data and the calibration-plate corner points in the image data; selecting a plurality of the 3D-2D corner pairs in each iteration and solving with the EPnP algorithm to obtain a rough coordinate transformation of the 3D-2D corner pairs;
projecting all three-dimensional corner points into the camera image coordinate system according to the rough coordinate transformation and calculating the reprojection errors;
separating inliers and outliers according to the reprojection errors;
and solving the optimal coordinate transformation over all the inliers with the EPnP algorithm while updating the reprojection errors; the optimal coordinate transformation obtained is the calibration of the laser radar and the camera.
CN202210379970.1A 2022-04-12 2022-04-12 Combined calibration method and system for laser radar and camera Pending CN114742898A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202210379970.1A | 2022-04-12 | 2022-04-12 | Combined calibration method and system for laser radar and camera

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202210379970.1A | 2022-04-12 | 2022-04-12 | Combined calibration method and system for laser radar and camera

Publications (1)

Publication Number | Publication Date
CN114742898A | 2022-07-12

Family

ID=82280964

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202210379970.1A (pending) | Combined calibration method and system for laser radar and camera | 2022-04-12 | 2022-04-12

Country Status (1)

Country Link
CN (1) CN114742898A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination