CN113256696B - External parameter calibration method of laser radar and camera based on natural scene - Google Patents

External parameter calibration method of laser radar and camera based on natural scene Download PDF

Info

Publication number
CN113256696B
CN113256696B CN202110716414.4A CN202110716414A CN113256696B CN 113256696 B CN113256696 B CN 113256696B CN 202110716414 A CN202110716414 A CN 202110716414A CN 113256696 B CN113256696 B CN 113256696B
Authority
CN
China
Prior art keywords
stage
particles
particle
representing
particle swarm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110716414.4A
Other languages
Chinese (zh)
Other versions
CN113256696A (en
Inventor
王宇茹
李健
孙毅
杨晓慧
王杰
孙振平
史美萍
叶磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN202110716414.4A priority Critical patent/CN113256696B/en
Publication of CN113256696A publication Critical patent/CN113256696A/en
Application granted granted Critical
Publication of CN113256696B publication Critical patent/CN113256696B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/004Artificial life, i.e. computing arrangements simulating life
    • G06N3/006Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/66Analysis of geometric attributes of image moments or centre of gravity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Geometry (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a laser radar and camera external reference calibration method based on a natural scene, which is characterized in that based on radar point cloud and camera images which are acquired simultaneously, pedestrians in the natural scene are taken as targets, and through point cloud pedestrian detection and image pedestrian instance segmentation, central points and vertexes of corresponding pedestrians in the point cloud and the image are extracted, so that key point matching of the point cloud and the image is obtained. Taking the key points as initial points of the particle swarm, initializing a translation vector and a rotation matrix by using a particle swarm algorithm, performing iteration on the key point positions for a limited number of times by using a random particle swarm algorithm, then directly optimizing a rotation vector and a feature matrix by using the random particle swarm algorithm, and finally converging to obtain a stable rotation matrix and a stable translation vector. The problem that the point cloud and the image can not be strictly corresponding is solved, and more accurate key point positions are obtained, so that more accurate external reference results are obtained, and the interpretability is higher.

Description

External parameter calibration method of laser radar and camera based on natural scene
Technical Field
The invention relates to the technical field of computer vision and robots, in particular to a laser radar and camera external parameter calibration method based on a natural scene.
Background
As the application of the laser radar and the camera in the unmanned system becomes more and more extensive, the method of using the multi-sensor fusion to improve the perception level of the unmanned system gets more and more attention. The camera can obtain high resolution RGB pictures with sufficient color, texture etc information but lacking depth information and being sensitive to light. Radar can provide three dimensional spatial information of a target but with low resolution and no color information. The combination of the two can well make up respective defects and more comprehensively react the characteristics of the target. Because the camera and the radar are in different coordinate systems, and the information fusion of the camera and the radar needs a coordinate transformation matrix between the radar and the camera, accurate external reference calibration is the key for accurate fusion of the multiple sensors.
For external reference calibration of cameras and radars, a chessboard and other specific calibration objects are used in the traditional calibration method, the methods often need to extract lines or surfaces and the like of the calibration objects, and the method based on the artificial calibration objects takes long time for early preparation and processing and needs specific calibration scenes. Geometric features such as planes, line segments, parallel line vanishing points and the like in a natural scene are used for realizing a non-target calibration method, high requirements are provided for accurate line and surface extraction in images and point clouds, and the characteristics of the lines, the surfaces and the like often exist in artificial environments such as indoor wall surfaces, buildings and the like, so that the calibration scene of the method is limited. To reduce the dependency on features in the scene, there are methods that do not utilize feature matching but utilize self-motion estimation to compute the external parameters, motion estimation based methods require accurate and sufficient self-motion state estimation while requiring the presence of three-dimensional geometric features and trackable visual features in the scene, the lack of these features can lead to positional slippage in LiDAR range estimation, affecting the accuracy of the calibration results, and motion estimation based methods are easier to move and calibrate at robotic platforms, but are not suitable for vehicle type platforms, such as unmanned vehicles, because vehicles do not easily make sufficient motion in a short time. In addition, with the success of deep learning in recent years, methods for applying deep learning to camera-radar external parameter estimation also appear, however, a network-based method needs a depth true value as supervision data, an accurate depth true value is often difficult to obtain in a real scene, the method of the deep network has poor interpretability, and the generalization capability has certain limitation.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a laser radar and camera external parameter calibration method based on a natural scene, which overcomes the problem that point cloud and images can not be strictly corresponding, and obtains more accurate key point positions, thereby obtaining more accurate external parameter results.
In order to achieve the above object, the present invention provides a method for calibrating external parameters of a laser radar and a camera based on a natural scene, comprising the following steps:
step 1, acquiring a laser point cloud and a camera image which are synchronous and have pedestrians, and acquiring detection frames of different individual pedestrians in the laser point cloud and masks of the different individual pedestrians in the camera image;
step 2, establishing mapping between the laser point cloud and pedestrians in the camera image based on the detection frame and the mask to obtain a plurality of 2D-3D matching point pairs of the camera image and the laser point cloud;
step 3, inputting the 2D-3D matching point pairs serving as first-stage particles into a first-stage particle swarm optimization model, iteratively optimizing the 2D-3D matching point pairs, and obtaining initial external parameters based on the optimized 2D-3D matching point pairs;
and 4, inputting the initial external parameters serving as second-stage particles into a second-stage particle swarm optimization model for iterative optimization to obtain actual external parameters between the camera and the laser radar.
In one embodiment, in step 1:
inputting the laser point cloud into a pointpilars point cloud detection network, and outputting detection frames of different individual pedestrians;
the camera images are input into an example segmentation network MRCNN, and masks of different individual pedestrians are output.
In one embodiment, the specific process of step 2 is:
step 2.1, converting the laser point cloud into a depth map, and determining coordinates corresponding to the radar point cloud and a detection frame according to pixel values in the detection frame in the depth map;
step 2.2, corresponding the depth map to the pedestrian in the camera image through manual marking to obtain pedestrian mapping of the laser point cloud and the camera image, and then extracting the central point and the head vertex of the pedestrian in the laser point cloud and the camera image respectively to obtain a plurality of groups of matching point pairs, wherein each group of matching point pairs comprises a 2D point in the camera image and a 3D point in the laser point cloud;
step 2.3, extracting a plurality of groups of 2D points and 3D points with mapping relations in the camera image, around the corresponding 2D points and around the corresponding 3D points in the laser point cloud respectively for each group of matching point pairs in the step 2.2 to obtain a plurality of new matching point pairs;
and 2.4, integrating the matching point pairs obtained in the step 2.2 and the step 2.3 to obtain the 2D-3D matching point pair in the step 2.
In one embodiment, in step 3, the iterative optimization of the first-stage particle swarm optimization model specifically includes the following steps:
step 3.1, constructing first-stage particles based on all the 2D-3D matching point pairs;
step 3.2, calculating a first-stage evaluation function of each first-stage particle;
step 3.3, taking the first-stage particle with the lowest first-stage evaluation function as the optimal first-stage particle in the current population, and calculating the updating speed of each iteration of each first-stage particle;
step 3.4, updating the first-stage particles based on the updating speed;
step 3.5, calculating a second-stage evaluation function of each first-stage particle after the particle is updated;
step 3.6, extracting the first-stage particles with the lowest second-stage evaluation function, and expanding the first-stage particles to obtain particle swarm of next iteration;
and 3.7, judging whether the iteration times are maximum, if so, outputting the external parameter corresponding to the optimal particle in the current population as an initial external parameter, and otherwise, returning to the step 3.2 after one iteration.
In one embodiment, the iterative optimization of the first-stage particle swarm optimization model specifically comprises the following steps:
step 3.1, constructing first-stage particles based on all the 2D-3D matching point pairs:
Figure 879376DEST_PATH_IMAGE001
in the formula (I), the compound is shown in the specification,
Figure 156774DEST_PATH_IMAGE002
a particle swarm representing a first-stage particle swarm optimization model,
Figure 980373DEST_PATH_IMAGE003
is shown as
Figure 103050DEST_PATH_IMAGE004
The coordinates of the 2D points in the set of 2D-3D matching point pairs,
Figure 328495DEST_PATH_IMAGE005
is shown as
Figure 511215DEST_PATH_IMAGE004
The coordinates of the 3D points in the set of 2D-3D matching point pairs,
Figure 822110DEST_PATH_IMAGE006
representing the first stage in a particle swarm optimization model
Figure 748478DEST_PATH_IMAGE004
The number of the first-stage particles,
Figure 828430DEST_PATH_IMAGE007
representing the total number of the 2D-3D matching point pairs and the total number of first-stage particles in the first-stage particle swarm optimization model;
step 3.2, calculating a first-stage evaluation function of each first-stage particle:
Figure 182051DEST_PATH_IMAGE008
in the formula (I), the compound is shown in the specification,
Figure 750DEST_PATH_IMAGE009
is shown as
Figure 465230DEST_PATH_IMAGE004
A first-stage evaluation function for the first-stage particles, K being a camera intrinsic parameter,
Figure 665267DEST_PATH_IMAGE010
is as follows
Figure 455368DEST_PATH_IMAGE011
The corresponding external parameter of the first-stage particle, P represents the coordinate of any 3D point in the detection frame,
Figure 475277DEST_PATH_IMAGE012
representing the set of all 3D points in all detection boxes, HSS is a binary function, represented as:
Figure 743447DEST_PATH_IMAGE013
in the formula (I), the compound is shown in the specification,
Figure 797991DEST_PATH_IMAGE014
representing the mask in the camera image corresponding to the detection box,
Figure 227835DEST_PATH_IMAGE015
to represent
Figure 735040DEST_PATH_IMAGE016
The point P is projected to fall on the mask,
Figure 806901DEST_PATH_IMAGE017
indicating that the 3D point P does not fall in the mask after projection;
step 3.3, taking the first-stage particle with the lowest first-stage evaluation function as the optimal first-stage particle in the current population, and calculating the updating speed of each iteration of each first-stage particle:
Figure 450372DEST_PATH_IMAGE018
in the formula (I), the compound is shown in the specification,
Figure 582276DEST_PATH_IMAGE019
is shown as
Figure 311198DEST_PATH_IMAGE020
The first stage particles are in
Figure 186750DEST_PATH_IMAGE021
The update rate at the time of the sub-iteration,
Figure 215886DEST_PATH_IMAGE022
representing the optimal first phase particles in the current population,
Figure 253112DEST_PATH_IMAGE023
representing the first phase particles that are optimal for the historical iteration,
Figure 266067DEST_PATH_IMAGE024
in order to be the step size,
Figure 679731DEST_PATH_IMAGE025
is a random number, and is a random number,
Figure 566303DEST_PATH_IMAGE026
wherein, in the step (A),
Figure 508851DEST_PATH_IMAGE027
=1,2,···,
Figure 212365DEST_PATH_IMAGE028
Figure 164140DEST_PATH_IMAGE028
expressing the maximum iteration number of the first-stage particle swarm optimization model
Figure 902289DEST_PATH_IMAGE029
When the ratio is not less than 1,
Figure 281318DEST_PATH_IMAGE030
=0;
and 3.4, updating the first-stage particles based on the updating speed:
Figure 472128DEST_PATH_IMAGE031
step 3.5, calculating the second-stage evaluation function of each first-stage particle after the particle update:
Figure 227594DEST_PATH_IMAGE032
in the formula (I), the compound is shown in the specification,
Figure 85829DEST_PATH_IMAGE033
is shown as
Figure 635759DEST_PATH_IMAGE034
A second-stage evaluation function for the first-stage particles, K being a camera intrinsic parameter,
Figure 313865DEST_PATH_IMAGE035
indicating updated the second
Figure 873022DEST_PATH_IMAGE036
The corresponding external parameter of the first-stage particle, P represents the coordinate of any 3D point in the detection frame,
Figure 320184DEST_PATH_IMAGE037
representing the set of all 3D points in all detection boxes,
Figure 306595DEST_PATH_IMAGE038
is a binary function expressed as:
Figure 737576DEST_PATH_IMAGE039
in the formula (I), the compound is shown in the specification,
Figure 569266DEST_PATH_IMAGE040
representing the mask in the camera image corresponding to the detection box,
Figure 136513DEST_PATH_IMAGE041
indicating that the 3D point P falls within the mask after projection,
Figure 28246DEST_PATH_IMAGE042
indicating that the 3D point P does not fall in the mask after projection;
step 3.6, extracting the first-stage particles with the lowest second-stage evaluation function
Figure 680944DEST_PATH_IMAGE043
,1≤A≤
Figure 316325DEST_PATH_IMAGE044
And is based on
Figure 3658DEST_PATH_IMAGE045
Expanding to obtain the particle swarm of the next iteration
Figure 328942DEST_PATH_IMAGE046
Figure 468936DEST_PATH_IMAGE047
Figure 908008DEST_PATH_IMAGE048
Figure 449847DEST_PATH_IMAGE049
Figure 948962DEST_PATH_IMAGE050
Figure 576252DEST_PATH_IMAGE051
In the formula (I), the compound is shown in the specification,
Figure 553436DEST_PATH_IMAGE052
is the step length;
step 3.7, judge
Figure 215361DEST_PATH_IMAGE053
If yes, outputting an external parameter corresponding to the optimal first-stage particle in the current population as an initial external parameter, otherwise, commanding
Figure 619798DEST_PATH_IMAGE054
=
Figure 203226DEST_PATH_IMAGE054
+1, and taking the optimal first-stage particle in the current population as the optimal first-stage particle for historical iteration, and returning to the step 3.2.
In one embodiment, in step 4, the iterative optimization of the second-stage particle swarm optimization model specifically includes the following steps:
step 4.1, guiding expansion based on the gradient of each variable in the initial external parameter to obtain a particle swarm of a second-stage particle swarm optimization model;
step 4.2, calculating a first-stage evaluation function of each second-stage particle;
4.3, taking the second-stage particle with the lowest first-stage evaluation function as the optimal second-stage particle in the current population, and calculating the updating speed of each second-stage particle in each iteration;
4.4, updating the second-stage particles based on the updating speed;
step 4.5, calculating a second-stage evaluation function of each second-stage particle after the particle is updated;
and 4.6, extracting second-stage particles with the lowest second-stage evaluation function, judging whether the iteration frequency is the maximum or not, if so, outputting the second-stage particles with the lowest second-stage evaluation function as actual external parameters between the camera and the laser radar, otherwise, iterating once and updating the second-stage particles with the optimal historical iteration, and returning to the step 4.2.
In one embodiment, the iterative optimization of the second-stage particle swarm optimization model specifically comprises the following steps:
step 4.1, guiding expansion based on the gradient of each variable in the initial external parameters to obtain a particle swarm of the second-stage particle swarm optimization model:
Figure 249679DEST_PATH_IMAGE055
Figure 500532DEST_PATH_IMAGE056
Figure 341449DEST_PATH_IMAGE057
Figure 412173DEST_PATH_IMAGE058
Figure 996738DEST_PATH_IMAGE059
Figure 102097DEST_PATH_IMAGE060
Figure 848337DEST_PATH_IMAGE061
in the formula (I), the compound is shown in the specification,
Figure 937515DEST_PATH_IMAGE062
a particle swarm representing a second stage particle swarm optimization model,
Figure 794613DEST_PATH_IMAGE063
representing the second stage in the particle swarm optimization model
Figure 285637DEST_PATH_IMAGE064
The number of the second-stage particles is,
Figure 205707DEST_PATH_IMAGE065
is shown as
Figure 516603DEST_PATH_IMAGE064
The azimuth angle in the individual second-stage particles,
Figure 177391DEST_PATH_IMAGE066
is shown as
Figure 257343DEST_PATH_IMAGE067
The tumbling angle in the individual second stage particles,
Figure 610964DEST_PATH_IMAGE068
is shown as
Figure 143576DEST_PATH_IMAGE067
The roll angle in the individual second-stage particles,
Figure 608056DEST_PATH_IMAGE069
is shown as
Figure 808093DEST_PATH_IMAGE064
The translation vector in each of the second-stage particles,
Figure 67036DEST_PATH_IMAGE070
representing the total number of second-stage particles in the second-stage particle swarm optimization model,
Figure 352524DEST_PATH_IMAGE071
indicating the azimuth angle in the initial outer reference,
Figure 620694DEST_PATH_IMAGE072
indicating the roll angle in the initial outer reference,
Figure 675238DEST_PATH_IMAGE073
indicating the roll angle in the initial profile,
Figure 105082DEST_PATH_IMAGE074
representing the translation vector in the initial outer reference,
Figure 877866DEST_PATH_IMAGE075
is the step length;
step 4.2, calculating the first-stage evaluation function of each second-stage particle:
Figure 684148DEST_PATH_IMAGE076
in the formula (I), the compound is shown in the specification,
Figure 593198DEST_PATH_IMAGE077
is shown as
Figure 459523DEST_PATH_IMAGE064
A first-stage evaluation function of second-stage particles, K being a camera parameter, P representing coordinates of any 3D point in the detection box,
Figure 188444DEST_PATH_IMAGE078
representing the set of all 3D points in all detection boxes, HSS is a binary function, represented as:
Figure 63997DEST_PATH_IMAGE079
in the formula (I), the compound is shown in the specification,
Figure 561974DEST_PATH_IMAGE080
representing the mask in the camera image corresponding to the detection box,
Figure 599200DEST_PATH_IMAGE081
indicating that the 3D point P falls within the mask after projection,
Figure 80997DEST_PATH_IMAGE082
indicating that the 3D point P does not fall in the mask after projection;
step 4.3, taking the second-stage particle with the lowest first-stage evaluation function as the optimal second-stage particle in the current population, and calculating the updating speed of each second-stage particle in each iteration:
Figure 229082DEST_PATH_IMAGE083
in the formula (I), the compound is shown in the specification,
Figure 109794DEST_PATH_IMAGE084
is shown as
Figure 317922DEST_PATH_IMAGE085
The second stage particles are in
Figure 287015DEST_PATH_IMAGE086
The update rate at the time of the sub-iteration,
Figure 238790DEST_PATH_IMAGE087
representing the optimal second-stage particles in the current population,
Figure 711360DEST_PATH_IMAGE088
representing the second-stage particles for which the historical iteration is optimal,
Figure 355968DEST_PATH_IMAGE089
in order to be the step size,
Figure 546778DEST_PATH_IMAGE090
is a random number, and is a random number,
Figure 302244DEST_PATH_IMAGE091
wherein, in the step (A),
Figure 894900DEST_PATH_IMAGE092
=1,2,···,
Figure 444830DEST_PATH_IMAGE093
Figure 122936DEST_PATH_IMAGE093
representing the iteration number of the second stage particle swarm optimization model when
Figure 416514DEST_PATH_IMAGE094
When the ratio is not less than 1,
Figure 129255DEST_PATH_IMAGE095
=0;
and 4.4, updating the second-stage particles based on the updating speed:
Figure 850086DEST_PATH_IMAGE096
step 4.5, calculating the second-stage evaluation function of each second-stage particle after the particle update:
Figure 281067DEST_PATH_IMAGE097
in the formula (I), the compound is shown in the specification,
Figure 378336DEST_PATH_IMAGE098
is shown as
Figure 211163DEST_PATH_IMAGE099
A second-stage evaluation function of second-stage particles, K being a camera parameter, P representing coordinates of any 3D point in the detection box,
Figure 102896DEST_PATH_IMAGE100
indicates all detectionsThe set of all 3D points in the box,
Figure 755594DEST_PATH_IMAGE101
is a binary function expressed as:
Figure 656554DEST_PATH_IMAGE102
in the formula (I), the compound is shown in the specification,
Figure 81238DEST_PATH_IMAGE103
representing the mask in the camera image corresponding to the detection box,
Figure 409451DEST_PATH_IMAGE104
indicating that the 3D point P falls within the mask after projection,
Figure 283866DEST_PATH_IMAGE105
indicating that the 3D point P has not fallen into the mask after projection
Figure 722938DEST_PATH_IMAGE106
Step 4.6, extracting the second-stage particles with the lowest second-stage evaluation function
Figure 264777DEST_PATH_IMAGE107
,1≤C≤
Figure 498313DEST_PATH_IMAGE108
Judgment of
Figure 860024DEST_PATH_IMAGE109
If yes, outputting
Figure 837207DEST_PATH_IMAGE110
As an actual external parameter between camera and lidar
Figure 233553DEST_PATH_IMAGE111
Will be
Figure 637990DEST_PATH_IMAGE112
As second-stage particles optimized for historical iteration
Figure 752576DEST_PATH_IMAGE113
And then returning to the step 4.2.
The beneficial technical effects of the invention are as follows:
1. compared with the calibration method using artificial calibration objects such as a chessboard and the like, the method provided by the invention takes the pedestrians in a natural scene as the target, does not depend on specific artificial calibration objects, saves the complex preparation work before calibration and is simple and convenient to operate;
2. compared with a calibration method using specific geometric characteristics such as parallel lines, planes and V-shaped angles, the method provided by the invention has more applicable scenes:
specific geometric characteristics such as parallel lines, V-shaped angles and the like are usually present in artificial environments such as indoor wall surfaces and building bodies, so that the application scene of a calibration algorithm of the geometric characteristics is limited, the method disclosed by the invention aims at visible pedestrians and is not limited by the specific geometric characteristics, and the method is more universal;
3. compared with a calibration method for estimating external parameters by using a deep neural network, the method is not end-to-end, but is formed by connecting a plurality of functional modules, and has high interpretability:
the method comprises four modules of point cloud pedestrian detection, image pedestrian instance segmentation, feature point selection and two-stage random particle swarm optimization, wherein a pedestrian is taken as a target, a matched pedestrian key point is taken as an initial point pair of a particle swarm model, the particle swarm is subsequently utilized to sequentially carry out two-stage optimization on the key point position and an external parameter, and compared with a depth network which inputs point cloud and an image and outputs an external parameter result, the method has high interpretability;
4. compared with a calibration method in which the extracted key point pairs are directly input into the particle swarm model and the output result is used as the final external parameter, the method provided by the invention adopts the particle swarm algorithm to optimize the key point positions after the key point pairs are extracted, so that the obtained final external parameter result is more accurate:
in the conventional calibration method, the external parameters are obtained by directly using a PnP algorithm after the matching point pairs are obtained, but the points in the point cloud and the points in the image do not have strict one-to-one mapping relation due to the sparsity of the point cloud, so that the positions of the initially extracted matching point pairs are not correct, the positions of the selected corresponding point pairs are optimized by using a particle swarm model, the problem that the point cloud and the image cannot be strictly corresponding is solved, more accurate key point positions are obtained, and more accurate external parameter results are obtained.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the structures shown in the drawings without creative efforts.
FIG. 1 is a schematic flow chart of a method for calibrating external parameters of a laser radar and a camera based on a natural scene according to an embodiment of the present invention;
FIG. 2 is a simulation diagram illustrating the detection frame and mask segmentation according to an embodiment of the present invention;
FIG. 3 is a simulation diagram illustrating the mapping of a laser point cloud to a pedestrian in a camera image according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart illustrating the mapping of the laser point cloud and the pedestrian in the camera image according to the embodiment of the present invention;
FIG. 5 is a schematic diagram of an iterative optimization process of a first-stage particle swarm optimization model according to an embodiment of the present invention;
FIG. 6 is a schematic view of an iterative optimization process of a second-stage particle swarm optimization model according to an embodiment of the present disclosure;
fig. 7 is a simulation diagram of a calibration result in the embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that all the directional indicators (such as up, down, left, right, front, and rear … …) in the embodiment of the present invention are only used to explain the relative position relationship between the components, the movement situation, etc. in a specific posture (as shown in the drawing), and if the specific posture is changed, the directional indicator is changed accordingly.
In addition, the descriptions related to "first", "second", etc. in the present invention are only for descriptive purposes and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "connected," "secured," and the like are to be construed broadly, and for example, "secured" may be a fixed connection, a removable connection, or an integral part; the connection can be mechanical connection, electrical connection, physical connection or wireless communication connection; they may be directly connected or indirectly connected through intervening media, or they may be connected internally or in any other suitable relationship, unless expressly stated otherwise. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In addition, the technical solutions in the embodiments of the present invention may be combined with each other, but it must be based on the realization of those skilled in the art, and when the technical solutions are contradictory or cannot be realized, such a combination of technical solutions should not be considered to exist, and is not within the protection scope of the present invention.
At present, the accuracy of 3D point cloud detection and image semantic segmentation for pedestrians has reached a higher accuracy, and in order to achieve calibration without artificial calibration objects in a natural scene, a non-end-to-end calibration method with high interpretability is designed based on radar point cloud and camera images acquired simultaneously and with pedestrians in a natural scene as a target. According to the method, through point cloud pedestrian detection and image pedestrian instance segmentation, central points and vertexes of corresponding pedestrians in the point cloud and the image are extracted, and therefore key point matching of the point cloud and the image is obtained. Taking the key points as initial points of the particle swarm, initializing a translation vector and a rotation matrix by using a particle swarm algorithm, performing iteration on the key point positions for a limited number of times by using a random particle swarm algorithm, then directly optimizing a rotation vector and a feature matrix by using the random particle swarm algorithm, and finally converging to obtain a stable rotation matrix and a stable translation vector.
Referring to fig. 1, the method for calibrating external parameters of a laser radar and a camera based on a natural scene disclosed in this embodiment specifically includes the following steps:
step 1, acquiring a laser point cloud and a camera image which are synchronous and have pedestrians, inputting the laser point cloud into a pointpilars point cloud detection network, and outputting detection frames of different individual pedestrians; inputting the camera image into the example segmentation network MRCNN, and outputting masks of different individual pedestrians, that is, as shown in fig. 2, where fig. 2(a) is a simulation diagram of the mask and fig. 2(b) is a simulation diagram of the detection frame.
And 2, establishing mapping between the laser point cloud and the pedestrians in the camera image based on the detection frame and the mask to obtain a plurality of 2D-3D matching point pairs of the camera image and the laser point cloud. For the convenience of mapping the point cloud and the pedestrian in the image, reference is made to fig. 3-4, where fig. 3(a) is a laser point cloud conversion depth map, and fig. 3(b) is a camera image. The specific implementation process of establishing the mapping in this embodiment is as follows:
step 2.1, converting the laser point cloud into a depth map, and determining coordinates corresponding to the radar point cloud and a detection frame according to pixel values in the detection frame in the depth map;
step 2.2, corresponding the depth map to the pedestrian in the camera image through manual marking to obtain pedestrian mapping of the laser point cloud and the camera image, and then extracting the central point and the head vertex of the pedestrian in the laser point cloud and the camera image respectively to obtain a plurality of groups of matching point pairs, wherein each group of matching point pairs comprises a 2D point in the camera image and a 3D point in the laser point cloud;
step 2.3, extracting a plurality of groups of 2D points and 3D points with mapping relations in the camera image, around the corresponding 2D points and around the corresponding 3D points in the laser point cloud respectively for each group of matching point pairs in the step 2.2 to obtain a plurality of new matching point pairs;
and 2.4, integrating the matching point pairs obtained in the step 2.2 and the step 2.3 to obtain the 2D-3D matching point pair in the step 2.
And 3, inputting the 2D-3D matching point pairs serving as first-stage particles into a first-stage particle swarm optimization model, iteratively optimizing the 2D-3D matching point pairs, and obtaining initial external parameters based on the optimized 2D-3D matching point pairs. Referring to fig. 5, the iterative optimization process of the first-stage particle swarm optimization model is as follows:
step 3.1, constructing first-stage particles based on all the 2D-3D matching point pairs:
Figure 533451DEST_PATH_IMAGE114
in the formula (I), the compound is shown in the specification,
Figure 784303DEST_PATH_IMAGE115
a particle swarm representing a first-stage particle swarm optimization model,
Figure 359641DEST_PATH_IMAGE116
is shown as
Figure 695945DEST_PATH_IMAGE117
Group 2D-The coordinates of the 2D points in the pair of 3D matching points,
Figure 280510DEST_PATH_IMAGE118
is shown as
Figure 651448DEST_PATH_IMAGE117
The coordinates of the 3D points in the set of 2D-3D matching point pairs,
Figure 397687DEST_PATH_IMAGE119
representing the first stage in a particle swarm optimization model
Figure 486866DEST_PATH_IMAGE117
The number of the first-stage particles,
Figure 609543DEST_PATH_IMAGE120
representing the total number of the 2D-3D matching point pairs and the total number of first-stage particles in the first-stage particle swarm optimization model;
step 3.2, calculating a first-stage evaluation function of each first-stage particle:
Figure 569409DEST_PATH_IMAGE121
in the formula (I), the compound is shown in the specification,
Figure 14778DEST_PATH_IMAGE122
is shown as
Figure 60094DEST_PATH_IMAGE123
A first-stage evaluation function for the first-stage particles, K being a camera intrinsic parameter,
Figure 986462DEST_PATH_IMAGE124
is as follows
Figure 66413DEST_PATH_IMAGE125
The corresponding external parameter of the first-stage particle, P represents the coordinate of any 3D point in the detection frame,
Figure 154455DEST_PATH_IMAGE126
representing the set of all 3D points in all detection boxes, HSS is a binary function, represented as:
Figure 952647DEST_PATH_IMAGE127
in the formula (I), the compound is shown in the specification,
Figure 417126DEST_PATH_IMAGE128
representing the mask in the camera image corresponding to the detection box,
Figure 617163DEST_PATH_IMAGE129
indicating that the 3D point P falls within the mask after projection,
Figure 876106DEST_PATH_IMAGE130
indicating that the 3D point P does not fall in the mask after projection;
step 3.3, taking the first-stage particle with the lowest first-stage evaluation function as the optimal first-stage particle in the current population, and calculating the updating speed of each iteration of each first-stage particle:
Figure 161594DEST_PATH_IMAGE131
in the formula (I), the compound is shown in the specification,
Figure 164185DEST_PATH_IMAGE132
is shown as
Figure 953150DEST_PATH_IMAGE133
The first stage particles are in
Figure 914153DEST_PATH_IMAGE134
The update rate at the time of the sub-iteration,
Figure 421357DEST_PATH_IMAGE135
representing the optimal first phase particles in the current population,
Figure 962060DEST_PATH_IMAGE136
representing the first phase particles that are optimal for the historical iteration,
Figure 136689DEST_PATH_IMAGE137
in order to be the step size,
Figure 3014DEST_PATH_IMAGE138
is a random number, and is a random number,
Figure 997515DEST_PATH_IMAGE139
wherein, in the step (A),
Figure 341909DEST_PATH_IMAGE140
=1,2,···,
Figure 371045DEST_PATH_IMAGE141
Figure 408271DEST_PATH_IMAGE141
expressing the maximum iteration number of the first-stage particle swarm optimization model
Figure 624488DEST_PATH_IMAGE142
When the ratio is not less than 1,
Figure 38152DEST_PATH_IMAGE143
=0, wherein, when the number of iterations is the first time, the first-stage particle with the optimal historical iteration is the corresponding first-stage particle itself;
step 3.4, updating the first phase particles based on the update speed
Figure 659145DEST_PATH_IMAGE144
Step 3.5, calculating the second-stage evaluation function of each first-stage particle after the particle update:
Figure 132852DEST_PATH_IMAGE145
in the formula (I), the compound is shown in the specification,
Figure 836366DEST_PATH_IMAGE146
is shown as
Figure 522562DEST_PATH_IMAGE147
A second-stage evaluation function for the first-stage particles, K being a camera intrinsic parameter,
Figure 526290DEST_PATH_IMAGE148
indicating updated the second
Figure 905319DEST_PATH_IMAGE147
The corresponding external parameter of the first-stage particle, P represents the coordinate of any 3D point in the detection frame,
Figure 96129DEST_PATH_IMAGE149
representing the set of all 3D points in all detection boxes, HSS is a binary function, represented as:
Figure 851595DEST_PATH_IMAGE150
in the formula (I), the compound is shown in the specification,
Figure 178671DEST_PATH_IMAGE151
representing the mask in the camera image corresponding to the detection box,
Figure 994180DEST_PATH_IMAGE152
indicating that the 3D point P falls within the mask after projection,
Figure 672286DEST_PATH_IMAGE153
indicating that the 3D point P does not fall in the mask after projection;
step 3.6, extracting the first-stage particles with the lowest second-stage evaluation function
Figure 231444DEST_PATH_IMAGE154
,1≤A≤
Figure 678605DEST_PATH_IMAGE155
And is based on
Figure 133858DEST_PATH_IMAGE156
Expanding to obtain the particle swarm of the next iteration
Figure 564839DEST_PATH_IMAGE157
Figure 396529DEST_PATH_IMAGE158
Figure 229356DEST_PATH_IMAGE159
Figure 855509DEST_PATH_IMAGE160
Figure 508207DEST_PATH_IMAGE161
Figure 409167DEST_PATH_IMAGE162
In the formula (I), the compound is shown in the specification,
Figure 565342DEST_PATH_IMAGE163
is the step length;
step 3.7, judge
Figure 627976DEST_PATH_IMAGE164
If yes, outputting an external parameter corresponding to the optimal first-stage particle in the current population as an initial external parameter, otherwise, commanding
Figure 767970DEST_PATH_IMAGE165
And is combined withAnd 3.2, taking the optimal first-stage particle in the current population as the optimal first-stage particle of the historical iteration, and returning to the step 3.2.
And 4, inputting the initial external parameters serving as second-stage particles into a second-stage particle swarm optimization model for iterative optimization to obtain actual external parameters between the camera and the laser radar. Referring to fig. 6, the iterative optimization process of the second stage particle swarm optimization model is as follows:
step 4.1, guiding expansion based on the gradient of each variable in the initial external parameters to obtain a particle swarm of the second-stage particle swarm optimization model:
Figure 805629DEST_PATH_IMAGE166
Figure 347469DEST_PATH_IMAGE167
Figure 846583DEST_PATH_IMAGE168
Figure 677136DEST_PATH_IMAGE169
Figure 185477DEST_PATH_IMAGE170
Figure 581824DEST_PATH_IMAGE171
Figure 455102DEST_PATH_IMAGE172
in the formula (I), the compound is shown in the specification,
Figure 569688DEST_PATH_IMAGE173
a particle swarm representing a second stage particle swarm optimization model,
Figure 616142DEST_PATH_IMAGE174
representing the second stage in the particle swarm optimization model
Figure 132574DEST_PATH_IMAGE175
The number of the second-stage particles is,
Figure 176753DEST_PATH_IMAGE176
is shown as
Figure 44215DEST_PATH_IMAGE175
The azimuth angle in the individual second-stage particles,
Figure 628780DEST_PATH_IMAGE177
is shown as
Figure 202981DEST_PATH_IMAGE175
The tumbling angle in the individual second stage particles,
Figure 214799DEST_PATH_IMAGE178
is shown as
Figure 303978DEST_PATH_IMAGE175
The roll angle in the individual second-stage particles,
Figure 692234DEST_PATH_IMAGE179
is shown as
Figure 120941DEST_PATH_IMAGE175
The translation vector in each of the second-stage particles,
Figure 303661DEST_PATH_IMAGE180
representing the total number of second-stage particles in the second-stage particle swarm optimization model,
Figure 880136DEST_PATH_IMAGE181
indicating the azimuth angle in the initial outer reference,
Figure 72083DEST_PATH_IMAGE182
indicating the roll angle in the initial outer reference,
Figure 355297DEST_PATH_IMAGE183
indicating the roll angle in the initial profile,
Figure 977426DEST_PATH_IMAGE184
representing the translation vector in the initial outer reference,
Figure 41197DEST_PATH_IMAGE185
is the step length;
step 4.2, calculating the first-stage evaluation function of each second-stage particle:
Figure 771256DEST_PATH_IMAGE186
in the formula (I), the compound is shown in the specification,
Figure 908976DEST_PATH_IMAGE187
is shown as
Figure 699078DEST_PATH_IMAGE175
A first-stage evaluation function of second-stage particles, K being a camera parameter, P representing coordinates of any 3D point in the detection box,
Figure 250145DEST_PATH_IMAGE188
representing the set of all 3D points in all detection boxes, HSS is a binary function, represented as:
Figure 721577DEST_PATH_IMAGE189
in the formula (I), the compound is shown in the specification,
Figure 776121DEST_PATH_IMAGE190
representing the mask in the camera image corresponding to the detection box,
Figure 2703DEST_PATH_IMAGE191
indicating that the 3D point P falls within the mask after projection,
Figure 509908DEST_PATH_IMAGE192
indicating that the 3D point P does not fall in the mask after projection;
step 4.3, taking the second-stage particle with the lowest first-stage evaluation function as the optimal second-stage particle in the current population, and calculating the updating speed of each second-stage particle in each iteration:
Figure 785031DEST_PATH_IMAGE193
in the formula (I), the compound is shown in the specification,
Figure 225240DEST_PATH_IMAGE194
is shown as
Figure 91565DEST_PATH_IMAGE195
The second stage particles are in
Figure 554907DEST_PATH_IMAGE196
The update rate at the time of the sub-iteration,
Figure 430459DEST_PATH_IMAGE197
representing the optimal second-stage particles in the current population,
Figure 459595DEST_PATH_IMAGE198
representing the second-stage particles for which the historical iteration is optimal,
Figure 700084DEST_PATH_IMAGE199
in order to be the step size,
Figure 447460DEST_PATH_IMAGE200
is a random number, and is a random number,
Figure 126703DEST_PATH_IMAGE201
wherein, in the step (A),
Figure 213607DEST_PATH_IMAGE202
=1,2,···,
Figure 484052DEST_PATH_IMAGE203
Figure 656407DEST_PATH_IMAGE203
representing the iteration number of the second stage particle swarm optimization model when
Figure 605253DEST_PATH_IMAGE204
When the ratio is not less than 1,
Figure 874560DEST_PATH_IMAGE205
=0, wherein, when the number of iterations is the first time, the second-stage particle with the optimal historical iteration is the corresponding second-stage particle itself;
and 4.4, updating the second-stage particles based on the updating speed:
Figure 253589DEST_PATH_IMAGE206
step 4.5, calculating the second-stage evaluation function of each second-stage particle after the particle update:
Figure 913240DEST_PATH_IMAGE207
in the formula (I), the compound is shown in the specification,
Figure 934286DEST_PATH_IMAGE208
is shown as
Figure 792521DEST_PATH_IMAGE209
A second-stage evaluation function of second-stage particles, K being a camera parameter, P representing coordinates of any 3D point in the detection box,
Figure 811292DEST_PATH_IMAGE210
representing the set of all 3D points in all detection boxes, HSS is a binary function, represented as:
Figure 20557DEST_PATH_IMAGE211
in the formula (I), the compound is shown in the specification,
Figure 314135DEST_PATH_IMAGE212
representing the mask in the camera image corresponding to the detection box,
Figure 230138DEST_PATH_IMAGE213
indicating that the 3D point P falls within the mask after projection,
Figure 216549DEST_PATH_IMAGE213
indicating that the 3D point P has not fallen into the mask after projection
Figure 913109DEST_PATH_IMAGE214
Step 4.6, extracting the second-stage particles with the lowest second-stage evaluation function
Figure 10378DEST_PATH_IMAGE215
,1≤C≤
Figure 780888DEST_PATH_IMAGE216
Judgment of
Figure 203779DEST_PATH_IMAGE217
If yes, outputting
Figure 122057DEST_PATH_IMAGE218
As an actual external parameter between camera and lidar
Figure 757437DEST_PATH_IMAGE219
Will be
Figure 648033DEST_PATH_IMAGE220
As a function of the second stage of evaluationSecond-stage particles optimized for historical iteration
Figure 241825DEST_PATH_IMAGE221
And then returning to the step 4.2.
Fig. 7 is a schematic diagram of an effect of projecting point clouds onto an image after final external parameters are obtained through two-stage particle swarm convergence in this embodiment, where a first line of image and a second line of image are distributed as calibration results of two scenes, fig. 7(a), 7(b), and 7(c) are a change process of the point clouds projected onto the image when a first scene is optimized, and fig. 7(d), 7(e), and 7(f) are a change process of the point clouds projected onto the image when a second scene is optimized.
The calibration method disclosed by the embodiment takes pedestrians visible everywhere in a natural scene as a target, does not need to prepare a calibration plate and other special objects for calibration, does not need to have a scene with a special geometric structure such as a building, and greatly expands the use scene of the calibration method. Because of the sparsity of the radar point cloud, the point cloud and the camera image do not have a one-to-one mapping relationship, and the position of the feature point obtained by feature extraction is not accurate, the calibration process is optimized through two-stage particle swarm convergence, the problem that the point cloud and the image cannot strictly correspond is solved, and a more accurate key point position is obtained, so that a more accurate external parameter result is obtained.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all modifications and equivalents of the present invention, which are made by the contents of the present specification and the accompanying drawings, or directly/indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (5)

1. A laser radar and camera external reference calibration method based on natural scenes is characterized by comprising the following steps:
step 1, acquiring a laser point cloud and a camera image which are synchronous and have pedestrians, and acquiring detection frames of different individual pedestrians in the laser point cloud and masks of the different individual pedestrians in the camera image;
step 2, establishing mapping between the laser point cloud and pedestrians in the camera image based on the detection frame and the mask to obtain a plurality of 2D-3D matching point pairs of the camera image and the laser point cloud;
step 3, inputting the 2D-3D matching point pairs serving as first-stage particles into a first-stage particle swarm optimization model, iteratively optimizing the 2D-3D matching point pairs, and obtaining initial external parameters based on the optimized 2D-3D matching point pairs;
step 4, inputting the initial external parameters serving as second-stage particles into a second-stage particle swarm optimization model for iterative optimization to obtain actual external parameters between the camera and the laser radar;
in step 3, the iterative optimization of the first-stage particle swarm optimization model specifically includes the following steps:
step 3.1, constructing first-stage particles based on all the 2D-3D matching point pairs:
Figure 790794DEST_PATH_IMAGE001
in the formula (I), the compound is shown in the specification,
Figure 139867DEST_PATH_IMAGE002
a particle swarm representing a first-stage particle swarm optimization model,
Figure 312222DEST_PATH_IMAGE003
is shown as
Figure 529577DEST_PATH_IMAGE004
The coordinates of the 2D points in the set of 2D-3D matching point pairs,
Figure 408671DEST_PATH_IMAGE005
is shown as
Figure 256542DEST_PATH_IMAGE004
Seating of 3D points in a set of 2D-3D matching pointsThe mark is that,
Figure 978510DEST_PATH_IMAGE006
representing the first stage in a particle swarm optimization model
Figure 609343DEST_PATH_IMAGE004
The number of the first-stage particles,
Figure 670839DEST_PATH_IMAGE007
representing the total number of the 2D-3D matching point pairs and the total number of first-stage particles in the first-stage particle swarm optimization model;
step 3.2, calculating a first-stage evaluation function of each first-stage particle:
Figure 486349DEST_PATH_IMAGE008
in the formula (I), the compound is shown in the specification,
Figure 296611DEST_PATH_IMAGE009
is shown as
Figure 59031DEST_PATH_IMAGE004
A first-stage evaluation function for the first-stage particles, K being a camera intrinsic parameter,
Figure 771772DEST_PATH_IMAGE010
is as follows
Figure 899128DEST_PATH_IMAGE004
The corresponding external parameter of the first-stage particle, P represents the coordinate of any 3D point in the detection frame,
Figure 267792DEST_PATH_IMAGE011
the HSS is a binary function, expressed as:
Figure 365061DEST_PATH_IMAGE013
in the formula (I), the compound is shown in the specification,
Figure 338834DEST_PATH_IMAGE014
representing the mask in the camera image corresponding to the detection box,
Figure 699408DEST_PATH_IMAGE015
indicating that the 3D point P falls within the mask after projection,
Figure 352106DEST_PATH_IMAGE016
indicating that the 3D point P does not fall in the mask after projection;
step 3.3, taking the first-stage particle with the lowest first-stage evaluation function as the optimal first-stage particle in the current population, and calculating the updating speed of each iteration of each first-stage particle:
Figure 128432DEST_PATH_IMAGE017
+
Figure 81345DEST_PATH_IMAGE018
in the formula (I), the compound is shown in the specification,
Figure 612820DEST_PATH_IMAGE019
is shown as
Figure 628181DEST_PATH_IMAGE020
The first stage particles are in
Figure 332831DEST_PATH_IMAGE021
The update rate at the time of the sub-iteration,
Figure 77933DEST_PATH_IMAGE022
representing the optimal first phase particles in the current population,
Figure 452414DEST_PATH_IMAGE023
representing the first phase particles that are optimal for the historical iteration,
Figure 345284DEST_PATH_IMAGE024
Figure 463413DEST_PATH_IMAGE025
in order to be the step size,
Figure 63021DEST_PATH_IMAGE026
is a random number, and is a random number,
Figure 998616DEST_PATH_IMAGE027
wherein, in the step (A),
Figure 316465DEST_PATH_IMAGE021
=1,2,···,
Figure 238285DEST_PATH_IMAGE028
Figure 754717DEST_PATH_IMAGE028
expressing the maximum iteration number of the first-stage particle swarm optimization model
Figure 798896DEST_PATH_IMAGE021
When the ratio is not less than 1,
Figure 273215DEST_PATH_IMAGE029
=0;
and 3.4, updating the first-stage particles based on the updating speed:
Figure 123359DEST_PATH_IMAGE030
step 3.5, calculating the second-stage evaluation function of each first-stage particle after the particle update:
Figure 697560DEST_PATH_IMAGE031
in the formula (I), the compound is shown in the specification,
Figure 584745DEST_PATH_IMAGE032
is shown as
Figure 939503DEST_PATH_IMAGE020
A second-stage evaluation function for the first-stage particles, K being a camera intrinsic parameter,
Figure 265442DEST_PATH_IMAGE033
indicating updated the second
Figure 366253DEST_PATH_IMAGE020
The corresponding external parameter of the first-stage particle, P represents the coordinate of any 3D point in the detection frame,
Figure 80131DEST_PATH_IMAGE034
representing the set of all 3D points in all detection boxes,
Figure 266393DEST_PATH_IMAGE035
Figure 396023DEST_PATH_IMAGE036
in the formula (I), the compound is shown in the specification,
Figure 7133DEST_PATH_IMAGE037
representing the mask in the camera image corresponding to the detection box,
Figure 564016DEST_PATH_IMAGE038
indicating that the 3D point P falls within the mask after projection,
Figure 503153DEST_PATH_IMAGE039
indicating that the 3D point P does not fall in the mask after projection;
step 3.6, extracting the first-stage particles with the lowest second-stage evaluation function
Figure 233212DEST_PATH_IMAGE040
,1≤A≤
Figure 636512DEST_PATH_IMAGE041
And is based on
Figure 36400DEST_PATH_IMAGE042
Expanding to obtain the particle swarm of the next iteration
Figure 587467DEST_PATH_IMAGE043
Figure 731004DEST_PATH_IMAGE044
Figure 988810DEST_PATH_IMAGE045
Figure 215392DEST_PATH_IMAGE046
Figure 863542DEST_PATH_IMAGE047
Figure 873086DEST_PATH_IMAGE048
In the formula (I), the compound is shown in the specification,
Figure 313295DEST_PATH_IMAGE049
Figure 57915DEST_PATH_IMAGE050
Figure 521258DEST_PATH_IMAGE051
is the step length;
step 3.7, judge
Figure 131231DEST_PATH_IMAGE052
=
Figure 301312DEST_PATH_IMAGE053
If yes, outputting an external parameter corresponding to the optimal first-stage particle in the current population as an initial external parameter, otherwise, commanding
Figure 807380DEST_PATH_IMAGE052
=
Figure 289177DEST_PATH_IMAGE052
+1, and taking the optimal first-stage particle in the current population as the optimal first-stage particle for historical iteration, and returning to the step 3.2.
2. The method for calibrating the external parameters of the laser radar and the camera based on the natural scene as claimed in claim 1, wherein in step 1:
inputting the laser point cloud into a pointpilars point cloud detection network, and outputting detection frames of different individual pedestrians;
the camera images are input into an example segmentation network MRCNN, and masks of different individual pedestrians are output.
3. The method for calibrating the external parameters of the laser radar and the camera based on the natural scene as claimed in claim 1, wherein the specific process of the step 2 is as follows:
step 2.1, converting the laser point cloud into a depth map, and determining coordinates corresponding to the radar point cloud and a detection frame according to pixel values in the detection frame in the depth map;
step 2.2, corresponding the depth map to the pedestrian in the camera image through manual marking to obtain pedestrian mapping of the laser point cloud and the camera image, and then extracting the central point and the head vertex of the pedestrian in the laser point cloud and the camera image respectively to obtain a plurality of groups of matching point pairs, wherein each group of matching point pairs comprises a 2D point in the camera image and a 3D point in the laser point cloud;
step 2.3, extracting a plurality of groups of 2D points and 3D points with mapping relations in the camera image, around the corresponding 2D points and around the corresponding 3D points in the laser point cloud respectively for each group of matching point pairs in the step 2.2 to obtain a plurality of new matching point pairs;
and 2.4, integrating the matching point pairs obtained in the step 2.2 and the step 2.3 to obtain the 2D-3D matching point pair in the step 2.
4. The method for external reference calibration of lidar and camera based on natural scene as claimed in claim 1, 2 or 3, wherein in step 4, the iterative optimization of the second stage particle swarm optimization model specifically comprises the following steps:
step 4.1, guiding expansion based on the gradient of each variable in the initial external parameter to obtain a particle swarm of a second-stage particle swarm optimization model;
step 4.2, calculating a first-stage evaluation function of each second-stage particle;
4.3, taking the second-stage particle with the lowest first-stage evaluation function as the optimal second-stage particle in the current population, and calculating the updating speed of each second-stage particle in each iteration;
4.4, updating the second-stage particles based on the updating speed;
step 4.5, calculating a second-stage evaluation function of each second-stage particle after the particle is updated;
and 4.6, extracting second-stage particles with the lowest second-stage evaluation function, judging whether the iteration frequency is the maximum or not, if so, outputting the second-stage particles with the lowest second-stage evaluation function as actual external parameters between the camera and the laser radar, otherwise, iterating once and updating the second-stage particles with the optimal historical iteration, and returning to the step 4.2.
5. The method for external reference calibration of lidar and camera based on natural scene as claimed in claim 4, wherein the iterative optimization of the second stage particle swarm optimization model comprises:
step 4.1, guiding expansion based on the gradient of each variable in the initial external parameters to obtain a particle swarm of the second-stage particle swarm optimization model:
Figure 843786DEST_PATH_IMAGE054
Figure 727428DEST_PATH_IMAGE055
Figure 404397DEST_PATH_IMAGE056
Figure 248856DEST_PATH_IMAGE057
Figure 466211DEST_PATH_IMAGE058
Figure 673202DEST_PATH_IMAGE059
Figure 193176DEST_PATH_IMAGE060
in the formula (I), the compound is shown in the specification,
Figure 915144DEST_PATH_IMAGE061
a particle swarm representing a second stage particle swarm optimization model,
Figure 873873DEST_PATH_IMAGE062
representing the second stage in the particle swarm optimization model
Figure 341894DEST_PATH_IMAGE063
The number of the second-stage particles is,
Figure 422983DEST_PATH_IMAGE064
is shown as
Figure 242034DEST_PATH_IMAGE063
The azimuth angle in the individual second-stage particles,
Figure 4454DEST_PATH_IMAGE065
is shown as
Figure 717195DEST_PATH_IMAGE063
The tumbling angle in the individual second stage particles,
Figure 578972DEST_PATH_IMAGE066
is shown as
Figure 9953DEST_PATH_IMAGE063
The roll angle in the individual second-stage particles,
Figure 310484DEST_PATH_IMAGE067
is shown as
Figure 15748DEST_PATH_IMAGE063
The translation vector in each of the second-stage particles,
Figure 173060DEST_PATH_IMAGE068
representing the total number of second-stage particles in the second-stage particle swarm optimization model,
Figure 294599DEST_PATH_IMAGE069
indicating the azimuth angle in the initial outer reference,
Figure 70925DEST_PATH_IMAGE070
indicating the roll angle in the initial outer reference,
Figure 758259DEST_PATH_IMAGE071
indicating the roll angle in the initial profile,
Figure 289734DEST_PATH_IMAGE072
representing the translation vector in the initial outer reference,
Figure 305095DEST_PATH_IMAGE073
Figure 275325DEST_PATH_IMAGE074
Figure 692531DEST_PATH_IMAGE075
Figure 394908DEST_PATH_IMAGE076
Figure 287777DEST_PATH_IMAGE077
Figure 405906DEST_PATH_IMAGE078
is the step length;
step 4.2, calculating the first-stage evaluation function of each second-stage particle:
Figure 5515DEST_PATH_IMAGE079
in the formula (I), the compound is shown in the specification,
Figure 675530DEST_PATH_IMAGE080
is shown as
Figure 931062DEST_PATH_IMAGE063
A first-stage evaluation function of second-stage particles, K being a camera parameter, P representing coordinates of any 3D point in the detection box,
Figure 711936DEST_PATH_IMAGE081
the HSS is a binary function, expressed as:
Figure 431631DEST_PATH_IMAGE082
in the formula (I), the compound is shown in the specification,
Figure 413493DEST_PATH_IMAGE083
representing the mask in the camera image corresponding to the detection box,
Figure 15376DEST_PATH_IMAGE084
indicating that the 3D point P falls within the mask after projection,
Figure 803203DEST_PATH_IMAGE085
indicating that the 3D point P does not fall in the mask after projection;
step 4.3, taking the second-stage particle with the lowest first-stage evaluation function as the optimal second-stage particle in the current population, and calculating the updating speed of each second-stage particle in each iteration:
Figure 315087DEST_PATH_IMAGE086
+
Figure 326906DEST_PATH_IMAGE087
in the formula (I), the compound is shown in the specification,
Figure 353767DEST_PATH_IMAGE088
is shown as
Figure 632038DEST_PATH_IMAGE063
The second stage particles are in
Figure 123062DEST_PATH_IMAGE089
The update rate at the time of the sub-iteration,
Figure 181148DEST_PATH_IMAGE090
representing the optimal second-stage particles in the current population,
Figure 960885DEST_PATH_IMAGE091
representing the second-stage particles for which the historical iteration is optimal,
Figure 152832DEST_PATH_IMAGE092
Figure 108150DEST_PATH_IMAGE093
in order to be the step size,
Figure 930612DEST_PATH_IMAGE094
is a random number, and is a random number,
Figure 728804DEST_PATH_IMAGE095
wherein, in the step (A),
Figure 599808DEST_PATH_IMAGE089
=1,2,···,
Figure 534266DEST_PATH_IMAGE096
Figure 527630DEST_PATH_IMAGE096
representing the iteration number of the second stage particle swarm optimization model when
Figure 954063DEST_PATH_IMAGE097
When the ratio is not less than 1,
Figure 222234DEST_PATH_IMAGE098
=0;
and 4.4, updating the second-stage particles based on the updating speed:
Figure 480040DEST_PATH_IMAGE099
step 4.5, calculating the second-stage evaluation function of each second-stage particle after the particle update:
Figure 581988DEST_PATH_IMAGE100
in the formula (I), the compound is shown in the specification,
Figure 89192DEST_PATH_IMAGE101
is shown as
Figure 364316DEST_PATH_IMAGE063
A second-stage evaluation function of second-stage particles, K being a camera parameter, P representing coordinates of any 3D point in the detection box,
Figure 414312DEST_PATH_IMAGE102
representing the set of all 3D points in all detection boxes,
Figure 546216DEST_PATH_IMAGE103
Figure 681662DEST_PATH_IMAGE104
in the formula (I), the compound is shown in the specification,
Figure 494897DEST_PATH_IMAGE105
representing the mask in the camera image corresponding to the detection box,
Figure 789612DEST_PATH_IMAGE106
indicating that the 3D point P falls within the mask after projection,
Figure 699275DEST_PATH_IMAGE107
indicating that the 3D point P does not fall in the mask after projection;
step 4.6, extracting the second-stage particles with the lowest second-stage evaluation function
Figure 446651DEST_PATH_IMAGE108
,1≤C≤
Figure 63577DEST_PATH_IMAGE109
Judgment of
Figure 88165DEST_PATH_IMAGE110
=
Figure 765134DEST_PATH_IMAGE111
If yes, outputting
Figure 734227DEST_PATH_IMAGE108
As an actual external parameter between camera and lidar
Figure 826948DEST_PATH_IMAGE110
=
Figure 830676DEST_PATH_IMAGE110
+1, will
Figure 412967DEST_PATH_IMAGE112
Second stage of evaluation of the functionSecond stage particles optimized for historical iterations
Figure 744722DEST_PATH_IMAGE113
And then returning to the step 4.2.
CN202110716414.4A 2021-06-28 2021-06-28 External parameter calibration method of laser radar and camera based on natural scene Active CN113256696B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110716414.4A CN113256696B (en) 2021-06-28 2021-06-28 External parameter calibration method of laser radar and camera based on natural scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110716414.4A CN113256696B (en) 2021-06-28 2021-06-28 External parameter calibration method of laser radar and camera based on natural scene

Publications (2)

Publication Number Publication Date
CN113256696A CN113256696A (en) 2021-08-13
CN113256696B true CN113256696B (en) 2021-09-24

Family

ID=77189798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110716414.4A Active CN113256696B (en) 2021-06-28 2021-06-28 External parameter calibration method of laser radar and camera based on natural scene

Country Status (1)

Country Link
CN (1) CN113256696B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115184909B (en) * 2022-07-11 2023-04-07 中国人民解放军国防科技大学 Vehicle-mounted multi-spectral laser radar calibration system and method based on target detection
CN115471574B (en) * 2022-11-02 2023-02-03 北京闪马智建科技有限公司 External parameter determination method and device, storage medium and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108416811A (en) * 2018-03-08 2018-08-17 云南电网有限责任公司电力科学研究院 A kind of video camera self-calibrating method and device
CN112785702A (en) * 2020-12-31 2021-05-11 华南理工大学 SLAM method based on tight coupling of 2D laser radar and binocular camera
CN112907681A (en) * 2021-02-26 2021-06-04 北京中科慧眼科技有限公司 Combined calibration method and system based on millimeter wave radar and binocular camera

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447869B (en) * 2015-11-30 2019-02-12 四川华雁信息产业股份有限公司 Camera self-calibration method and device based on particle swarm optimization algorithm
EP3477334A1 (en) * 2017-10-24 2019-05-01 Veoneer Sweden AB Automatic radar sensor calibration
CN107977997B (en) * 2017-11-29 2020-01-17 北京航空航天大学 Camera self-calibration method combined with laser radar three-dimensional point cloud data
CN108509918B (en) * 2018-04-03 2021-01-08 中国人民解放军国防科技大学 Target detection and tracking method fusing laser point cloud and image
CN110456330B (en) * 2019-08-27 2021-07-09 中国人民解放军国防科技大学 Method and system for automatically calibrating external parameter without target between camera and laser radar
CN110543727B (en) * 2019-09-05 2023-01-03 北京工业大学 Improved particle swarm algorithm-based omnidirectional mobile intelligent wheelchair robot parameter identification method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108416811A (en) * 2018-03-08 2018-08-17 云南电网有限责任公司电力科学研究院 A kind of video camera self-calibrating method and device
CN112785702A (en) * 2020-12-31 2021-05-11 华南理工大学 SLAM method based on tight coupling of 2D laser radar and binocular camera
CN112907681A (en) * 2021-02-26 2021-06-04 北京中科慧眼科技有限公司 Combined calibration method and system based on millimeter wave radar and binocular camera

Also Published As

Publication number Publication date
CN113256696A (en) 2021-08-13

Similar Documents

Publication Publication Date Title
CN108242079B (en) VSLAM method based on multi-feature visual odometer and graph optimization model
CN110853075B (en) Visual tracking positioning method based on dense point cloud and synthetic view
CN106940704B (en) Positioning method and device based on grid map
Bazin et al. 3-line RANSAC for orthogonal vanishing point detection
CN109272537B (en) Panoramic point cloud registration method based on structured light
CA2826534C (en) Backfilling points in a point cloud
CN112001958B (en) Virtual point cloud three-dimensional target detection method based on supervised monocular depth estimation
CN111899301A (en) Workpiece 6D pose estimation method based on deep learning
CN113256696B (en) External parameter calibration method of laser radar and camera based on natural scene
CN112819903A (en) Camera and laser radar combined calibration method based on L-shaped calibration plate
CN111998862B (en) BNN-based dense binocular SLAM method
CN111144349A (en) Indoor visual relocation method and system
CN113050074B (en) Camera and laser radar calibration system and calibration method in unmanned environment perception
CN113361365B (en) Positioning method, positioning device, positioning equipment and storage medium
Bybee et al. Method for 3-D scene reconstruction using fused LiDAR and imagery from a texel camera
CN112785724A (en) Visual color matching method for ancient buildings based on LiDAR point cloud and two-dimensional image
CN117197333A (en) Space target reconstruction and pose estimation method and system based on multi-view vision
CN109636897B (en) Octmap optimization method based on improved RGB-D SLAM
Yoon et al. Targetless multiple camera-LiDAR extrinsic calibration using object pose estimation
CN112258631B (en) Three-dimensional target detection method and system based on deep neural network
CN111198563B (en) Terrain identification method and system for dynamic motion of foot type robot
Le Besnerais et al. Dense height map estimation from oblique aerial image sequences
Buck et al. Capturing uncertainty in monocular depth estimation: Towards fuzzy voxel maps
CN116630953A (en) Monocular image 3D target detection method based on nerve volume rendering
CN115830116A (en) Robust visual odometer method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant