CN115661262A - Internal and external parameter calibration method and device and electronic equipment

Info

Publication number: CN115661262A
Authority: CN (China)
Prior art keywords: target, point cloud, environment, camera, internal
Legal status: Pending
Application number: CN202211281195.2A
Other languages: Chinese (zh)
Inventors: 洪小平, 苗子良, 何不为
Current Assignee: Southern University of Science and Technology
Original Assignee: Southern University of Science and Technology
Application filed by Southern University of Science and Technology
Priority to CN202211281195.2A

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The embodiment of the application provides an internal and external parameter calibration method and device and electronic equipment, relating to the technical field of sensor calibration. The method comprises the following steps: obtaining an environment point cloud and an environment image corresponding to the same environment; according to initial external parameters, performing angle mapping, edge feature extraction and reverse mapping on the environment point cloud to obtain a target edge point cloud in the environment point cloud, wherein the space to which the environment point cloud is mapped is a target space whose orthogonal axes are the pitch angle and the azimuth angle in the camera coordinate system of the camera; according to initial internal parameters, performing angle mapping, edge feature extraction and reverse mapping on the environment image to obtain target edge pixel points in the environment image, wherein the space to which the environment image is mapped is the same target space; and iteratively updating the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point cloud to obtain target internal parameters and target external parameters. In this way, the joint calibration of the internal and external parameters of the camera and the laser radar is completed automatically without any marker.

Description

Internal and external parameter calibration method and device and electronic equipment
Technical Field
The application relates to the technical field of sensor calibration, in particular to an internal and external parameter calibration method and device and electronic equipment.
Background
At present, cameras are generally used in combination with laser radar (lidar). In this setup, two sets of parameters need to be calibrated. The first is the internal parameters of the camera, which are used to map pixel coordinates to spatial coordinates. The second is the external parameters between the camera and the lidar, i.e., the rigid-body transformation parameters between the two sensor coordinate systems. For the fused use of multiple sensors, sensor calibration is the most important work.
Most existing camera internal parameter calibration methods are based on fixed markers. The marker-based internal parameter calibration method is limited by the calibration scene and requires manual operation of the marker, which is very cumbersome; meanwhile, the positions of the marker must be carefully designed to obtain an accurate calibration result, which places higher demands on manual operation and further limits the calibration working scene.
In addition, the existing external reference calibration method for the camera and the laser radar is carried out in two steps, namely, the internal reference of the camera is calibrated firstly, and then the external reference is calibrated based on the internal reference. In the traditional internal and external reference two-step calibration method, the error of the internal reference calibration result can be further amplified in the external reference calibration.
Disclosure of Invention
The embodiment of the application provides an internal and external reference calibration method, an internal and external reference calibration device, electronic equipment and a readable storage medium, which can automatically complete the internal and external reference combined calibration of a camera and a laser radar under the condition of no marker, can solve the problems of the existing camera internal reference calibration method and the external reference calibration method between the camera and the laser radar, and avoids the error amplification caused by two-step calibration process.
The embodiment of the application can be realized as follows:
in a first aspect, an embodiment of the present application provides an internal and external reference calibration method, where the method includes:
obtaining an environment point cloud and an environment image corresponding to the same environment, wherein the environment point cloud is obtained through a laser radar, and the environment image is obtained through a camera;
according to the initial external parameters, carrying out angle mapping, edge feature extraction and reverse mapping on the environment point cloud to obtain a target edge point cloud in the environment point cloud, wherein the space mapped to the environment point cloud is a target space with a pitch angle and an azimuth angle in a camera coordinate system of the camera as orthogonal axes;
according to the initial internal reference, carrying out angle mapping, edge feature extraction and reverse mapping on the environment image to obtain target edge pixel points in the environment image, wherein a space to which the environment image is mapped is the target space;
and iteratively updating the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point cloud so as to obtain the target internal parameters of the camera and the target external parameters between the laser radar and the camera.
In a second aspect, an embodiment of the present application provides an internal and external reference calibration apparatus, where the apparatus includes:
the system comprises an obtaining module, a processing module and a processing module, wherein the obtaining module is used for obtaining an environment point cloud and an environment image corresponding to the same environment, the environment point cloud is obtained through a laser radar, and the environment image is obtained through a camera;
the first edge determining module is used for carrying out angle mapping, edge feature extraction and reverse mapping on the environment point cloud according to initial external parameters to obtain target edge point cloud in the environment point cloud, wherein the space mapped to the target edge point cloud is a target space with a pitch angle and an azimuth angle in a camera coordinate system of the camera as orthogonal axes;
the second edge determining module is used for carrying out angle mapping, edge feature extraction and reverse mapping on the environment image according to the initial internal parameters to obtain target edge pixel points in the environment image, wherein the space to which the environment image is mapped is the target space;
and the optimization module is used for iteratively updating the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point clouds so as to obtain the target internal parameters of the camera and the target external parameters between the laser radar and the camera.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor and a memory, where the memory stores machine executable instructions that can be executed by the processor, and the processor can execute the machine executable instructions to implement the internal and external reference calibration method described in the foregoing embodiment.
In a fourth aspect, an embodiment of the present application provides a readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the internal and external reference calibration method according to the foregoing embodiment.
According to the internal and external reference calibration method, the internal and external reference calibration device, the electronic equipment and the readable storage medium, firstly, an environment image obtained by a camera is obtained, and environment point cloud obtained by collecting information of the same environment by using a laser radar is obtained; then, based on the initial external reference, carrying out angle mapping, edge feature extraction and reverse mapping on the environment point cloud to obtain a target edge point cloud in the environment point cloud, wherein the space mapped to the target edge point cloud is a target space with a pitch angle and an azimuth angle in a camera coordinate system of the camera as orthogonal axes; according to the initial internal reference, angle mapping, edge feature extraction and reverse mapping are carried out on the environment image to obtain target edge pixel points in the environment image, wherein the space to which the environment image is mapped is the target space; and finally, carrying out iterative updating on the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point cloud so as to obtain the target internal parameters of the camera and the target external parameters between the laser radar and the camera. Therefore, the internal and external reference combined calibration of the camera and the laser radar can be automatically completed under the environment without the marker, the problems of the existing camera internal reference calibration method and the external reference calibration method between the camera and the laser radar are solved, and the error amplification condition caused by the two-step calibration process is avoided.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an internal and external reference calibration method provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a laser radar and a camera according to an embodiment of the present disclosure;
FIG. 4 is a schematic flowchart illustrating the substeps involved in step S110 of FIG. 2;
FIG. 5 is a second schematic flowchart of the sub-steps included in step S110 in FIG. 2;
FIG. 6 is a flowchart illustrating the sub-steps included in step S120 in FIG. 2;
FIG. 7 is a schematic flow chart of sub-steps included in sub-step S123 of FIG. 6;
FIG. 8 is a flowchart illustrating the sub-steps included in step S130 of FIG. 2;
FIG. 9 is a schematic flow chart of the substeps involved in substep S132 of FIG. 8;
FIG. 10 is a flowchart illustrating the sub-steps included in step S140 of FIG. 2;
FIG. 11 is a schematic diagram illustrating an application of an internal and external reference calibration method according to an embodiment of the present application;
fig. 12 is a schematic block diagram of an internal and external reference calibration apparatus provided in an embodiment of the present application.
Icon: 100-an electronic device; 110-a memory; 120-a processor; 130-a communication unit; 200-internal and external reference calibration device; 210-an obtaining module; 220-a first edge determination module; 230-a second edge determination module; 240-optimization module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It is noted that relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in a process, method, article, or apparatus that comprises the element.
In the following, how to calibrate internal and external parameters is described by taking the combined use of a fisheye camera and a laser radar as an example.
The fish-eye camera is an ultra-wide-angle lens, and generally uses a plurality of groups of convex lenses or a combination of the convex lenses and a plane mirror to change an incident light path, so that incident light rays in a large view field can be focused on a smaller imaging plane. Although the fisheye camera has a large visual field, the fisheye camera is not widely applied in the field of robots, and one reason is that the fisheye camera is seriously distorted and is difficult to accurately calibrate.
When the fisheye camera and the laser radar are used in combination, two parameters need to be calibrated. The first is the internal parameters of the fisheye camera, which are used to map pixel coordinates to spatial coordinates. The second is external parameters between the camera and the lidar, i.e. rigid body transformation parameters between the two sensor coordinate systems. For the use of a sensor or the fusion use of multiple sensors, the calibration of the sensor is the most important and fundamental work.
At present, when a fisheye camera and a laser radar are used in a fusion mode, the general steps include calibrating camera internal parameters, and calibrating external parameters between the camera and the laser radar based on the calibrated camera internal parameters. The following describes the calibration of the two parameters.
The earliest fisheye camera intrinsic models were constructed from different types of trigonometric functions, such as orthographic projection models, stereographic projection models, equal-area projection models, etc.; later, all these trigonometric-function models could be described uniformly by polynomials using Taylor expansion. Most fisheye camera internal reference models to date are constructed using polynomials.
The existing fisheye camera internal reference calibration methods are all based on fixed markers. For example, the widely used OcamCalib MATLAB Toolbox requires a checkerboard to be manually placed at different spatial positions, with at least 10 pictures taken. The method realizes automatic corner extraction, calculates the reprojection error of the corners under the internal reference model based on the estimated poses of the different checkerboards, and performs nonlinear optimization with the reprojection error as the cost function.
For another example, the positions of independent variables and dependent variables of the polynomial fisheye camera model are changed, so that a more direct model construction mode is realized, and meanwhile, fixed proportionality coefficients are added, so that the value ranges of coefficients of different orders of the polynomial are limited. The method uses the stripe pattern as a fixed marker, constructs a cost function through the prior geometrical relationship given by the pattern, and obtains the internal reference estimation of the fisheye camera in a nonlinear optimization mode.
In the existing external parameter calibration method for the fisheye camera and the laser radar, a bearing-angle image is generated from the point cloud of the laser radar, and the edge features are enhanced. Corresponding edge feature points are manually selected on the bearing-angle image of the laser radar and the image of the fisheye camera, and a PnP problem is constructed to optimize and solve the external parameters of the two sensors.
For the existing internal reference calibration methods, the method based on the fixed marker is limited by the calibration scene and the marker needs to be operated manually, which is very cumbersome. Meanwhile, when the marker is close to the camera, the corner points are accurately extracted but the corner density is low, and the calibrated internal reference polynomial curve is not accurate in the regions with few corners. When the marker is far away, although the corner density can be increased by using multiple checkerboards, the error of corner extraction increases because the pixel area occupied by the checkerboards in the image is small. Meanwhile, because the method needs to estimate the pose of the checkerboard relative to the camera, the farther the distance, the less sensitive the estimation of the translation vector, and errors are easily produced. There is only one existing external reference calibration method for the fisheye camera and the laser radar, and its feature pixel points are selected entirely by hand, so the error is very large.
Meanwhile, it should be noted that almost all existing camera calibration methods are performed in two steps: the internal parameters of the camera are calibrated first, and the external parameters are then calibrated based on the internal parameters. For the fisheye camera model, because of the instability of the high-order polynomial, even a small error in each optimized polynomial coefficient can result in a large error in the overall polynomial curve. If the external parameters are calibrated with an unstable internal parameter calibration result as a prior, the deviation of the parameters is further amplified.
It will be appreciated that the above analysis is performed by taking a fisheye camera as an example, and that similar problems can still exist in the calibration method used when other types of cameras and lidar are used in combination.
Based on the above research, the embodiment of the application provides an internal and external parameter calibration method, an internal and external parameter calibration device, an electronic device, and a readable storage medium, wherein an environment image obtained by a camera and an environment point cloud obtained by a laser radar are obtained, then a target edge pixel point and a target edge point cloud are obtained through mapping, edge extraction, and reverse mapping, and further a target internal parameter and a target external parameter are obtained based on the target edge pixel point and the target edge point cloud. Therefore, the situation that the calibration process is limited by a calibration environment or a fixed marker can be solved, the dependence of the calibration process on manpower is eliminated, and the calibration process can be automatically completed in any environment; moreover, the problems that the calibration is inaccurate due to insufficient density of the characteristic points of the fixed marker or the distance between the fixed marker and the sensor greatly influences the calibration result are solved; the problem of calibration errors caused by manual selection of feature points in the existing fisheye camera and laser radar external reference calibration method is solved; and because the internal and external parameters of the fisheye camera and the laser radar are calibrated in a combined manner, all parameters are estimated in the same optimization process, so that the problem that the error caused by the unstable internal parameter calibration result in the traditional internal and external parameter two-step calibration method is further amplified in the external parameter calibration can be solved.
It should be noted that the defects existing in the above solutions are the results obtained after the inventor has practiced and studied carefully, and therefore, the discovery process of the above problems and the solutions proposed by the following embodiments of the present application to the above problems should be the contribution of the inventor to the present application in the process of the present application.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments and features of the embodiments described below can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a block diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 may be, but is not limited to, a computer, a server, etc. The electronic device 100 includes a memory 110, a processor 120, and a communication unit 130. The elements of the memory 110, the processor 120 and the communication unit 130 are electrically connected to each other directly or indirectly to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 110 is used to store programs or data. The memory 110 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 120 is used to read/write data or programs stored in the memory 110 and perform corresponding functions. For example, the memory 110 stores an internal and external reference calibration apparatus 200, and the internal and external reference calibration apparatus 200 includes at least one software function module that can be stored in the memory 110 in the form of software or firmware. The processor 120 executes various functional applications and data processing by running the software programs and modules stored in the memory 110, such as the internal and external reference calibration apparatus 200 in the embodiment of the present application, so as to implement the internal and external reference calibration method in the embodiment of the present application.
The communication unit 130 is used for establishing a communication connection between the electronic apparatus 100 and another communication terminal via a network, and for transceiving data via the network.
It should be understood that the structure shown in fig. 1 is only a schematic structural diagram of the electronic device 100, and the electronic device 100 may further include more or fewer components than those shown in fig. 1 (for example, the electronic device 100 may further include a camera and a radar), or have a different configuration from that shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Referring to fig. 2, fig. 2 is a schematic flow chart of an internal and external reference calibration method according to an embodiment of the present application. The method may be applied to the electronic device 100. The specific process of the internal and external reference calibration method is explained in detail below. In this embodiment, the method may include steps S110 to S140.
Step S110, an environment point cloud and an environment image corresponding to the same environment are obtained.
In this embodiment, a camera may be used to capture an arbitrary environment, resulting in an environment image. The environment image includes color information in the environment. And the laser radar can be used for collecting information of the same environment so as to obtain environment point clouds corresponding to the same environment. The environmental point cloud may include three-dimensional coordinates of each scanned point in a lidar coordinate system and a reflectivity of the point. The reflectance is in the range of 0 to 255, and the reflectance can be directly used as a gray scale value of a scanning point on an image. The camera may be a fish-eye camera or other cameras, and may be determined by actual conditions.
The electronic device, the camera and the laser radar can be independent devices, and in this case, the camera can send the acquired image to the electronic device, so that the electronic device can obtain the environment image; the laser radar can send the point cloud collected aiming at the same environment to the electronic equipment, so that the electronic equipment obtains the environment point cloud. The electronic device and the camera and the laser radar can also be integrated devices, and the electronic device can obtain the environment image and the environment point cloud through controlling the camera and the laser radar. It should be noted that the above-mentioned manner of obtaining the environment image and the environment point cloud is only an example, and the environment image and the environment point cloud obtained in other manners may also be used, for example, another device sends the point cloud and the image of the same environment to the electronic device.
And step S120, according to the initial external parameters, carrying out angle mapping, edge feature extraction and reverse mapping on the environment point cloud to obtain a target edge point cloud in the environment point cloud.
In this embodiment, the initial external parameter between the laser radar and the camera may be preset, that is, the initial value of the external parameter is preset. And under the condition of obtaining the environment point cloud, carrying out angle mapping on the environment point cloud according to the initial external parameters so as to map the environment point cloud into a target space A. The target space a is a space in which a pitch angle and an azimuth angle in a camera coordinate system (three-dimensional coordinate system) of the camera are orthogonal axes. After the angle mapping is completed, edge feature extraction can be performed to extract edge features from the image of the environment point cloud in the target space A, and then the edge features are reversely mapped to the laser radar coordinate system to obtain the target edge point cloud in the environment point cloud. And the points in the target edge point cloud are edge points determined from the environment point cloud.
And step S130, according to the initial internal reference, performing angle mapping, edge feature extraction and reverse mapping on the environment image to obtain target edge pixel points in the environment image.
In this embodiment, the initial internal parameters of the camera may be preset, that is, the initial values of the internal parameters are preset. And under the condition of obtaining the environment image, carrying out angle mapping on the environment image according to the initial internal reference so as to map the environment image into the target space A. That is, the spaces mapped by the environment image and the environment point cloud are the same, and both are mapped to the target space a. Then, similar to the processing process of the environmental point cloud, edge extraction is performed to extract edge features from the image of the target space a corresponding to the environmental image, and the edge features are reversely mapped to a camera image coordinate system (a two-dimensional coordinate system, that is, a coordinate system where the environmental image is located) of the camera, so as to obtain target edge pixel points in the environmental image. And the target edge pixel points are edge pixel points determined in the environment image.
And step S140, performing iterative updating on the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point clouds to obtain target internal parameters of the camera and target external parameters between the laser radar and the camera.
The initial internal parameters and the initial external parameters can be optimized and updated according to the target edge pixel points and the target edge point cloud until the result of the target edge point cloud after being processed based on the updated internal parameters and the updated external parameters and the target edge pixel points meet preset conditions, and then optimization can be determined to be completed. When the optimization is completed, the internal parameter at the time can be used as a target internal parameter of the camera, and the external parameter at the time can be used as a target external parameter between the laser radar and the camera. The target internal parameter is an internal parameter of the calibrated camera, and the target external parameter is an external parameter of the calibrated laser radar relative to the camera.
The embodiment of the application provides a method for calibrating internal and external parameters together aiming at the condition of the fusion use of a camera and a laser radar, and under the condition of no marker, the internal and external parameters of the camera and the laser radar are calibrated together in the same optimization process, so that the problems of the fisheye camera internal parameter calibration method and the external parameter calibration method between the camera and the laser radar are solved, and the error amplification condition caused by the two-step calibration process is avoided.
Optionally, the laser radar may be a radar that scans repeatedly, or a radar that scans non-repetitively, which may be determined in combination with actual requirements. As a possible implementation manner, as shown in fig. 3, the lidar is a Livox Mid-360 lidar, which is a 4-line lidar and is the first lidar having both non-repetitive scanning characteristics and a 360° horizontal field of view. Due to its non-repetitive scanning characteristics, the field coverage approaches 100% over time. The camera is a Fisheye Camera having a 360° horizontal field of view and a 70° vertical field of view. The whole sensor combination (including the Livox Mid-360 laser radar and the fisheye camera) is small in size, with the lidar and the fisheye camera measuring only 6.5x6.5x6.5 cm and 5x5x10 cm respectively, so that the sensor combination can be integrated on any mobile platform, such as a mobile chassis, a robot dog, an unmanned aerial vehicle and the like.
Because the field of view of the camera in the vertical direction differs from that of the laser radar, if the camera's vertical field of view is larger than the lidar's and the internal and external reference calibration is performed directly on the basis of the point cloud acquired by the lidar and the image acquired by the camera, the calibration result may be poor because point cloud information is missing. To avoid this, the environmental point cloud may be obtained in the manner shown in fig. 4.
Referring to fig. 4, fig. 4 is a flowchart illustrating one of the sub-steps included in step S110 in fig. 2. In this embodiment, the process of acquiring the environmental point cloud in step S110 may include sub-steps S111 to S112.
And a substep S111, obtaining initial point clouds at different angles through the laser radar.
And a substep S112, splicing the initial point clouds at different angles to obtain the environment point cloud.
In this embodiment, when the field of view of the camera in the vertical direction is greater than that of the lidar, the attitude of the lidar may be changed so that the lidar scans at different attitudes, thereby obtaining initial point clouds corresponding to different angles. In this way, point clouds can be accumulated from multiple viewing angles. The specific attitude transformation mode can be determined in combination with actual requirements, as long as the field of view corresponding to the subsequently obtained environment point cloud is no smaller than that corresponding to the environment image, i.e., the field of view corresponding to the environment point cloud must include the field of view corresponding to the environment image. Then, the initial point clouds at different angles are spliced to obtain the environment point cloud. Alternatively, the splicing may be performed using an ICP (Iterative Closest Point) algorithm. For example, the initial point clouds may be obtained using a non-repetitive scanning lidar, and by accumulating over time, a sub-pixel-level point cloud (a characteristic of non-repetitive scanning lidar) that approximately covers the full field of view of the lidar can be obtained.
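For illustration, the splicing of multi-pose scans with ICP can be sketched as follows in Python using the Open3D library; the file paths, voxel size and correspondence distance are illustrative assumptions rather than values specified in this embodiment.

```python
import numpy as np
import open3d as o3d

def stitch_scans(scan_files, voxel=0.05, max_corr_dist=0.2):
    """Register each new scan to the accumulated cloud and merge them."""
    merged = o3d.io.read_point_cloud(scan_files[0])
    for path in scan_files[1:]:
        scan = o3d.io.read_point_cloud(path)
        reg = o3d.pipelines.registration.registration_icp(
            scan.voxel_down_sample(voxel), merged.voxel_down_sample(voxel),
            max_corr_dist, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        merged += scan.transform(reg.transformation)
    return merged  # environment point cloud covering the camera field of view
```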
As a possible implementation, in order to enhance robustness under different illumination environments, the environment image may be obtained by the method shown in fig. 5. Referring to fig. 5, fig. 5 is a second schematic flowchart illustrating the sub-steps included in step S110 in fig. 2. In the present embodiment, the process of acquiring the environmental image in step S110 may include sub-step S115 to sub-step S116.
And a substep S115, obtaining initial images of the camera obtained in different exposure time lengths in the environment.
And a substep S116 of exposing and fusing the obtained plurality of initial images to obtain the environment image.
In this embodiment, the cameras may be controlled to perform image acquisition in the same environment by using different exposure durations, so as to obtain multiple initial images corresponding to different exposure durations. Then, exposure fusion may be performed on the plurality of initial images, resulting in a High Dynamic Range (HDR) image as the environment image. In this way, robustness under different lighting environments may be enhanced.
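As an illustrative sketch, the exposure fusion step could be implemented with OpenCV's Mertens fusion; the embodiment does not name a specific fusion algorithm, so this choice is an assumption.

```python
import cv2
import numpy as np

def fuse_exposures(image_paths):
    """Fuse several images of the same scene taken with different exposure times."""
    images = [cv2.imread(p) for p in image_paths]
    fused = cv2.createMergeMertens().process(images)      # float image in [0, 1]
    return np.clip(fused * 255, 0, 255).astype(np.uint8)  # environment image
```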
In the case of obtaining the environmental point cloud, the target edge point cloud may be obtained in the manner shown in fig. 6. Referring to fig. 6, fig. 6 is a flowchart illustrating sub-steps included in step S120 in fig. 2. In the present embodiment, step S120 may include substeps S121 to substep S125.
And a substep S121, transferring the environment point cloud to the camera coordinate system according to the initial external parameters to obtain a converted environment point cloud.
And a substep S122, calculating a first pitch angle and a first azimuth angle corresponding to each point in the converted environment point cloud according to the coordinate of the point in the camera coordinate system.
In this embodiment, the first pitch angle and the first azimuth angle corresponding to each point in the environment point cloud after conversion to the camera coordinate system can be calculated according to the following formulas (1) to (3), where the external parameter from the lidar coordinate system {L} to the camera coordinate system {C} is denoted ^C_L T (a rotation ^C_L R and a translation ^C_L t).
The environment point cloud can be converted from the laser radar coordinate system to the camera coordinate system according to the initial external parameter by the following formula (1), obtaining the converted environment point cloud. Formula (1) is:
^C p = ^C_L R · ^L p + ^C_L t    (1)
where ^C p represents a three-dimensional point in the camera coordinate system {C} and ^L p represents a three-dimensional point in the laser radar coordinate system {L}; given the external parameter, ^C_L T represents the external parameter transformation from {L} to {C}, consisting of the rotation ^C_L R and the translation ^C_L t.
After the conversion is finished, according to the three-dimensional coordinates (x, y, z) of each point in the converted environment point cloud, the pitch angle θ can be calculated based on formula (2) and the azimuth angle φ based on formula (3):
θ = arccos( z / sqrt(x² + y² + z²) )    (2)
φ = atan2(y, x)    (3)
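A minimal Python sketch of this angle mapping, assuming the extrinsics are given as a rotation matrix R_CL and translation vector t_CL (variable names are illustrative):

```python
import numpy as np

def lidar_points_to_angles(pts_L, R_CL, t_CL):
    """pts_L: (N, 3) points in the lidar frame; R_CL, t_CL: extrinsics from {L} to {C}."""
    pts_C = pts_L @ R_CL.T + t_CL                              # formula (1)
    x, y, z = pts_C[:, 0], pts_C[:, 1], pts_C[:, 2]
    pitch = np.arccos(z / np.linalg.norm(pts_C, axis=1))       # formula (2), range [0, pi]
    azimuth = np.mod(np.arctan2(y, x), 2 * np.pi)              # formula (3), range [0, 2*pi)
    return pitch, azimuth
```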
And a substep S123 of generating a first projection image in the target space according to the first pitch angle, the first azimuth angle and the reflectivity corresponding to each point.
According to a first pitch angle, a first azimuth angle and reflectivity corresponding to each point in the environment point cloud under the camera coordinate system, the reflectivity is used as a gray value, and a first projection image is generated in the target space A. Optionally, the determining manner of the gray value of each pixel grid in the first projection image may be determined in combination with actual requirements.
Alternatively, the size of the target space A may be set in advance. For example, the azimuth angle range [0, 2π] is taken as the horizontal axis and the pitch angle range [0, π] as the vertical axis of the target space A, and the target space is discretized into a pixel grid according to the set target space size (8000 in this example). In this manner, the corresponding pitch and azimuth ranges for each pixel grid in the target space A may be determined.
Under the condition of determining the first pitch angle, the first azimuth angle and the reflectivity corresponding to each pixel grid in the target space A and each point in the environmental point cloud under the camera coordinate system, the gray value of each pixel point in the target space A can be determined in a corresponding mode according to actual requirements, and therefore the first projection image is obtained.
As a possible implementation, the first projection image may be obtained in the manner shown in fig. 7. Referring to fig. 7, fig. 7 is a flowchart illustrating sub-steps included in sub-step S123 in fig. 6. In the present embodiment, step S123 may include sub-steps S1231 to S1232.
And a substep S1231 of searching for each pixel grid in the target space by taking the center point of the pixel grid as a center and the first preset distance as a radius according to the first pitch angle and the first azimuth angle corresponding to each point.
And a substep S1232 of determining a gray level of the pixel grid according to the reflectivity corresponding to the searched point to obtain the first projection image.
In this embodiment, according to a first pitch angle and a first azimuth angle corresponding to each point in the environmental point cloud under the camera coordinates, for each pixel grid in the target space a, a center point of the pixel grid is used as a search center, and a first preset distance is used as a radius in the point cloud mapped into the target space a to perform search; and determining the gray value of the pixel grid according to the reflectivity of all the searched points based on the pixel grid. For example, the average value of the reflectances of all the points searched based on the pixel grid is used as the gradation value of the pixel grid. The first preset distance may be determined by combining actual requirements. In this way, when there are a plurality of projection points in one pixel cell, the gray scale value of each pixel can be determined after the above processing is performed for each pixel cell, and the first projection image can be obtained.
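A sketch of this per-cell radius search, assuming a KD-tree over the mapped points; the grid width and search radius are illustrative values (the example above uses a target space size of 8000, which would make this naive loop slow):

```python
import numpy as np
from scipy.spatial import cKDTree

def build_projection_image(azimuth, pitch, reflectivity, width=2000, radius_px=2.0):
    """Average the reflectivity of points found around each pixel-grid centre."""
    height = width // 2                                # axes cover [0, 2*pi] x [0, pi]
    uv = np.column_stack([azimuth / (2 * np.pi) * width, pitch / np.pi * height])
    tree = cKDTree(uv)
    img = np.zeros((height, width), dtype=np.float32)
    rows, cols = np.mgrid[0:height, 0:width]
    centers = np.column_stack([cols.ravel() + 0.5, rows.ravel() + 0.5])
    for k, idx in enumerate(tree.query_ball_point(centers, r=radius_px)):
        if idx:                                        # grey value of the pixel grid
            img.flat[k] = reflectivity[idx].mean()
    return img
```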
And a substep S124 of performing edge extraction on the first projection image to obtain a first target edge feature.
Optionally, an edge feature extraction algorithm, such as a Canny algorithm, may be used to perform edge feature extraction on the first projection image to obtain the first target edge feature.
As a possible implementation manner, an edge feature extraction algorithm may be used to extract a first edge feature from the first projection image, and then filter according to the length of an edge pixel, so as to retain the first edge feature with a longer length as the first target edge feature. Optionally, the pixel length of each first edge feature may be compared with a first preset pixel length, and the first edge feature having a pixel length greater than the first preset pixel length may be used as the first target edge feature.
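A minimal sketch of this edge step on the first projection image, with Canny thresholds and the minimum pixel length chosen as illustrative assumptions:

```python
import cv2
import numpy as np

def extract_long_edges(proj_img, canny_lo=50, canny_hi=150, min_len_px=80):
    """Canny edges, keeping only edge segments longer than a preset pixel length."""
    img8 = cv2.normalize(proj_img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    edges = cv2.Canny(img8, canny_lo, canny_hi)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    mask = np.zeros_like(edges)
    for c in contours:
        if cv2.arcLength(c, closed=False) >= min_len_px:
            cv2.drawContours(mask, [c], -1, 255, 1)
    return mask  # non-zero pixels are the retained first target edge features
```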
And a substep S125 of performing reverse mapping on the first target edge feature according to a mapping manner used when mapping to the target space to obtain the target edge point cloud.
Under the condition of obtaining the first target edge feature, the first target edge feature in the target space a may be reversely mapped back to the corresponding original space (i.e., radar coordinate system) according to the mapping relationship between the pixel grid and the corresponding point cloud when the first projection image is generated previously, so as to obtain the target edge point cloud. Thus, the target edge point cloud can be automatically determined from the environment point cloud.
Similarly, the environment image is processed in a similar manner to obtain target edge pixel points.
Referring to fig. 8, fig. 8 is a flowchart illustrating sub-steps included in step S130 in fig. 2. In the present embodiment, step S130 may include sub-steps S131 to S134.
And a substep S131, calculating a second pitch angle and a second azimuth angle corresponding to each pixel point in the environment image according to the initial internal reference and the environment image.
The following describes how to obtain the second pitch angle and the second azimuth angle by taking the camera as a fisheye camera as an example. In this embodiment, the second pitch angle and the second azimuth angle corresponding to each pixel point in the environment image may be calculated according to the following formulas (4) to (7), where the internal parameter includes the polynomial coefficients a_0, ..., a_n, the image center [u_0, v_0] and the distortion correction matrix A.
The environment image may first be corrected by formula (4):
[u', v']^T = A^{-1} · ( [u, v]^T - [u_0, v_0]^T )    (4)
where [u', v'] represents the pixel coordinates of a two-dimensional point of the processed environment image, A represents the distortion correction matrix, [u, v] represents the pixel coordinates of a two-dimensional point of the environment image, and [u_0, v_0] represents the pixel coordinates of the center point of the environment image.
Then, the second pitch angle and the second azimuth angle corresponding to each pixel point in the environment image can be calculated based on formulas (5) to (7):
r = sqrt(u'² + v'²)    (5)
φ = atan2(v', u')    (6)
θ = F^{-1}(r; a_0, ..., a_n)    (7)
where formula (7) is the inverse function of the polynomial r = F(θ; a_0, ..., a_n) = a_0 + a_1·θ + ... + a_n·θ^n (formula (8)), obtained by spline curve fitting. F(θ; a_0, ..., a_n) is the polynomial of the fisheye camera internal reference model, which computes from the pitch angle the pixel radius r centered at [u_0, v_0].
In this way, the spatial pitch and azimuth angles may be calculated starting from the camera pixel coordinates and thus projected into the target space.
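A sketch of formulas (4)-(7) in Python, assuming the intrinsic polynomial r = F(θ) is monotonically increasing over the sampled pitch range so that its inverse can be fitted with a spline (the sampling range and spline order are assumptions):

```python
import numpy as np
from scipy.interpolate import InterpolatedUnivariateSpline

def pixels_to_angles(uv, center, A, coeffs, theta_max=np.pi / 2):
    """uv: (N, 2) pixel coords; center: [u0, v0]; A: distortion correction matrix."""
    d = np.linalg.solve(A, (uv - center).T).T                   # formula (4): [u', v']
    r = np.linalg.norm(d, axis=1)                               # formula (5)
    azimuth = np.mod(np.arctan2(d[:, 1], d[:, 0]), 2 * np.pi)   # formula (6)
    theta_s = np.linspace(0.0, theta_max, 1000)
    r_s = np.polyval(coeffs[::-1], theta_s)                     # r = F(theta), assumed monotone
    pitch = InterpolatedUnivariateSpline(r_s, theta_s, k=3)(r)  # formula (7): spline fit of F^-1
    return pitch, azimuth
```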
And a substep S132, generating a second projection image in the target space according to a second pitch angle, a second azimuth angle and a gray value corresponding to each pixel point in the environment image.
The gray value corresponding to each pixel grid in the target space A can be determined according to the second pitch angle, the second azimuth angle and the gray value corresponding to each pixel point in the environment image, and therefore a second projection image is generated in the target space A. Optionally, the determination manner of the gray-scale value of each pixel grid in the second projection image may be determined in combination with actual requirements.
As a possible implementation, the second projection image may be obtained in the manner shown in fig. 9. Referring to fig. 9, fig. 9 is a flowchart illustrating sub-steps included in sub-step S132 in fig. 8. In the present embodiment, step S132 may include substeps S1321 to substep S1322.
And a substep S1321, searching for each pixel grid in the target space by taking a center point of the pixel grid as a center and a second preset distance as a radius according to a second pitch angle and a second azimuth angle corresponding to each pixel point in the environment image.
In the sub-step S1322, the gray value of the pixel grid is determined according to the gray value corresponding to the searched point, so as to obtain the second projection image.
In this embodiment, according to a second pitch angle and a second azimuth angle corresponding to each point in the environmental image, for each pixel grid in the target space a, a center point of the pixel grid is used as a search center, and a second preset distance is used as a radius to search in a pixel point mapped to the target space a; and determining the gray value of the pixel grid according to the gray values of all the pixel points searched based on the pixel grid. For example, the average of the gray values of all the pixels searched based on the pixel grid is used as the gray value of the pixel grid. The second preset distance may be determined by combining actual requirements. In this way, when the resolution of the environment image is smaller than the size of the target space a, the gray scale value of each pixel can still be determined, and the second projection image can be obtained.
And a substep S133 of performing edge extraction on the second projection image to obtain a second target edge feature.
Similar to the method for obtaining the first target edge feature from the first projection image, the second edge feature can be extracted from the second projection image by using an edge feature extraction algorithm, and then the second edge feature with a longer length is retained as the second target edge feature after filtering according to the length of the edge pixel. Optionally, the pixel length of each second edge feature may be compared with a second preset pixel length, and the second edge feature having the pixel length greater than the second preset pixel length may be used as the second target edge feature.
And a substep S134, performing reverse mapping on the second target edge feature according to a mapping mode used when mapping to the target space, and obtaining the target edge pixel point.
Under the condition of obtaining the second target edge feature, the second target edge feature in the target space a may be reversely mapped back to the corresponding original space (i.e., the camera image coordinate system) according to the mapping relationship between the pixel grid when the second projection image is generated and the corresponding pixel point in the corresponding environment image, so as to obtain the target edge pixel point. Therefore, the target edge pixel point can be automatically determined from the environment image.
Under the condition of obtaining the target edge point cloud and the target edge pixel points, the target edge point cloud can be projected onto the camera image plane corresponding to the camera, and the projected pixel points can be used as the projection result. The initial internal parameters and the initial external parameters are then iteratively updated with the goal of making the distribution of the projection result consistent with the distribution of the target edge pixel points, so as to obtain the target internal parameters and the target external parameters.
Optionally, ICP may be used to calculate the closest-point distance between the projection result and the target edge pixel points. In this case, the closest-point distance is the function value of the cost function, and the initial internal parameters and the initial external parameters can be optimized by taking the cost function corresponding to the closest-point distance as the optimization objective. Other ways of determining the cost function and then optimizing can also be used.
As one possible implementation, the optimization may be accomplished in the manner shown in FIG. 10 to obtain the target internal parameters and the target external parameters. Referring to fig. 10, fig. 10 is a flowchart illustrating the sub-steps included in step S140 in fig. 2. In the present embodiment, step S140 may include sub-steps S141 to S144.
And a substep S141 of calculating a target kernel density estimation function according to the target edge pixel points.
In this embodiment, in order to make the gradient of the cost function continuous and easy to optimize, kernel Density Estimation (KDE) is used, based on a target edge pixel serving as a sample, and based on a preset window width, edge distribution of the target edge pixel is estimated, so as to obtain a target Kernel Density Estimation function.
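A minimal sketch of the KDE step, using a Gaussian kernel from scikit-learn; the window width (bandwidth) value is an illustrative assumption:

```python
import numpy as np
from sklearn.neighbors import KernelDensity

def fit_edge_kde(edge_pixels, bandwidth=10.0):
    """edge_pixels: (N, 2) target edge pixel coordinates in the camera image."""
    kde = KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(edge_pixels)
    return lambda pts: np.exp(kde.score_samples(pts))  # density at query pixel positions
```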
And a substep S142, projecting the target edge point cloud to the camera image plane according to the current internal reference and external reference, and determining the position of each projection point.
Under the condition of obtaining the target edge point cloud, the target edge point cloud can be projected to the camera plane to obtain the two-dimensional coordinates of each projection point. Taking a fisheye camera as an example, with the internal parameter Θ (the polynomial coefficients a_0, ..., a_n, the image center [u_0, v_0] and the distortion correction matrix A) and the external parameter ^C_L T, the projection process is as follows:
^C P = Π( ^C_L T( ^L p ); Θ )
[u, v]^T = A · [r·cos(φ), r·sin(φ)]^T + [u_0, v_0]^T,  where r = F(θ; a_0, ..., a_n) = a_0 + a_1·θ + ... + a_n·θ^n
where ^L p represents a three-dimensional point in the lidar coordinate system {L}; ^C_L T represents the external parameter transformation from the {L} coordinate system to the fisheye camera coordinate system {C} (a rotation ^C_L R and a translation ^C_L t); θ and φ represent the pitch angle and the azimuth angle calculated for the transformed point ^C_L T(^L p); F(θ; a_0, ..., a_n) is the polynomial of the fisheye camera internal reference model, which computes from the pitch angle the pixel radius r centered at [u_0, v_0]; ^C P represents the pixel point coordinates (two-dimensional coordinates in the camera image coordinate system); Π(^C p; Θ) represents the internal parameter transformation that transforms a three-dimensional spatial point from the coordinate system {C} to the two-dimensional fisheye camera image coordinate system; and A represents the distortion correction matrix.
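A sketch of this projection in Python, reusing the polynomial model above; the affine form of the distortion correction is an assumption consistent with formula (4):

```python
import numpy as np

def project_points(pts_L, R_CL, t_CL, coeffs, center, A):
    """Project lidar-frame points onto the fisheye image plane with current parameters."""
    pts_C = pts_L @ R_CL.T + t_CL                                    # apply extrinsics
    theta = np.arccos(pts_C[:, 2] / np.linalg.norm(pts_C, axis=1))   # pitch angle
    phi = np.arctan2(pts_C[:, 1], pts_C[:, 0])                       # azimuth angle
    r = np.polyval(coeffs[::-1], theta)                              # pixel radius r = F(theta)
    d = np.column_stack([r * np.cos(phi), r * np.sin(phi)])
    return d @ A.T + center                                          # (N, 2) pixel coordinates
```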
And a substep S143 of calculating a function value corresponding to each projection point according to the position of each projection point and the target kernel density estimation function.
And substituting the position coordinates of the projection points into the target kernel density estimation function aiming at each projection point, and calculating to obtain a function value corresponding to the projection points, namely calculating the value of the projection points on the KDE distribution probability density function.
And a substep S144, adjusting the currently used internal parameters and external parameters with the goal of maximizing the function values, and jumping back to the function-value calculation step with the adjusted internal parameters and external parameters, so as to obtain the target internal parameters and the target external parameters after the iterative updating is finished.
The sum of squares of the function values of all the projection points can be used as the cost function, or the sum of squares of the function values of all the projection points divided by the number of projection points can be used as the cost function; the optimization is then performed with a nonlinear optimization method (for example, the Levenberg-Marquardt (L-M) method) with maximizing the cost function as the objective, so as to obtain the optimized values of the internal and external parameters.
In the optimization process, a function value corresponding to the radar projection point can be obtained through calculation according to the current internal and external parameters and the current target kernel density estimation function. Then, the current internal parameter and the current external parameter are adjusted according to the function value, the adjusted internal parameter and the adjusted external parameter are used as the updated current internal parameter and the updated external parameter, and then the substep S142 is skipped to recalculate the function value corresponding to the radar projection point. When the preset times of the process are repeated or the calculated function value meets the requirement, the iteration can be determined to be completed, and the optimal internal and external parameters are obtained.
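One optimization round could be sketched as follows, reusing project_points and the KDE density above; the axis-angle rotation parameterization and the derivative-free optimizer are assumptions (the embodiment only mentions nonlinear optimization such as the L-M method):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def optimize_round(x0, edge_cloud, edge_density, n_poly):
    """x layout (assumed): [rotvec(3), t(3), a_0..a_n, u0, v0, c, d, e]."""
    def unpack(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:6]
        coeffs = x[6:6 + n_poly]
        center = x[6 + n_poly:8 + n_poly]
        A = np.array([[x[-3], x[-2]], [x[-1], 1.0]])
        return R, t, coeffs, center, A

    def neg_cost(x):
        R, t, coeffs, center, A = unpack(x)
        uv = project_points(edge_cloud, R, t, coeffs, center, A)
        return -np.mean(edge_density(uv) ** 2)          # maximize mean squared density

    return minimize(neg_cost, x0, method="Nelder-Mead").x
```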
Optionally, in order to make the gradient of the cost function as effective as possible for the optimization, multiple rounds of optimization may be performed, gradually approximating the parameter values to correct values. To enable efficient optimization in different regions, the window width of the KDE may be readjusted after each round of optimization. That is, in the parameter optimization process, the window width corresponding to the target kernel density estimation function may also be adjusted, and the updated target kernel density estimation function is obtained by calculation according to the adjusted window width. The window width before adjustment is larger than the window width after adjustment, and a target kernel density estimation function corresponding to one window width is used for calculating a plurality of groups of function values corresponding to internal and external parameters. And after the updated target kernel density estimation function is obtained, calculating a function value again by using the function, and optimizing the internal and external parameters.
In KDE, the window width determines how many sampling points the kernel function is affected by, i.e., whether the estimated probability density function is smooth or not. At the beginning of optimization, the window width can be set to be larger, and the smooth function can enable the optimization to rapidly approach the optimal solution area. And after entering the optimal solution area, gradually reducing the window width to increase the gradient of the local area and enable the optimization to continuously approach the optimal solution.
Within one round of optimization, the current round can be deemed complete when a preset number of iterations has been performed or when the calculated value of the cost function exceeds a preset value.
In this mode, the current target kernel density estimation function is first computed under one window width, and the function values of the radar projection points under the current internal and external parameters and the current window width are calculated through substeps S142-S143. The current internal and external parameters are adjusted based on these function values, the process jumps back to substep S142, and this is repeated a certain number of times to obtain the optimal internal and external parameters for the current window width. The window width may then be adjusted, the target kernel density estimation function re-estimated with the new window width, and the internal and external parameters adjusted again based on the new target kernel density estimation function. When the adjustment of the window width and of the internal and external parameters stops, the target internal parameters and the target external parameters are obtained.
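Putting these pieces together, the multi-round, coarse-to-fine schedule might look like the sketch below; it reuses the negative_cost helper sketched earlier, and the bandwidth schedule and optimizer choice are illustrative assumptions rather than values taken from the patent:

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import minimize

def coarse_to_fine_calibration(edge_pixels, edge_cloud, project, params0,
                               bandwidths=(0.3, 0.15, 0.05)):
    params = np.asarray(params0, dtype=float)
    for bw in bandwidths:                        # window width shrinks each round
        kde = gaussian_kde(edge_pixels, bw_method=bw)
        res = minimize(negative_cost, params,
                       args=(edge_cloud, kde, project), method="Nelder-Mead")
        params = res.x                           # warm-start the next round
    return params                                # final internal and external parameters
```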
Next, the internal and external parameter calibration method for a fisheye camera and a laser radar shown in fig. 3 is described with reference to fig. 11. Here, the vertical field of view of the fisheye camera is larger than the vertical field of view of the lidar, and the lidar is a non-repetitive-scanning lidar.
First, pre-processing is performed. The laser radar scans from different views (LiDAR scans from different views), and point cloud registration is performed on the scanned point clouds to obtain the environment point cloud. The fisheye camera captures images with different exposure durations (fisheye images with different exposure times), and exposure fusion is performed on the multiple fisheye images to obtain the environment image. The environment point cloud and the environment image are then each projected by azimuth and pitch angles (projection by azimuth and pitch angles), yielding a first projection image corresponding to the environment point cloud and a second projection image corresponding to the environment image.
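As a hedged sketch of the "projection by azimuth and pitch angles" step on the point cloud side, the following assumes the points have already been transformed into the camera coordinate system as an N x 3 array with z forward and y down (an assumed axis convention, not stated in the patent) and fills a reflectivity image indexed by the two angles:

```python
import numpy as np

def project_by_angles(points_cam, reflectivity, height=1024, width=2048):
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    azimuth = np.arctan2(x, z)                    # horizontal angle around the camera
    pitch = np.arctan2(y, np.hypot(x, z))         # vertical angle
    # Discretize the two angles into pixel-grid coordinates of the target space.
    u = ((azimuth + np.pi) / (2.0 * np.pi) * (width - 1)).astype(int)
    v = ((pitch + np.pi / 2.0) / np.pi * (height - 1)).astype(int)
    image = np.zeros((height, width), dtype=np.float32)
    image[v, u] = reflectivity                    # grey value taken from reflectivity
    return image
```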
Next, edge extraction is performed. Canny edge extraction is applied to the first projection image and the second projection image respectively. The extracted edge features can then be reverse-mapped to obtain the original point cloud and the original fisheye pixels corresponding to the edge features.
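A minimal sketch of this stage, assuming OpenCV and illustrative thresholds; the surviving pixel coordinates are what would later be reverse-mapped to the original points and fisheye pixels:

```python
import cv2
import numpy as np

def extract_edge_coordinates(projection_image, low=50, high=150):
    # Normalize the projection image to 8-bit before running Canny.
    img8 = cv2.normalize(projection_image, None, 0, 255,
                         cv2.NORM_MINMAX).astype(np.uint8)
    edges = cv2.Canny(img8, low, high)     # binary edge map
    return np.argwhere(edges > 0)          # (row, col) coordinates of edge pixels
```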
Then, iterative optimization is performed. For the original fisheye pixels corresponding to the determined edge features, edge distribution estimation is carried out by KDE to obtain an edge probability prediction function. The original point cloud corresponding to the determined edge features is projected (point cloud projection), and the edge probability of each projected point is calculated with the edge probability prediction function (point projection and edge probability). The objective is to find the internal and external parameters that maximize the cost function, namely

$$(\theta_{in}^{*}, \theta_{ex}^{*}) = \arg\max_{\theta_{in},\,\theta_{ex}} \frac{1}{N}\sum_{i=1}^{N} f\big(\pi(p_i;\,\theta_{in},\theta_{ex})\big)^{2}$$

where $f(\cdot)$ is the edge probability prediction function given by the KDE, $\pi(\cdot)$ projects the i-th lidar edge point $p_i$ onto the fisheye image plane using the internal parameters $\theta_{in}$ and external parameters $\theta_{ex}$, and $N$ is the number of projection points.
Optimization is performed using the L-M method (optimization by L-M method). After each parameter adjustment, the parameters are updated (update parameters) and the current cost value is recalculated based on the updated parameters.
Once the optimal internal and external parameters for the current window width have been determined, the window width is reset (set bandwidth), a new edge probability prediction function is calculated based on the reset window width, and parameter optimization is continued under the newly set window width.
For the fused use of a fisheye camera and a laser radar, the embodiment of the present application provides a method that calibrates all internal and external parameters jointly, so that the estimation of all internal and external parameters is completed within a single optimization process. The non-repetitive scanning characteristic of the Livox lidar is exploited to accumulate a sufficiently dense (sub-pixel level) point cloud; edge features are extracted from the reflectivity projection image and from the projection image of the fisheye camera; the edge distribution probability density function of the fisheye camera is estimated through kernel density estimation (KDE); the edge feature points of the lidar are projected onto the fisheye camera image plane through the internal and external parameter transformation; and the mean square value of the probability density of the projection points on the fisheye edge distribution is maximized to obtain the optimal internal and external parameters.
In order to perform the corresponding steps in the above embodiments and various possible manners, an implementation manner of the internal and external reference calibration apparatus 200 is given below, and optionally, the internal and external reference calibration apparatus 200 may adopt the device structure of the electronic device 100 shown in fig. 1. Further, referring to fig. 12, fig. 12 is a block diagram illustrating an internal and external reference calibration apparatus 200 according to an embodiment of the present disclosure. It should be noted that the internal and external reference calibration apparatus 200 provided in the present embodiment has the same basic principle and technical effects as those of the above embodiments, and for brief description, reference may be made to corresponding contents in the above embodiments for parts that are not mentioned in the present embodiment. In this embodiment, the internal and external reference calibration apparatus 200 may include: an obtaining module 210, a first edge determining module 220, a second edge determining module 230, and an optimizing module 240.
The obtaining module 210 is configured to obtain an environment point cloud and an environment image corresponding to the same environment. Wherein the environmental point cloud is obtained through a laser radar, and the environmental image is obtained through a camera.
The first edge determining module 220 is configured to perform angle mapping, edge feature extraction, and reverse mapping on the environment point cloud according to the initial external parameters to obtain a target edge point cloud in the environment point cloud, where the space mapped to is a target space in which the pitch angle and the azimuth angle in the camera coordinate system of the camera are orthogonal axes.
The second edge determining module 230 is configured to perform angle mapping, edge feature extraction, and reverse mapping on the environment image according to the initial internal parameters, so as to obtain target edge pixel points in the environment image, where the space to which the environment image is mapped is the target space.
The optimization module 240 is configured to iteratively update the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point cloud, so as to obtain target internal parameters of the camera and target external parameters between the laser radar and the camera.
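Purely as an illustrative sketch (the class and method names below are assumptions, not part of the embodiment), the four modules could be grouped as follows in Python:

```python
class InternalExternalCalibrationDevice:
    """Illustrative grouping of the obtaining, edge-determining and optimization modules."""

    def __init__(self, obtaining, first_edge, second_edge, optimization):
        self.obtaining_module = obtaining          # environment point cloud + image
        self.first_edge_module = first_edge        # target edge point cloud (lidar)
        self.second_edge_module = second_edge      # target edge pixel points (camera)
        self.optimization_module = optimization    # iterative parameter update

    def calibrate(self, init_intrinsics, init_extrinsics):
        cloud, image = self.obtaining_module.obtain()
        cloud_edges = self.first_edge_module.determine(cloud, init_extrinsics)
        pixel_edges = self.second_edge_module.determine(image, init_intrinsics)
        return self.optimization_module.optimize(pixel_edges, cloud_edges,
                                                 init_intrinsics, init_extrinsics)
```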
Alternatively, the above modules may be stored in the memory 110 shown in fig. 1 in the form of software or firmware, or embedded in the operating system (OS) of the electronic device 100, and may be executed by the processor 120 in fig. 1. Meanwhile, the data, program code, and the like required to execute the above modules may be stored in the memory 110.
The embodiment of the application also provides a readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the internal and external reference calibration method is realized.
To sum up, the embodiments of the present application provide an internal and external parameter calibration method and apparatus, an electronic device, and a readable storage medium. In the method, an environment image captured by a camera is obtained, together with an environment point cloud acquired by a laser radar for the same environment. Based on the initial external parameters, angle mapping, edge feature extraction, and reverse mapping are performed on the environment point cloud to obtain the target edge point cloud in the environment point cloud, the space mapped to being a target space whose orthogonal axes are the pitch angle and the azimuth angle in the camera coordinate system of the camera. Based on the initial internal parameters, angle mapping, edge feature extraction, and reverse mapping are performed on the environment image to obtain the target edge pixel points in the environment image, the space to which the environment image is mapped being the same target space. Finally, the initial internal and external parameters are iteratively updated according to the target edge pixel points and the target edge point cloud to obtain the target internal parameters of the camera and the target external parameters between the laser radar and the camera. In this way, joint calibration of the camera and lidar internal and external parameters can be completed automatically in an environment without markers, which addresses the problems of existing camera intrinsic calibration methods and camera-lidar extrinsic calibration methods and avoids the error amplification caused by a two-step calibration process.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The foregoing is illustrative of only alternative embodiments of the present application and is not intended to limit the present application, which may be modified or varied by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An internal and external reference calibration method is characterized by comprising the following steps:
obtaining an environment point cloud and an environment image corresponding to the same environment, wherein the environment point cloud is obtained through a laser radar, and the environment image is obtained through a camera;
according to the initial external parameters, carrying out angle mapping, edge feature extraction and reverse mapping on the environment point cloud to obtain a target edge point cloud in the environment point cloud, wherein the space mapped to the target edge point cloud is a target space with a pitch angle and an azimuth angle in a camera coordinate system of the camera as orthogonal axes;
according to the initial internal reference, carrying out angle mapping, edge feature extraction and reverse mapping on the environment image to obtain target edge pixel points in the environment image, wherein the space to which the environment image is mapped is the target space;
and iteratively updating the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point cloud so as to obtain target internal parameters of the camera and target external parameters between the laser radar and the camera.
2. The method of claim 1, wherein the obtaining the target edge point cloud in the environment point cloud by performing angle mapping, edge feature extraction and reverse mapping on the environment point cloud according to the initial external parameters comprises:
transferring the environmental point cloud to the camera coordinate system according to the initial external parameters to obtain a converted environmental point cloud;
calculating a first pitch angle and a first azimuth angle corresponding to each point in the converted environment point cloud according to the coordinate of the point in the camera coordinate system;
generating a first projection image in the target space according to a first pitch angle, a first azimuth angle and reflectivity corresponding to each point;
performing edge extraction on the first projection image to obtain a first target edge feature;
and reversely mapping the first target edge feature according to a mapping mode used when the first target edge feature is mapped to the target space to obtain the target edge point cloud.
3. The method of claim 2, wherein the generating a first projection image in the target space according to the first pitch angle, the first azimuth angle and the reflectivity corresponding to each point comprises:
searching, for each pixel grid in the target space and according to the first pitch angle and the first azimuth angle corresponding to each point, for the points located within a first preset distance of the center point of the pixel grid;
and determining the gray value of the pixel grid according to the reflectivity corresponding to the searched point to obtain the first projection image.
4. The method according to claim 1, wherein the obtaining target edge pixel points in the environment image by performing angle mapping, edge feature extraction and reverse mapping on the environment image according to the initial internal reference comprises:
calculating to obtain a second pitch angle and a second azimuth angle corresponding to each pixel point in the environment image according to the initial internal reference and the environment image;
generating a second projection image in the target space according to a second pitch angle, a second azimuth angle and a gray value corresponding to each pixel point in the environment image;
performing edge extraction on the second projection image to obtain a second target edge feature;
and reversely mapping the second target edge feature according to a mapping mode used when the second target edge feature is mapped to the target space to obtain the target edge pixel point.
5. The method of claim 1,
when the field of view of the camera in the vertical direction is larger than the field of view of the laser radar in the vertical direction, the obtaining of the environment point cloud corresponding to the same environment comprises:
obtaining initial point clouds acquired by the lidar at different angles;
splicing the initial point clouds at the different angles to obtain the environment point cloud, wherein the field of view corresponding to the environment point cloud is not smaller than the field of view corresponding to the environment image;
and/or,
the obtaining of the environment image corresponding to the same environment comprises:
obtaining initial images captured by the camera in the environment, wherein the initial images are captured by the camera with different exposure durations;
and performing exposure fusion on the obtained plurality of initial images to obtain the environment image.
6. The method according to any one of claims 1 to 5, wherein the iteratively updating the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point cloud to obtain the target internal parameters of the camera and the target external parameters between the lidar and the camera comprises:
performing iterative updating on the initial internal parameters and the initial external parameters with consistency between the distribution of the projection result and the distribution of the target edge pixel points as the target, to obtain the target internal parameters and the target external parameters, wherein the projection result is the pixel points obtained when the target edge point cloud is projected onto a camera image plane corresponding to the camera.
7. The method according to claim 6, wherein the performing iterative updating on the initial internal parameters and the initial external parameters with consistency between the distribution of the projection result and the distribution of the target edge pixel points as the target, to obtain the target internal parameters and the target external parameters, comprises:
calculating to obtain a target kernel density estimation function according to the target edge pixel points;
projecting the target edge point cloud to the camera image plane according to the current internal reference and external reference, and determining the position of each projection point;
calculating to obtain a function value corresponding to each projection point according to the position of each projection point and the target kernel density estimation function;
and adjusting the currently used internal parameters and external parameters with maximization of the function values as the objective, and jumping, with the adjusted internal parameters and external parameters, back to the step of projecting the target edge point cloud onto the camera image plane according to the current internal parameters and external parameters and determining the position of each projection point, until the iterative updating is finished and the target internal parameters and the target external parameters are obtained.
8. The method according to claim 7, wherein the iteratively updating the initial internal parameters and the initial external parameters to obtain the target internal parameters and the target external parameters with the distribution of the projection result consistent with the distribution of the target edge pixel points as a target, further comprises:
in the iterative updating process, the window width corresponding to the target kernel density estimation function is adjusted, and the updated target kernel density estimation function is obtained through calculation according to the adjusted window width, wherein the window width before adjustment is larger than the adjusted window width, and the target kernel density estimation function corresponding to one window width is used for calculating multiple groups of function values corresponding to internal and external parameters.
9. An internal and external reference calibration device, characterized in that the device comprises:
the system comprises an obtaining module, a processing module and a processing module, wherein the obtaining module is used for obtaining an environment point cloud and an environment image corresponding to the same environment, the environment point cloud is obtained through a laser radar, and the environment image is obtained through a camera;
the first edge determining module is used for carrying out angle mapping, edge feature extraction and reverse mapping on the environment point cloud according to initial external parameters to obtain target edge point cloud in the environment point cloud, wherein the space mapped to the target edge point cloud is a target space with a pitch angle and an azimuth angle in a camera coordinate system of the camera as orthogonal axes;
the second edge determining module is used for carrying out angle mapping, edge feature extraction and reverse mapping on the environment image according to the initial internal parameters to obtain target edge pixel points in the environment image, wherein the space to which the environment image is mapped is the target space;
and the optimization module is used for iteratively updating the initial internal parameters and the initial external parameters according to the target edge pixel points and the target edge point clouds so as to obtain the target internal parameters of the camera and the target external parameters between the laser radar and the camera.
10. An electronic device, comprising a processor and a memory, the memory storing machine-executable instructions executable by the processor to implement the internal and external reference calibration method of any one of claims 1 to 8.
CN202211281195.2A 2022-10-19 2022-10-19 Internal and external parameter calibration method and device and electronic equipment Pending CN115661262A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211281195.2A CN115661262A (en) 2022-10-19 2022-10-19 Internal and external parameter calibration method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211281195.2A CN115661262A (en) 2022-10-19 2022-10-19 Internal and external parameter calibration method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN115661262A true CN115661262A (en) 2023-01-31

Family

ID=84989207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211281195.2A Pending CN115661262A (en) 2022-10-19 2022-10-19 Internal and external parameter calibration method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN115661262A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115953484A (en) * 2023-03-13 2023-04-11 福思(杭州)智能科技有限公司 Parameter calibration method and device for detection equipment, storage medium and electronic device
CN115953484B (en) * 2023-03-13 2023-07-04 福思(杭州)智能科技有限公司 Parameter calibration method and device of detection equipment, storage medium and electronic device
CN115994955A (en) * 2023-03-23 2023-04-21 深圳佑驾创新科技有限公司 Camera external parameter calibration method and device and vehicle
CN115994955B (en) * 2023-03-23 2023-07-04 深圳佑驾创新科技有限公司 Camera external parameter calibration method and device and vehicle
CN116152333A (en) * 2023-04-17 2023-05-23 天翼交通科技有限公司 Method, device, equipment and medium for calibrating camera external parameters
CN116152333B (en) * 2023-04-17 2023-09-01 天翼交通科技有限公司 Method, device, equipment and medium for calibrating camera external parameters
CN116740197A (en) * 2023-08-11 2023-09-12 之江实验室 External parameter calibration method and device, storage medium and electronic equipment
CN116740197B (en) * 2023-08-11 2023-11-21 之江实验室 External parameter calibration method and device, storage medium and electronic equipment
CN117437303A (en) * 2023-12-18 2024-01-23 江苏尚诚能源科技有限公司 Method and system for calibrating camera external parameters
CN117437303B (en) * 2023-12-18 2024-02-23 江苏尚诚能源科技有限公司 Method and system for calibrating camera external parameters

Similar Documents

Publication Publication Date Title
CN115661262A (en) Internal and external parameter calibration method and device and electronic equipment
CN107316325B (en) Airborne laser point cloud and image registration fusion method based on image registration
CN107167788B (en) Method and system for obtaining laser radar calibration parameters and laser radar calibration
CN105096329B (en) Method for accurately correcting image distortion of ultra-wide-angle camera
CN112270713B (en) Calibration method and device, storage medium and electronic device
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
CN112444242A (en) Pose optimization method and device
CN111583119B (en) Orthoimage splicing method and equipment and computer readable medium
WO2012126500A1 (en) 3d streets
CN108332752B (en) Indoor robot positioning method and device
CN108364279B (en) Method for determining pointing deviation of stationary orbit remote sensing satellite
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN113947638B (en) Method for correcting orthographic image of fish-eye camera
CN108801225B (en) Unmanned aerial vehicle oblique image positioning method, system, medium and equipment
CN112270698A (en) Non-rigid geometric registration method based on nearest curved surface
CN111260539A (en) Fisheye pattern target identification method and system
CN117665841B (en) Geographic space information acquisition mapping method and device
CN115359130B (en) Radar and camera combined calibration method and device, electronic equipment and storage medium
CN115423863B (en) Camera pose estimation method and device and computer readable storage medium
Bybee et al. Method for 3-D scene reconstruction using fused LiDAR and imagery from a texel camera
WO2021212319A1 (en) Infrared image processing method, apparatus and system, and mobile platform
CN115937325B (en) Vehicle-end camera calibration method combined with millimeter wave radar information
CN115588127B (en) Method for fusing airborne laser point cloud and multispectral image
CN114792343B (en) Calibration method of image acquisition equipment, method and device for acquiring image data
CN114387532A (en) Boundary identification method and device, terminal, electronic equipment and unmanned equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination