CN111308448B - External parameter determining method and device for image acquisition equipment and radar - Google Patents


Info

Publication number
CN111308448B
CN111308448B (application CN201811504941.3A)
Authority
CN
China
Prior art keywords
point cloud
image
preset
target
calibration
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN201811504941.3A
Other languages
Chinese (zh)
Other versions
CN111308448A (en
Inventor
万富华
吕吉鑫
孙杰
Current Assignee (listing may be inaccurate)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201811504941.3A
Publication of CN111308448A
Application granted
Publication of CN111308448B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to group G01S17/00
    • G01S 7/497 Means for monitoring or calibrating

Abstract

An embodiment of the application provides a method and device for determining the external parameters of an image acquisition device and a radar. The method comprises: acquiring initial values of the external parameters of the image acquisition device and a lidar, the internal parameters of the image acquisition device, image data acquired by the image acquisition device, and point cloud data acquired by the lidar, where a preset calibration plate, having both image features and reflection features, is arranged in the common detection area of the image acquisition device and the lidar; converting the image data and the point cloud data into the same coordinate system according to the initial values of the external parameters and the internal parameters; and adjusting the values of the external parameters and outputting the calibration values of the external parameters when the image data and the point cloud data satisfy a preset registration condition, the condition representing that the preset calibration plate image in the image data is registered with the preset calibration plate point cloud in the point cloud data. The method determines the external parameters of the image acquisition device and the radar, and determines them accurately.

Description

External parameter determining method and device for image acquisition equipment and radar
Technical Field
The application relates to the technical field of equipment parameter calibration, in particular to an external parameter determination method and device for image acquisition equipment and a radar.
Background
For scenes with high measurement-precision requirements, multi-sensor information fusion yields more accurate parameters than a single sensor and improves the reliability and fault tolerance of the system. The prerequisite for information fusion among multiple sensors is joint calibration.
An image acquisition device can acquire visible-light image information, but is easily affected by external factors such as weather and illumination, and lacks three-dimensional data about the target object. A lidar can quickly acquire three-dimensional data about a spatial target object, but cannot capture information such as the object's texture and color. The two sensors therefore complement each other well; however, because the data each sensor acquires is expressed in its own coordinate system, the data must be represented in a single common coordinate system, which requires determining the external parameters of the image acquisition device and the radar.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for determining external parameters of an image acquisition device and a radar, so as to determine the external parameters of the image acquisition device and the radar. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a method for determining the external parameters of an image acquisition device and a radar, where the method includes:
acquiring initial values of external parameters of image acquisition equipment and a laser radar, internal parameters of the image acquisition equipment, image data acquired by the image acquisition equipment and point cloud data acquired by the laser radar, wherein a preset calibration plate is arranged in a common detection area of the image acquisition equipment and the laser radar and has image characteristics and reflection characteristics;
converting the image data and the point cloud data to the same coordinate system according to the initial value of the external parameter and the internal parameter;
and adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, wherein the preset registration condition represents that a preset calibration plate image in the image data and a preset calibration plate point cloud in the point cloud data are registered.
Optionally, the converting the image data and the point cloud data into the same coordinate system according to the initial value of the external parameter and the internal parameter includes:
determining a target point cloud corresponding to the preset calibration plate in the point cloud data;
converting the target point cloud into a coordinate system of the image data according to the initial value of the external parameter and the internal parameter;
the adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset coincidence condition, comprises:
and adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud meet a preset coincidence condition.
Optionally, the determining the target point cloud corresponding to the preset calibration plate in the point cloud data includes:
determining a target calibration area of the preset calibration plate in the image data;
determining a mapping area of the target calibration area in the point cloud data according to the initial value of the external parameter and the internal parameter;
determining initial point clouds in the point cloud data according to the mapping areas;
and filtering out point clouds with different depths from the preset calibration plate in the initial point cloud based on the depths of all the points to obtain target point cloud.
Optionally, in the preset calibration board the reflection intensity of higher-grayscale regions is higher than that of lower-grayscale regions, and the method further includes:
carrying out binarization and blurring on the image data to obtain a target calibration image;
clustering each point in the target point cloud by a preset clustering method according to the reflection intensity to obtain a high-reflection-intensity point cloud and/or a low-reflection-intensity point cloud, wherein the high-reflection-intensity point cloud is formed by points with reflection intensities larger than a preset intensity threshold value, and the low-reflection-intensity point cloud is formed by points with reflection intensities lower than the preset intensity threshold value;
the adjusting the value of the external parameter, and when the image data and the target point cloud satisfy a preset coincidence condition, outputting a calibration value of the external parameter, including:
and taking the average value of the gray levels of the target point cloud and the projected image of the target calibration image as an optimization target, taking the value of the external parameter as an optimization variable, and determining the optimal value of the optimization variable by a preset nonlinear optimization method to obtain the calibration value of the external parameter.
Optionally, before the binarization and blurring of the image data to obtain the target calibration image, the method further includes:
carrying out de-distortion processing on the image data according to the internal parameters.
In a second aspect, an embodiment of the present application provides an external parameter determining apparatus for an image capturing device and a radar, where the apparatus includes:
a parameter acquisition module, configured to acquire initial values of the external parameters of the image acquisition device and a lidar, the internal parameters of the image acquisition device, image data acquired by the image acquisition device, and point cloud data acquired by the lidar, where a preset calibration plate having image features and reflection features is arranged in the common detection area of the image acquisition device and the lidar;
the coordinate conversion module is used for converting the image data and the point cloud data into the same coordinate system according to the initial values of the external parameters and the internal parameters;
and the calibration value output module is used for adjusting the value of the external parameter and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset coincidence condition, wherein the preset coincidence condition represents that a preset calibration plate image in the image data is coincided with a preset calibration plate point cloud in the point cloud data.
Optionally, the coordinate conversion module includes:
the target point cloud determining submodule is used for determining a target point cloud corresponding to the preset calibration plate in the point cloud data;
the target point cloud mapping submodule is used for converting the target point cloud into a coordinate system of the image data according to the initial value of the external parameter and the internal parameter;
the calibration value output module is specifically configured to:
and adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud meet a preset coincidence condition.
Optionally, the target point cloud determining sub-module includes:
a calibration area determining unit, configured to determine a target calibration area of the preset calibration plate in the image data;
a mapping area determining unit, configured to determine a mapping area of the target calibration area in the point cloud data according to the initial value of the external parameter and the internal parameter;
an initial point cloud determining unit, configured to determine an initial point cloud in the point cloud data according to the mapping region;
and the point cloud filtering unit is used for filtering point clouds with different depths from the preset calibration plate in the initial point cloud based on the depths of all points to obtain a target point cloud.
Optionally, in the preset calibration board the reflection intensity of higher-grayscale regions is higher than that of lower-grayscale regions, and the apparatus further includes:
an image processing module, configured to carry out binarization and blurring on the image data to obtain a target calibration image;
the point cloud clustering module is used for clustering each point in the target point cloud by a preset clustering method according to the reflection intensity to obtain a high-reflection-intensity point cloud and/or a low-reflection-intensity point cloud, wherein the high-reflection-intensity point cloud is formed by points with reflection intensity larger than a preset intensity threshold, and the low-reflection-intensity point cloud is formed by points with reflection intensity lower than the preset intensity threshold;
the calibration value output module is specifically configured to:
and taking the average value of the gray levels of the target point cloud and the projected image of the target calibration image as an optimization target, taking the value of the external parameter as an optimization variable, and determining the optimal value of the optimization variable by a preset nonlinear optimization method to obtain the calibration value of the external parameter.
Optionally, the apparatus further comprises:
and the distortion removal module is used for carrying out distortion removal processing on the image data according to the internal parameters.
In a third aspect, an embodiment of the present application provides an external parameter determining apparatus for an image capturing device and a radar, where the apparatus includes:
an image processing module, configured to determine the area where the preset calibration plate is located in the image data of the preset calibration plate, and to binarize and blur the image of that area to obtain a target calibration area image;
a laser point cloud processing module configured to project the reflection intensity point cloud onto the image data according to the initial value of the external parameter and the internal parameter; taking the point cloud at the target position as the initial point cloud reflected by the preset calibration plate in the projected reflection intensity point cloud; based on the depth of each point cloud in the reflection intensity point cloud, filtering out point clouds with different depths from the preset calibration plate in the initial point cloud to obtain filtered point clouds; according to the reflection intensity, determining a high reflection intensity point cloud and a low reflection intensity point cloud in the filtered point cloud by a preset clustering method, and taking the high reflection intensity point cloud and/or the low reflection intensity point cloud as target point clouds;
and the nonlinear optimization module is configured to convert the target point cloud and the target calibration area into the same coordinate system according to the initial values of the external parameters and the internal parameters, take the average value of the gray levels of the projection images of the target point cloud and the target calibration area as an optimization target, take the value of the external parameters as an optimization variable, determine the optimal value of the optimization variable through a preset nonlinear optimization method, and obtain the calibration value of the external parameters.
In a fourth aspect, an embodiment of the present application provides an electronic device, which includes a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method for determining external parameters of the image capturing device and the radar according to any one of the first aspect described above when executing the program stored in the memory.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the method for determining external parameters of an image capturing apparatus and a radar according to any one of the above first aspects is implemented.
The method and device for determining the external parameters of an image acquisition device and a radar acquire initial values of the external parameters of the image acquisition device and a lidar, the internal parameters of the image acquisition device, image data acquired by the image acquisition device, and point cloud data acquired by the lidar, where a preset calibration plate, having both image features and reflection features, is arranged in the common detection area of the image acquisition device and the lidar; convert the image data and the point cloud data into the same coordinate system according to the initial values of the external parameters and the internal parameters; and adjust the values of the external parameters, outputting the calibration values of the external parameters when the image data and the point cloud data satisfy a preset registration condition, the condition representing that the preset calibration plate image in the image data is registered with the preset calibration plate point cloud in the point cloud data. The external parameters of the image acquisition device and the radar can thus be determined, and determined accurately. Of course, no single product or method of the present application necessarily achieves all of the above advantages at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present application and the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a first flowchart of an image acquisition device and a radar external parameter determination method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an embodiment of the present application illustrating a default calibration board;
fig. 3 is a second flowchart of an image acquisition device and a radar external parameter determination method according to an embodiment of the present application;
fig. 4 is a first schematic diagram of an image acquisition device and an external parameter determination apparatus of a radar according to an embodiment of the present application;
fig. 5 is a first schematic diagram of a workflow of an image acquisition device and an external parameter determining apparatus of a radar according to an embodiment of the present disclosure;
fig. 6 is a second schematic diagram of a workflow of the image acquisition device and the external parameter determining apparatus of the radar according to the embodiment of the present application;
FIG. 7 is a third schematic diagram of a workflow of an image capturing device and an external reference determining apparatus of a radar according to an embodiment of the present disclosure;
FIG. 8 is a fourth schematic diagram of a workflow of an image capturing device and an external reference determining apparatus of a radar according to an embodiment of the present disclosure;
FIG. 9 is a second schematic diagram of an image capturing device and an external parameter determining apparatus of a radar according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of an electronic device according to an embodiment of the application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
First, terms of art in the embodiments of the present application are explained.
Laser radar: an optical distance sensor can sense the geometric shape and the reflection intensity of a surrounding object in a laser ranging mode. The obtained data is a three-dimensional point cloud with reflection intensity information. In general, there are a plurality of scanning lines in the lidar, and the scanning lines are distributed in a certain regular manner in the vertical direction of the radar. When the radar works, the radar can horizontally rotate and obtain the distance of each scanning line corresponding to a certain horizontal angle, so that the three-dimensional environment can be sensed.
External parameters (extrinsics): the relative pose relationship, which in the embodiments of the present application refers to the distance and angle between the image acquisition device and the radar. The usual mathematical representation is a 4x4 rotation-translation matrix.
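The 4x4 rotation-translation representation mentioned above can be sketched in a few lines; the helper names are illustrative, not from the patent.

```python
import numpy as np

def extrinsic_matrix(rotation, translation):
    """Pack a 3x3 rotation and a 3-vector translation into the 4x4
    homogeneous rotation-translation matrix described above."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def transform(T, points):
    """Apply the extrinsics T to an (N, 3) array of lidar points."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (T @ homo.T).T[:, :3]
```

With an identity rotation, applying the matrix simply shifts each point by the translation vector.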
Internal parameters (intrinsics) of the image acquisition device: parameters of the image acquisition device itself, generally including the focal length, principal point, distortion coefficients, and so on.
Calibration plate: an object designed and manufactured to serve a calibration function, with known shape, size, color, and other characteristics. Since it is generally plate-shaped, it is called a calibration plate. During calibration, sensor data captured while the calibration plate is being detected is collected to realize the calibration function.
Initial value: a value obtained by preliminary processing that is close to the accurate value but still carries some error.
Image binarization: converting a grayscale image into a black-and-white (binary) image.
In order to accurately obtain external references of an image acquisition device and a laser radar, an embodiment of the present application provides an external reference determination method of an image acquisition device and a radar, referring to fig. 1, where the method includes:
and S11, acquiring initial values of external parameters of image acquisition equipment and a laser radar, internal parameters of the image acquisition equipment, image data acquired by the image acquisition equipment and point cloud data acquired by the laser radar, wherein a preset calibration plate is arranged in a common detection area of the image acquisition equipment and the laser radar, and the preset calibration plate has image characteristics and reflection characteristics.
The method for determining the external parameters of the image acquisition equipment and the laser radar in the embodiment of the application can be realized through a calibration system, and the calibration system is any system capable of realizing the method for determining the external parameters of the image acquisition equipment and the laser radar in the embodiment of the application. For example:
the calibration system may be an electronic device comprising: a processor, a memory, a communication interface, and a bus; the processor, the memory and the communication interface are connected through a bus and complete mutual communication; the memory stores executable program code; the processor reads the executable program codes stored in the memory to run programs corresponding to the executable program codes, so as to execute the image acquisition device and the external parameter determination method of the laser radar in the embodiment of the application.
The calibration system may also be an application program, which is used to execute the image acquisition device and the external parameter determination method for the lidar according to the embodiment of the present application during running.
The calibration system may also be a storage medium for storing executable code for executing the image acquisition device and the method for determining external parameters of a lidar according to the embodiments of the present application.
As shown in fig. 2, the image acquisition device and the lidar are mounted on a base, and the external parameters to be determined are the distance and angle between them. The image acquisition device is any device capable of capturing image data, for example a camera or a video camera. The internal parameters of the image acquisition device can be taken directly from its factory specifications. The initial values of the external parameters of the image acquisition device and the lidar can be obtained by any suitable external parameter determination method: for example, they may be values measured with a measurement tool, or the mounting values of the base, and so on.
The point cloud data includes the position information and reflection-intensity information of each reflected point. A preset calibration plate is arranged in advance in the common detection area of the image acquisition device and the lidar, so that the image data acquired by the image acquisition device includes an image of the preset calibration plate, and the point cloud data acquired by the lidar includes points reflected by it. In the embodiments of the present application, the preset calibration plate should remain at the same position while the image acquisition device acquires the image data and while the lidar acquires the reflection-intensity point cloud.
The preset calibration plate has image features and reflection features. The image features mean that the preset calibration plate can be distinguished from the background, making it easy to locate in the image data; in particular, the image feature may be a grayscale feature. The reflection features mean that the preset calibration plate reflects the scanning beam of the lidar. To improve the data resolution of the lidar, the position of the preset calibration plate can be adjusted within the common detection area of the image acquisition device and the lidar so that the vertical data resolution of the lidar exceeds a set resolution threshold, improving test precision. It is of course also possible to move the assembly comprising the image acquisition device and the lidar to increase the lidar's data resolution, but it should be ensured that at least part of the preset calibration plate remains within the common detection area of both sensors.
And S12, converting the image data and the point cloud data into the same coordinate system according to the initial value of the external parameter and the internal parameter.
The calibration system converts the point cloud data from the coordinate system of the lidar into the coordinate system of the image acquisition device according to the initial values of the external parameters and the internal parameters of the image acquisition device, i.e., the point cloud data is projected onto the image data. It is equally possible to transfer the image data from the coordinate system of the image acquisition device into the coordinate system of the lidar.
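The projection of the point cloud onto the image data in step S12 can be sketched with a standard pinhole model; the argument names (T_cam_lidar, K) are illustrative assumptions, and lens distortion is ignored here for brevity.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project (N, 3) lidar points into pixel coordinates using the current
    extrinsic estimate T_cam_lidar (4x4) and camera intrinsics K (3x3)."""
    homo = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    cam = (T_cam_lidar @ homo.T)[:3]   # points expressed in the camera frame
    in_front = cam[2] > 0              # keep only points ahead of the camera
    pix = K @ cam[:, in_front]
    return (pix[:2] / pix[2]).T        # (u, v) pixel coordinates
```

A point on the optical axis at depth 1 projects exactly onto the principal point, which is a quick sanity check on the intrinsics.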
And S13, adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, wherein the preset registration condition represents that the preset calibration plate image in the image data is registered with the preset calibration plate point cloud in the point cloud data.
The preset registration condition can be set according to the actual situation; it represents that the coordinates of the preset calibration plate image in the image data coincide with the coordinates of the preset calibration plate point cloud in the point cloud data. For example, the calibration system may determine the area of the preset calibration plate in the image data through computer-vision techniques, determine its area in the point cloud data from the position information of each point, use the degree of misalignment between the two areas as a cost function, adjust the values of the external parameters until the misalignment is minimal, and output the external parameter values at that point as the calibration values.
The embodiment of the application thus determines the external parameters of the image acquisition device and the radar, and determines them accurately.
Optionally, referring to fig. 3, in step S12, converting the image data and the point cloud data into a same coordinate system according to the initial value of the external reference and the internal reference, where the converting includes:
and S121, determining a target point cloud corresponding to the preset calibration plate in the point cloud data.
The calibration system can receive a manually entered point cloud selection instruction containing the position information of a manually selected target point cloud, and determine the target point cloud accordingly. The calibration system can also screen out the target point cloud semi-automatically or fully automatically, separating it algorithmically using known prior information.
For example, the distance between the preset calibration plate and the laser radar may be used for determination, and the preset calibration plate is set as a reflection plane closest to the laser radar, so that a point cloud plane closest to the radar is a target point cloud.
For example: the calibration system can acquire the distance from the preset calibration plate to the laser radar, hereinafter referred to as the test distance, and take the points in the point cloud whose depth equals the test distance as the target point cloud. When determining the target point cloud according to the test distance, to improve precision it is necessary to ensure that only the preset calibration plate exists within the test distance of the laser radar. Those skilled in the art will understand that in practice the test distance is not a single value but an interval; for example, if the distance from the calibration board to the lidar is a and 1% is taken as the fault tolerance, the test distance applied in the actual calculation is [0.99a, 1.01a].
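The test-distance screening described above can be sketched as follows (function name and tolerance parameter are illustrative):

```python
import numpy as np

def select_target_points(points, test_distance, tolerance=0.01):
    """Keep points whose range to the lidar lies in the test-distance interval.

    points: (N, 3) array of lidar points in the lidar coordinate frame.
    test_distance: measured distance a from the calibration plate to the lidar.
    tolerance: relative fault tolerance, e.g. 1% gives [0.99a, 1.01a].
    """
    ranges = np.linalg.norm(points, axis=1)
    lo, hi = (1 - tolerance) * test_distance, (1 + tolerance) * test_distance
    return points[(ranges >= lo) & (ranges <= hi)]
```

This assumes no other reflector lies within the same interval, as the text requires.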
The calibration system can also project the point cloud data onto the image data using the external parameter initial values and the camera internal parameters, determine the position of the preset calibration plate in the image data, and then determine, in the point cloud data, the points belonging to the preset calibration plate as the target point cloud according to that position.
And S122, converting the target point cloud into a coordinate system of the image data according to the initial value of the external parameter and the internal parameter.
The step S13 of adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data satisfy the preset registration condition, includes:
and adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud meet a preset registration condition.
In the embodiment of the application, only the target point cloud corresponding to the preset calibration plate is converted into the coordinate system of the image data, so that the computing resource can be saved, and the calibration efficiency can be improved.
Optionally, in the step S121, in the point cloud data, determining a target point cloud corresponding to the preset calibration plate includes:
s1211, determining a target calibration area of the preset calibration plate in the image data.
The preset calibration plate has image characteristics, and the calibration system can determine a target calibration area in the image data through a computer vision technology according to the image characteristics of the preset calibration plate, for example, the target calibration area is determined in the image data through a pre-trained convolutional neural network. Of course, the calibration system may determine the target calibration area by using a computer vision technique according to the characteristics of the preset calibration plate, such as shape and texture.
And S1212, determining a mapping area of the target calibration area in the point cloud data according to the initial value of the external parameter and the internal parameter.
And the calibration system converts the coordinates of the point cloud data into a coordinate system of the image data according to the initial values of the external parameters and the internal parameters of the image acquisition equipment, namely, the point cloud data is projected onto the image data, and a mapping area of a target calibration area in the point cloud data is determined. Of course, the coordinates of the target calibration area may also be converted into the coordinate system of the point cloud data from the coordinates of the image data, so as to determine the mapping area of the target calibration area in the point cloud data.
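A minimal sketch of projecting the point cloud into the coordinate system of the image data, assuming a pinhole camera model with intrinsic matrix K (the names and the model are illustrative assumptions):

```python
import numpy as np

def project_points(points_lidar, R, t, K):
    """Project lidar points into pixel coordinates of the image.

    points_lidar: (N, 3) points in the lidar frame.
    R, t: external parameters (rotation matrix, translation vector) mapping
          lidar coordinates into the camera coordinate system.
    K: 3x3 camera intrinsic matrix (the internal parameters).
    Returns (N, 2) pixel coordinates; assumes all points have positive depth.
    """
    pts_cam = points_lidar @ R.T + t       # transform into the camera frame
    uvw = pts_cam @ K.T                    # apply the intrinsics
    return uvw[:, :2] / uvw[:, 2:3]        # perspective division
```

The inverse mapping (image region into the point cloud frame) follows the same pinhole relations in reverse, given a depth hypothesis.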
And S1213, determining an initial point cloud in the point cloud data according to the mapping area.
And according to the mapping area, point clouds in the area covered by the mapping area are selected from the point cloud data as the initial point cloud. Of course, if the complete point cloud of the preset calibration plate is desired, the mapping region may also be enlarged to N times its original size about its center, where N is a preset empirical value greater than 1.
And S1214, filtering out point clouds with different depths from the preset calibration plate from the initial point cloud based on the depths of all the points to obtain target point cloud.
The calibration system can obtain the distance between the preset calibration plate and the laser radar in advance, so that the depth of the preset calibration plate is obtained.
Optionally, when the preset calibration board is a plane, the step S1214 may be replaced by: performing plane fitting on the initial point cloud and filtering out points that do not fit the plane hypothesis, to obtain a target point cloud containing only the preset calibration plate, which is approximately planar.
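A plain least-squares plane fit for this filtering step is sketched below (a RANSAC fit would be more robust; the threshold and names are illustrative assumptions):

```python
import numpy as np

def fit_plane_and_filter(points, max_dist=0.02):
    """Fit a plane to the initial point cloud and drop points that are
    farther than max_dist from it (points violating the plane hypothesis).

    points: (N, 3) initial point cloud; max_dist: distance threshold in metres.
    """
    centroid = points.mean(axis=0)
    # The plane normal is the right singular vector of the smallest
    # singular value of the centered point matrix.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    dist = np.abs((points - centroid) @ normal)
    return points[dist <= max_dist]
```

In practice the fit would be iterated or robustified, since a single gross outlier can bias a least-squares plane.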
Optionally, the reflection intensity of the area with large gray scale in the preset calibration plate is higher than the reflection intensity of the area with small gray scale, and the method further includes:
and S21, carrying out binarization processing and fuzzy processing on the image data to obtain a target calibration image.
And the calibration system performs binarization processing on the image information of the target position, namely the image information of a preset calibration plate to obtain a binarized image, and performs fuzzy processing on the binarized image, wherein the binarized image after the fuzzy processing is the target calibration image.
For example, a checkerboard pattern is printed on the preset calibration board, and the position of the preset calibration plate in the image data, i.e. the target position, is determined by detecting the checkerboard corner points. Binarization is performed on the image at the target position using an adaptive threshold method. The binarized black-and-white image is then blurred with a mean filter, whose kernel size can be determined from the size of the high-reflection-intensity area, for example half the side length of a checkerboard square. The reflection intensity range is generally 0-255; a preset intensity threshold, such as 125, 126, or 128, can be set, and a point whose reflection intensity is above the threshold is a high-reflection-intensity point. Generally, high-reflection-intensity points lie inside the high-reflection area, but due to the limitations of the lidar detection principle, the reflection intensity of some points just outside the edge of the high-reflection-intensity area is also high, so the set of high-reflection-intensity points and the set of points inside the high-reflection area do not coincide exactly.
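The binarization and blurring steps can be sketched as follows (a global mean threshold stands in for the adaptive threshold, and the kernel size is illustrative):

```python
import numpy as np

def binarize_and_blur(img, kernel=3):
    """Binarize a gray image with a global mean threshold, then apply a
    mean (box) filter, approximating the target-calibration-image step.

    img: 2D uint8 gray image of the calibration-plate region.
    kernel: odd box-filter size; in practice it would be derived from the
    checkerboard square size (e.g. half a side length, in pixels).
    """
    binary = np.where(img > img.mean(), 255.0, 0.0)
    pad = kernel // 2
    padded = np.pad(binary, pad, mode="edge")
    out = np.zeros_like(binary)
    # Accumulate the kernel x kernel neighbourhood via shifted slices.
    for dy in range(kernel):
        for dx in range(kernel):
            out += padded[dy:dy + binary.shape[0], dx:dx + binary.shape[1]]
    return out / (kernel * kernel)
```

The blurring gives the later cost function a smooth gradient around the black/white edges, which helps the nonlinear optimization converge.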
And S22, clustering each point in the target point cloud through a preset clustering method according to the reflection intensity to obtain a high-reflection-intensity point cloud and/or a low-reflection-intensity point cloud, wherein the high-reflection-intensity point cloud is formed by points with reflection intensities larger than a preset intensity threshold, and the low-reflection-intensity point cloud is formed by points with reflection intensities lower than the preset intensity threshold.
The preset intensity threshold may be set according to actual conditions, for example 128 or 130. The preset clustering method may be any related clustering method, such as the k-means clustering algorithm, mean-shift clustering, density-based spatial clustering of applications with noise (DBSCAN), or expectation-maximization clustering with a Gaussian mixture model. The calibration system can arrange the point clouds of the laser radar by scanning line, and divide the points of each scanning line into high-reflection-intensity and low-reflection-intensity point clouds using the preset clustering method, to obtain the high-reflection-intensity point cloud and/or the low-reflection-intensity point cloud.
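A minimal 1-D k-means split (k = 2) of one scan line's reflection intensities might look like this (the initialization and iteration count are assumptions):

```python
import numpy as np

def split_by_intensity(intensities, iters=20):
    """Split the reflection intensities of one scan line into low/high
    clusters with a simple 1-D k-means (k = 2).

    intensities: 1D float array of per-point reflection intensities.
    Returns a boolean mask: True = high-reflection-intensity point.
    """
    c_lo, c_hi = intensities.min(), intensities.max()  # centroid init
    for _ in range(iters):
        high = np.abs(intensities - c_hi) < np.abs(intensities - c_lo)
        if high.all() or (~high).all():
            break  # degenerate line: one cluster only
        c_lo, c_hi = intensities[~high].mean(), intensities[high].mean()
    return high
```

Running this per scan line matches the per-line clustering described above.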
Correspondingly, in S122, the target point cloud is converted into the coordinate system of the image data according to the initial value of the external parameter and the internal parameter, and the target point cloud may be replaced with the clustered high reflection intensity point cloud and/or the clustered low reflection intensity point cloud. Of course, in S122, the high reflection intensity point cloud and/or the low reflection intensity point cloud may be converted into the coordinate system of the image data according to the initial value of the external parameter and the internal parameter.
The step S13 of adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud satisfy a preset registration condition, includes:
and taking the average gray value of the projection of the target point cloud on the target calibration image as the optimization target, taking the value of the external parameter as the optimization variable, and determining the optimal value of the optimization variable by a preset nonlinear optimization method to obtain the calibration value of the external parameter.
Projecting the target point cloud onto image data, taking the gray average value of a pixel where the target point cloud is located in the projected image as an optimization target, taking the value of the external parameter as an optimization variable, and determining the optimal value of the optimization variable through a preset nonlinear optimization method to obtain the calibration value of the external parameter.
For example: the preset calibration board has a checkerboard pattern in which the low-gray (near-black) portion is made of retro-reflective material, so the reflection intensity of the points the lidar obtains from this material is high. Therefore, the area corresponding to the high-reflection-intensity target point cloud is a low-gray-value area (near-black area) on the target calibration image. Only the high-reflection-intensity target point cloud may be projected onto the target calibration image; when the external parameter values are accurate, the high-reflection-intensity points should project into the lower-gray-value areas of the target calibration image. An optimization target/cost function is established using this correspondence. Because the number of points on the preset calibration plate and on each laser scanning line may differ, the point counts are normalized in the cost function.
$$\min_{R,t}\;\sum_{i=0}^{f}\sum_{j=0}^{n(i)}\sum_{k=0}^{m(i,j)} W_{ij}\,\mathrm{gray}\big(\mathrm{Proj}(R\,P_{ijk}+t),\,i\big),\qquad W_{ij}=\frac{1}{\big(n(i)+1\big)\big(m(i,j)+1\big)}$$
where R and t respectively represent the rotation matrix and the translation vector in the external parameters. P_{ijk} represents the k-th high-reflection-intensity radar point in the j-th high-reflection-intensity radar scanning line of the i-th frame of radar data. There are f+1 frames of high-reflection-intensity lidar data; the i-th frame contains n(i)+1 high-reflection-intensity lidar scanning lines; and the j-th high-reflection-intensity scanning line of the i-th frame contains m(i,j)+1 high-reflection-intensity lidar points. Proj(X) is a projection function that projects a three-dimensional point X in the coordinate system of the image acquisition device onto the two-dimensional image plane. gray(X, i) returns the gray value at coordinate X on the i-th binarized and blurred gray image, obtained by bicubic interpolation of the gray image. W_{ij} is the weight coefficient of the j-th radar scanning line of the i-th frame, computed as the reciprocal of the product of the number of scanning lines in the i-th frame and the number of points on the j-th line.
And optimizing the cost function by using a nonlinear optimization method so as to obtain an optimal value, and taking the optimal value as a calibration value of the external parameter.
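A sketch of evaluating the normalized cost function described above (nearest-neighbour sampling replaces the bicubic interpolation, and all names are illustrative; an off-the-shelf nonlinear optimizer would then minimize this value over R and t):

```python
import numpy as np

def cost(R, t, K, scan_lines, gray_img):
    """Mean gray value at the projections of the high-reflection points,
    normalized per scan line as in the patent's cost function.

    scan_lines: list of (M, 3) arrays, one per high-reflection scan line
    of one frame; gray_img: the binarized and blurred target calibration image.
    """
    total = 0.0
    for line in scan_lines:
        pts_cam = line @ R.T + t                     # lidar -> camera frame
        uvw = pts_cam @ K.T                          # apply intrinsics
        uv = np.rint(uvw[:, :2] / uvw[:, 2:3]).astype(int)
        u = np.clip(uv[:, 0], 0, gray_img.shape[1] - 1)
        v = np.clip(uv[:, 1], 0, gray_img.shape[0] - 1)
        w = 1.0 / (len(scan_lines) * len(line))      # W_ij: reciprocal product
        total += w * gray_img[v, u].sum()
    return total
```

Accurate external parameters drive the high-reflection points onto near-black pixels, so the correct (R, t) minimizes this value.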
In the embodiment of the application, the calibration value of the external parameter is determined by establishing a data association between the reflection intensity information in the point cloud data and the gray-scale information in the image data. The calibration plate does not need to be assumed to be a strictly planar surface, no error is introduced from extracting the calibration plate pose from the image, and the accuracy of the determined external parameter calibration value is high.
Optionally, before the binarizing and blurring processing are performed on the image data to obtain the target calibration image, the method further includes:
and performing distortion removal processing on the image data according to the internal reference.
The calibration system performs distortion removal on the image data according to the internal parameters of the image acquisition device, so that the resulting image data are more accurate.
The embodiment of the present application further provides an external reference determining apparatus for an image capturing device and a radar, as shown in fig. 4, including: an image processing module 401, a laser point cloud processing module 402 and a nonlinear optimization module 403. As shown in fig. 5, the input data of the image processing module 401 are image data containing the preset calibration board and the internal parameters of the image capturing device; the output data of the image processing module 401 are the binarized and blurred image (the target calibration image) and the coordinates of the area where the preset calibration plate is located. The input data of the laser point cloud processing module 402 are the internal parameters of the image acquisition device, the point cloud data of the laser radar, the initial values of the external parameters of the image acquisition device and the laser radar, and the coordinates of the area where the preset calibration plate is located, output by the image processing module 401; the output data of the laser point cloud processing module 402 are the high-reflection-intensity point cloud and/or the low-reflection-intensity point cloud on the preset calibration plate. The input data of the nonlinear optimization module 403 are the internal parameters of the image acquisition device, the target calibration image output by the image processing module 401, the high-reflection-intensity point cloud and/or low-reflection-intensity point cloud on the preset calibration plate output by the laser point cloud processing module 402, and the initial values of the external parameters of the image acquisition device and the laser radar; the output data of the nonlinear optimization module 403 are the calibration values of the external parameters.
The image processing module 401 is configured to determine an area where a preset calibration plate is located in image data, and perform binarization and blurring processing on the area where the preset calibration plate is located to obtain a target calibration image.
The image processing module 401 may detect the feature marks on the preset calibration board through computer vision techniques to obtain the area of the preset calibration board in the image data, and output the position (target position) of that area. It then performs binarization processing on the image of the area where the preset calibration plate is located, and blurs the binarized black-and-white image to obtain the target calibration image.
For example, as shown in fig. 6, a checkerboard pattern is printed on the preset calibration board; the image processing module 401 performs distortion removal on the image data using the camera internal parameters and determines the image area where the calibration board is located by detecting the checkerboard corner points on the calibration board. Binarization is performed on the image of the board region using an adaptive threshold method. The black-and-white image of the region is then blurred with a mean filter, whose kernel size is determined from the size of the high-reflection-intensity area, for example half the side length of a checkerboard square.
A laser point cloud processing module 402 configured to project point cloud data onto the image data according to an initial value of an external parameter and an internal parameter; in the projected point cloud data, taking the point cloud at the target position as the initial point cloud reflected by a preset calibration plate; filtering out point clouds with different depths from the preset calibration plate depth from the initial point cloud based on the depth of each point cloud in the reflection intensity point cloud to obtain filtered point clouds; and according to the reflection intensity, determining a high reflection intensity point cloud and a low reflection intensity point cloud in the filtered point cloud by a preset clustering method, and taking the high reflection intensity point cloud and/or the low reflection intensity point cloud as a target point cloud.
The laser point cloud processing module 402 projects the reflection intensity point cloud of the laser radar onto the target calibration image according to the initial value of the external parameter and the internal parameter of the image acquisition device. According to the region position (target position) of the calibration plate output by the image processing module 401, the reflection intensity point cloud projected in the preset calibration plate region is selected. Due to the fact that parallax exists between the image acquisition equipment and the laser radar (when the two sensors are installed, centers cannot be completely overlapped, and the two sensors have difference in visual angles), laser radar point clouds on other non-preset calibration plates may exist in the visual angle area of the preset calibration plate on the image data. And filtering the laser radar point clouds based on the depth to obtain the laser radar point clouds on a preset calibration plate. And classifying points on each radar scanning line on the calibration plate according to the reflection intensity, and dividing the laser radar point cloud of each scanning line into two types of points with high and low reflection intensities.
For example, as shown in fig. 7, the point cloud data of the laser radar is projected onto the image data by the external parameter initial value and the internal parameter. And selecting the laser radar point cloud projected in the area of the calibration plate according to the area data of the calibration plate output by the image processing module 401. In order to filter out additional lidar point clouds caused by parallax in an area, all point clouds in the area are clustered according to distance (point clouds which are close to each other are aggregated into a category), and the point clouds with the largest number and the closest to a radar (generally, no other object exists between a calibration plate and the lidar) are selected as the lidar point clouds on the calibration plate. Arranging the laser radar point clouds according to the scanning lines, dividing the points of each scanning line into two types of point clouds with high reflection intensity and low reflection intensity in a k-means clustering mode, and outputting the point clouds with high reflection intensity and/or the point clouds with low reflection intensity to obtain target point clouds.
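The gap-based range clustering used to discard the extra parallax points can be sketched as follows (this version keeps only the nearest-range cluster; combining cluster size with nearness, as the text describes, is omitted for brevity, and the gap threshold is an assumption):

```python
import numpy as np

def nearest_cluster(points, gap=0.3):
    """Group points by range with a simple gap-based 1-D clustering, then
    return the cluster nearest to the lidar (taken to be the calibration
    plate, assuming no other object stands between the plate and the lidar).

    points: (N, 3) lidar points that project inside the calibration-plate
    image region; gap: range jump (metres) that starts a new cluster.
    """
    ranges = np.linalg.norm(points, axis=1)
    order = np.argsort(ranges)
    clusters, current = [], [order[0]]
    for prev, idx in zip(order[:-1], order[1:]):
        if ranges[idx] - ranges[prev] > gap:
            clusters.append(current)
            current = []
        current.append(idx)
    clusters.append(current)
    return points[np.array(clusters[0])]  # nearest-range cluster
```

Background points leaking into the plate's image region through parallax sit at a clearly different range, so they fall into a separate cluster and are discarded.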
And a nonlinear optimization module 403, configured to convert the target point cloud and the target calibration image into the same coordinate system according to the initial values of the external parameters and the internal parameters, take the average gray value of the projection of the target point cloud on the target calibration image as the optimization target, take the values of the external parameters as the optimization variables, and determine the optimal values of the optimization variables by a preset nonlinear optimization method to obtain the calibration values of the external parameters.
For example, as shown in fig. 8, the nonlinear optimization module 403 projects the high-reflection-intensity point cloud and/or low-reflection-intensity point cloud in the target point cloud output by the laser point cloud processing module 402 onto the binarized and blurred image output by the image processing module according to the initial values of the external parameters and the internal parameters, takes the average gray value of the projection as the optimization target, takes the external parameters as the optimization variables, and obtains the calibration values of the external parameters after performing nonlinear optimization.
For example: the preset calibration plate is printed with a checkerboard pattern, so that the area corresponding to the high-reflection-intensity points in the target point cloud is a low-gray-value area (near-black area) on the target calibration image. The high-reflection-intensity point cloud can be projected onto the target calibration image, and when the external parameter values are accurate, the high-reflection-intensity points should project into the lower-gray-value areas of the target calibration area. Using the constraint that the high-reflection-intensity points should project near the black checkerboard squares, an optimization objective/cost function is established. Since the number of points on the preset calibration plate and on each laser scanning line may differ, the point counts are normalized in the cost function.
$$\min_{R,t}\;\sum_{i=0}^{f}\sum_{j=0}^{n(i)}\sum_{k=0}^{m(i,j)} W_{ij}\,\mathrm{gray}\big(\mathrm{Proj}(R\,P_{ijk}+t),\,i\big),\qquad W_{ij}=\frac{1}{\big(n(i)+1\big)\big(m(i,j)+1\big)}$$
where R and t respectively represent the rotation matrix and the translation vector in the external parameters. P_{ijk} represents the k-th high-reflection-intensity radar point in the j-th high-reflection-intensity radar scanning line of the i-th frame of radar data. There are f+1 frames of high-reflection-intensity lidar data; the i-th frame contains n(i)+1 high-reflection-intensity lidar scanning lines; and the j-th scanning line of the i-th frame contains m(i,j)+1 high-reflection-intensity lidar points. Proj(X) is a projection function that projects a three-dimensional point X in the coordinate system of the image acquisition device onto the two-dimensional image plane. gray(X, i) returns the image gray value at coordinate X on the i-th binarized and blurred gray image, obtained by bicubic interpolation of the gray image. W_{ij} is the weight coefficient of the j-th radar scanning line of the i-th frame, computed as the reciprocal of the product of the number of scanning lines in the i-th frame and the number of points on the j-th line.
And optimizing the cost function by using a nonlinear optimization method so as to obtain an optimal value, and taking the optimal value as a calibration value of the external parameter.
In the embodiment of the application, the external parameters of the image acquisition device and the laser radar are determined by establishing a data association between the reflection intensity information in the reflection intensity point cloud and the gray-scale information in the image data. The calibration plate does not need to be assumed to be a strictly planar surface, no error is introduced from extracting the calibration plate pose from the image, and the accuracy of the determined external parameter calibration values is high.
The embodiment of the present application provides an external reference determining apparatus for an image acquisition device and a radar, referring to fig. 9, the apparatus includes:
a parameter obtaining module 901, configured to obtain an initial value of external parameters of an image acquisition device and a laser radar, internal parameters of the image acquisition device, image data acquired by the image acquisition device, and point cloud data acquired by the laser radar, where a preset calibration plate is disposed in a common detection area of the image acquisition device and the laser radar, and the preset calibration plate has an image feature and a reflection feature;
a coordinate conversion module 902, configured to convert the image data and the point cloud data into a same coordinate system according to the initial values of the external parameters and the internal parameters;
a calibration value output module 903, configured to adjust the value of the external parameter, and output the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, where the preset registration condition represents that the preset calibration plate image in the image data and the preset calibration plate point cloud in the point cloud data are registered.
Optionally, the coordinate transformation module 902 includes:
the target point cloud determining submodule is used for determining a target point cloud corresponding to the preset calibration plate in the point cloud data;
the target point cloud mapping submodule is used for converting the target point cloud into a coordinate system of the image data according to the initial value of the external parameter and the internal parameter;
the calibration value output module 903 is specifically configured to:
and adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud meet a preset registration condition.
Optionally, the target point cloud determining sub-module includes:
a calibration area determining unit, configured to determine a target calibration area of the preset calibration plate in the image data;
a mapping area determining unit, configured to determine a mapping area of the target calibration area in the point cloud data according to the initial value of the external parameter and the internal parameter;
an initial point cloud determining unit, configured to determine an initial point cloud in the point cloud data according to the mapping region;
and the point cloud filtering unit is used for filtering point clouds with different depths from the preset calibration plate in the initial point cloud based on the depths of all points to obtain a target point cloud.
Optionally, the reflection intensity of the area with large gray scale in the preset calibration board is higher than the reflection intensity of the area with small gray scale, and the external reference determining device for the image acquisition device and the radar in the embodiment of the application further includes:
the image processing module is used for carrying out binarization processing and fuzzy processing on the image data to obtain a target calibration image;
the point cloud clustering module is used for clustering each point in the target point cloud by a preset clustering method according to the reflection intensity to obtain a high-reflection-intensity point cloud and/or a low-reflection-intensity point cloud, wherein the high-reflection-intensity point cloud is formed by points with reflection intensity larger than a preset intensity threshold, and the low-reflection-intensity point cloud is formed by points with reflection intensity lower than the preset intensity threshold;
the calibration value output module 903 is specifically configured to:
and taking the average gray value of the projection of the target point cloud on the target calibration image as the optimization target, taking the value of the external parameter as the optimization variable, and determining the optimal value of the optimization variable by a preset nonlinear optimization method to obtain the calibration value of the external parameter.
Optionally, the external reference determining apparatus for image acquisition device and radar in the embodiment of the present application further includes:
and the distortion removing module is used for carrying out distortion removing processing on the image data according to the internal parameters.
The embodiment of the present application further provides an electronic device, as shown in fig. 10, which includes a processor 1001, a communication interface 1002, a memory 1003 and a communication bus 1004, wherein the processor 1001, the communication interface 1002 and the memory 1003 complete mutual communication through the communication bus 1004,
a memory 1003 for storing a computer program;
the processor 1001 is configured to implement the following steps when executing the program stored in the memory 1003:
acquiring initial values of external parameters of image acquisition equipment and a laser radar, internal parameters of the image acquisition equipment, image data acquired by the image acquisition equipment and point cloud data acquired by the laser radar, wherein a preset calibration plate is arranged in a common detection area of the image acquisition equipment and the laser radar and has image characteristics and reflection characteristics;
converting the image data and the point cloud data into the same coordinate system according to the initial value of the external parameter and the internal parameter;
and adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, wherein the preset registration condition represents that a preset calibration plate image in the image data is registered with a preset calibration plate point cloud in the point cloud data.
In the embodiment of the application, the external parameters of the image acquisition device and the radar are determined by registering the image data with the point cloud data in the same coordinate system, so the external parameters can be determined accurately.
Optionally, the processor 1001 is configured to implement any one of the above-described external parameter determining methods for the image capturing device and the radar when executing the program stored in the memory 1003.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
An embodiment of the present application provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the computer program implements the following steps:
acquiring initial values of external parameters of image acquisition equipment and a laser radar, internal parameters of the image acquisition equipment, image data acquired by the image acquisition equipment and point cloud data acquired by the laser radar, wherein a preset calibration plate is arranged in a common detection area of the image acquisition equipment and the laser radar and has image characteristics and reflection characteristics;
converting the image data and the point cloud data into the same coordinate system according to the initial value of the external parameter and the internal parameter;
and adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, wherein the preset registration condition represents that a preset calibration plate image in the image data is registered with a preset calibration plate point cloud in the point cloud data.
Optionally, the computer program, when executed by the processor, may further implement any one of the above-described external parameter determining methods for the image capturing device and the radar.
It is noted that, herein, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the embodiments of the apparatus, the electronic device, and the storage medium, since they are substantially similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
The above description is only for the preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (11)

1. An image acquisition device and radar external parameter determination method, the method comprising:
acquiring initial values of external parameters of image acquisition equipment and a laser radar, internal parameters of the image acquisition equipment, image data acquired by the image acquisition equipment and point cloud data acquired by the laser radar, wherein a preset calibration plate is arranged in a common detection area of the image acquisition equipment and the laser radar and has image characteristics and reflection characteristics;
converting the image data and the point cloud data to the same coordinate system according to the initial values of the external parameters and the internal parameters;
adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, wherein the preset registration condition represents that a preset calibration plate image in the image data and a preset calibration plate point cloud in the point cloud data are registered;
the step of converting the image data and the point cloud data into the same coordinate system according to the initial value of the external parameter and the internal parameter comprises the following steps:
converting the point cloud data from the coordinate system of the laser radar to the coordinate system of the image acquisition equipment according to the initial value of the external parameter and the internal parameter; or
converting the image data from the coordinate system of the image acquisition equipment into the coordinate system of the laser radar.
2. The method of claim 1, wherein converting the image data and the point cloud data to a same coordinate system according to the initial values of the external parameters and the internal parameters comprises:
determining a target point cloud corresponding to the preset calibration plate in the point cloud data;
converting the target point cloud into a coordinate system of the image data according to the initial value of the external parameter and the internal parameter;
the adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, comprises:
adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud meet a preset registration condition.
3. The method according to claim 2, wherein the determining the target point cloud corresponding to the preset calibration plate in the point cloud data comprises:
determining a target calibration area of the preset calibration plate in the image data;
determining a mapping area of the target calibration area in the point cloud data according to the initial value of the external parameter and the internal parameter;
determining an initial point cloud in the point cloud data according to the mapping area;
and filtering out, from the initial point cloud based on the depth of each point, points whose depth differs from that of the preset calibration plate, to obtain a target point cloud.
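The depth-based filtering step of claim 3 can be sketched as follows. The claim does not specify how the plate's depth or the cut-off is chosen, so `plate_depth` and `tol` here are illustrative assumptions:

```python
import numpy as np

def filter_by_depth(points, plate_depth, tol=0.1):
    """Keep only points whose depth lies within tol of the calibration
    plate's depth; everything else is treated as background clutter."""
    keep = np.abs(points[:, 2] - plate_depth) <= tol
    return points[keep]

# two points near the assumed 5 m plate, one background point at 8 m
pts = np.array([[0.0, 0.0, 5.0], [0.2, 0.1, 5.05], [1.0, 1.0, 8.0]])
target = filter_by_depth(pts, plate_depth=5.0)   # drops the 8 m outlier
```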
4. The method of claim 2, wherein the reflection intensity of the preset calibration plate in regions with larger gray-scale values is higher than that in regions with smaller gray-scale values, and the method further comprises:
performing binarization processing and blurring processing on the image data to obtain a target calibration image;
clustering each point in the target point cloud by a preset clustering method according to the reflection intensity to obtain a high-reflection-intensity point cloud and/or a low-reflection-intensity point cloud, wherein the high-reflection-intensity point cloud is formed by points with reflection intensities larger than a preset intensity threshold value, and the low-reflection-intensity point cloud is formed by points with reflection intensities lower than the preset intensity threshold value;
the adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud meet a preset registration condition, comprises:
taking, as an optimization objective, the average gray level of the projection of the target point cloud onto the target calibration image, taking the value of the external parameter as an optimization variable, and determining the optimal value of the optimization variable by a preset nonlinear optimization method, to obtain the calibration value of the external parameter.
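A rough sketch of claim 4's two ingredients: splitting the target point cloud by reflection intensity, and evaluating the gray-level objective by sampling the (binarized, blurred) target calibration image at the projected pixel positions. The mean-based threshold stands in for the unspecified "preset clustering method", and every array here is illustrative:

```python
import numpy as np

def split_by_intensity(points, intensities, threshold=None):
    """Split points into high- and low-reflection-intensity point clouds."""
    if threshold is None:
        threshold = intensities.mean()   # stand-in for the preset threshold
    return points[intensities > threshold], points[intensities <= threshold]

def mean_gray_at_projection(gray_image, uv):
    """Mean gray level of the calibration image at the projected pixels."""
    h, w = gray_image.shape
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    return float(gray_image[v, u].mean())

pts = np.array([[0.0, 0.0, 5.0], [0.1, 0.0, 5.0], [0.2, 0.0, 5.0]])
inten = np.array([200.0, 190.0, 10.0])
high, low = split_by_intensity(pts, inten)       # 2 bright points, 1 dark

img = np.zeros((4, 4))
img[1, 2] = 255.0                                # one bright pixel
score = mean_gray_at_projection(img, np.array([[2.0, 1.0], [0.0, 0.0]]))
```

When the extrinsic values are correct, high-intensity points land on bright image regions, so the mean gray level at their projections is maximized.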
5. The method according to claim 4, wherein before the binarizing and blurring processing are performed on the image data to obtain the target calibration image, the method further comprises:
and according to the internal reference, carrying out distortion removal processing on the image data.
6. An external parameter determining apparatus for an image capturing device and a radar, the apparatus comprising:
the parameter acquisition module is configured to acquire initial values of external parameters of image acquisition equipment and a laser radar, internal parameters of the image acquisition equipment, image data acquired by the image acquisition equipment, and point cloud data acquired by the laser radar, wherein a preset calibration plate is arranged in a common detection area of the image acquisition equipment and the laser radar, and the preset calibration plate has image characteristics and reflection characteristics;
the coordinate conversion module is used for converting the image data and the point cloud data into the same coordinate system according to the initial values of the external parameters and the internal parameters;
the calibration value output module is used for adjusting the value of the external parameter and outputting the calibration value of the external parameter when the image data and the point cloud data meet a preset registration condition, wherein the preset registration condition represents that a preset calibration plate image in the image data is registered with a preset calibration plate point cloud in the point cloud data;
the coordinate conversion module is specifically used for converting the point cloud data from a coordinate system of the laser radar to a coordinate system of the image acquisition equipment according to the initial value of the external parameter and the internal parameter; or converting the image data from the coordinate system of the image acquisition equipment into the coordinate system of the laser radar.
7. The apparatus of claim 6, wherein the coordinate transformation module comprises:
the target point cloud determining submodule is used for determining a target point cloud corresponding to the preset calibration plate in the point cloud data;
the target point cloud mapping submodule is used for converting the target point cloud into a coordinate system of the image data according to the initial value of the external parameter and the internal parameter;
the calibration value output module is specifically configured to:
adjusting the value of the external parameter, and outputting the calibration value of the external parameter when the image data and the target point cloud meet a preset registration condition.
8. The apparatus of claim 7, wherein the target point cloud determination sub-module comprises:
a calibration area determining unit, configured to determine a target calibration area of the preset calibration plate in the image data;
a mapping area determining unit, configured to determine a mapping area of the target calibration area in the point cloud data according to the initial value of the external parameter and the internal parameter;
an initial point cloud determining unit, configured to determine an initial point cloud in the point cloud data according to the mapping region;
and a point cloud filtering unit, configured to filter out, from the initial point cloud based on the depth of each point, points whose depth differs from that of the preset calibration plate, to obtain a target point cloud.
9. The apparatus of claim 7, wherein the reflection intensity of the preset calibration plate in regions with larger gray-scale values is higher than that in regions with smaller gray-scale values, the apparatus further comprising:
the image processing module is used for performing binarization processing and blurring processing on the image data to obtain a target calibration image;
the point cloud clustering module is used for clustering each point in the target point cloud by a preset clustering method according to the reflection intensity to obtain a high-reflection-intensity point cloud and/or a low-reflection-intensity point cloud, wherein the high-reflection-intensity point cloud is formed by points with reflection intensity larger than a preset intensity threshold, and the low-reflection-intensity point cloud is formed by points with reflection intensity lower than the preset intensity threshold;
the calibration value output module is specifically configured to:
taking, as an optimization objective, the average gray level of the projection of the target point cloud onto the target calibration image, taking the value of the external parameter as an optimization variable, and determining the optimal value of the optimization variable by a preset nonlinear optimization method, to obtain the calibration value of the external parameter.
10. The apparatus of claim 9, further comprising:
and the distortion removing module is used for carrying out distortion removing processing on the image data according to the internal parameters.
11. An external parameter determining apparatus for an image capturing device and a radar, the apparatus comprising:
an image processing module, configured to determine the area where the preset calibration plate is located in the image data of the preset calibration plate, and to perform binarization processing and blurring processing on the image of the area where the preset calibration plate is located, to obtain a target calibration area image;
a laser point cloud processing module, configured to: project a reflection intensity point cloud onto the image data according to the initial value of the external parameter and the internal parameter; take, in the projected reflection intensity point cloud, the point cloud at the position of the target calibration area as the initial point cloud reflected by the preset calibration plate; filter out, from the initial point cloud based on the depth of each point in the reflection intensity point cloud, points whose depth differs from that of the preset calibration plate, to obtain a filtered point cloud; and determine, according to the reflection intensity and by a preset clustering method, a high-reflection-intensity point cloud and a low-reflection-intensity point cloud in the filtered point cloud, and take the high-reflection-intensity point cloud and/or the low-reflection-intensity point cloud as the target point cloud;
a nonlinear optimization module, configured to convert the target point cloud and the target calibration area into the same coordinate system according to the initial values of the external parameters and the internal parameters, take, as an optimization objective, the average gray level of the projection of the target point cloud onto the target calibration area image, take the values of the external parameters as optimization variables, and determine the optimal values of the optimization variables by a preset nonlinear optimization method, to obtain the calibration values of the external parameters;
the step of converting the target point cloud and the target calibration area to the same coordinate system according to the initial value of the external parameter and the internal parameter comprises the following steps:
converting the target point cloud from the coordinate system of the laser radar to the coordinate system of the image acquisition equipment according to the initial value of the external parameter and the internal parameter; or
converting the target calibration area from the coordinate system of the image acquisition equipment to the coordinate system of the laser radar.
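Claim 11's "preset nonlinear optimization method" is left unspecified. As a hedged illustration only, a one-dimensional search over a single extrinsic component against an arbitrary registration objective could look like this (a real implementation would typically optimize all six pose parameters jointly, e.g. with a gradient-based solver):

```python
import numpy as np

def refine_tx(objective, t, candidates):
    """Pick the x-translation offset that maximizes the registration objective.
    A deliberately tiny stand-in for the unspecified nonlinear optimizer."""
    best = max(candidates, key=lambda dx: objective(t + np.array([dx, 0.0, 0.0])))
    return t + np.array([best, 0.0, 0.0])

# toy objective peaking at tx = 0.3 (purely illustrative)
obj = lambda t: -abs(t[0] - 0.3)
t0 = np.zeros(3)
t_cal = refine_tx(obj, t0, candidates=[-0.1, 0.0, 0.1, 0.2, 0.3])
```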
CN201811504941.3A 2018-12-10 2018-12-10 External parameter determining method and device for image acquisition equipment and radar Active CN111308448B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811504941.3A CN111308448B (en) 2018-12-10 2018-12-10 External parameter determining method and device for image acquisition equipment and radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811504941.3A CN111308448B (en) 2018-12-10 2018-12-10 External parameter determining method and device for image acquisition equipment and radar

Publications (2)

Publication Number Publication Date
CN111308448A CN111308448A (en) 2020-06-19
CN111308448B true CN111308448B (en) 2022-12-06

Family

ID=71154261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811504941.3A Active CN111308448B (en) 2018-12-10 2018-12-10 External parameter determining method and device for image acquisition equipment and radar

Country Status (1)

Country Link
CN (1) CN111308448B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111965624B (en) * 2020-08-06 2024-04-09 阿波罗智联(北京)科技有限公司 Laser radar and camera calibration method, device, equipment and readable storage medium
CN111965625B (en) * 2020-08-11 2023-02-21 上海禾赛科技有限公司 Correction method and device for laser radar and environment sensing system
CN112381873A (en) * 2020-10-23 2021-02-19 北京亮道智能汽车技术有限公司 Data labeling method and device
CN112446928B (en) * 2021-02-01 2021-04-23 南京爱奇艺智能科技有限公司 External parameter determining system and method for shooting device
CN112837383B (en) * 2021-03-01 2022-02-22 东南大学 Camera and laser radar recalibration method and device and computer readable storage medium
CN113269857A (en) * 2021-05-28 2021-08-17 东软睿驰汽车技术(沈阳)有限公司 Coordinate system relation obtaining method and device
CN113658270B (en) * 2021-08-10 2023-09-29 湖南视比特机器人有限公司 Method, device, medium and system for multi-vision calibration based on workpiece hole center
CN113884278B (en) * 2021-09-16 2023-10-27 杭州海康机器人股份有限公司 System calibration method and device for line laser equipment
CN115471574B (en) * 2022-11-02 2023-02-03 北京闪马智建科技有限公司 External parameter determination method and device, storage medium and electronic device
CN116577796B (en) * 2022-11-17 2024-03-19 昆易电子科技(上海)有限公司 Verification method and device for alignment parameters, storage medium and electronic equipment
CN115953484B (en) * 2023-03-13 2023-07-04 福思(杭州)智能科技有限公司 Parameter calibration method and device of detection equipment, storage medium and electronic device
CN116168090B (en) * 2023-04-24 2023-08-22 南京芯驰半导体科技有限公司 Equipment parameter calibration method and device
CN116758170B (en) * 2023-08-15 2023-12-22 北京市农林科学院智能装备技术研究中心 Multi-camera rapid calibration method for livestock and poultry phenotype 3D reconstruction and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102034238A (en) * 2010-12-13 2011-04-27 西安交通大学 Multi-camera system calibrating method based on optical imaging test head and visual graph structure
CN103606139A (en) * 2013-09-09 2014-02-26 上海大学 Sonar image splicing method
WO2016185637A1 (en) * 2015-05-20 2016-11-24 三菱電機株式会社 Point-cloud-image generation device and display system
CN107255821A (en) * 2017-06-07 2017-10-17 旗瀚科技有限公司 A kind of method for splicing simulated laser radar data based on multiple depth cameras
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 The determination method, apparatus and computer-readable recording medium of calibrating parameters
CN108020825A (en) * 2016-11-03 2018-05-11 岭纬公司 A kind of fusion calibration system and method for a laser radar, a laser camera, and a video camera

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313B (en) * 2010-07-14 2011-12-21 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
EP2914975B1 (en) * 2012-11-05 2019-03-06 The Chancellor, Masters and Scholars of The University of Oxford Extrinsic calibration of imaging sensing devices and 2d lidars mounted on transportable apparatus
EP2728376A1 (en) * 2012-11-05 2014-05-07 The Chancellor, Masters and Scholars of the University of Oxford Extrinsic calibration of imaging sensing devices and 2D LIDARs mounted on transportable apparatus
CN103837869B (en) * 2014-02-26 2016-06-01 北京工业大学 Based on single line laser radar and the CCD camera scaling method of vector relations
CN104778694B (en) * 2015-04-10 2017-11-14 北京航空航天大学 A kind of parametrization automatic geometric correction method shown towards multi-projection system
CN107953827A (en) * 2016-10-18 2018-04-24 杭州海康威视数字技术股份有限公司 A kind of vehicle blind zone method for early warning and device
CN107977924A (en) * 2016-10-21 2018-05-01 杭州海康威视数字技术股份有限公司 A kind of image processing method based on dual sensor imaging, system
CN107544095B (en) * 2017-07-28 2019-03-08 河南工程学院 A kind of method that Three Dimensional Ground laser point cloud is merged with ground penetrating radar image
CN108257173A (en) * 2017-12-29 2018-07-06 上海物景智能科技有限公司 Object separation method and apparatus and system in a kind of image information
CN108399643A (en) * 2018-03-15 2018-08-14 南京大学 A kind of extrinsic parameter calibration system and method between a laser radar and a camera

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102034238A (en) * 2010-12-13 2011-04-27 西安交通大学 Multi-camera system calibrating method based on optical imaging test head and visual graph structure
CN103606139A (en) * 2013-09-09 2014-02-26 上海大学 Sonar image splicing method
WO2016185637A1 (en) * 2015-05-20 2016-11-24 三菱電機株式会社 Point-cloud-image generation device and display system
CN108020825A (en) * 2016-11-03 2018-05-11 岭纬公司 A kind of fusion calibration system and method for a laser radar, a laser camera, and a video camera
CN107255821A (en) * 2017-06-07 2017-10-17 旗瀚科技有限公司 A kind of method for splicing simulated laser radar data based on multiple depth cameras
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 The determination method, apparatus and computer-readable recording medium of calibrating parameters

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An adaptive joint calibration algorithm for camera and laser radar; Yao Wentao et al.; Control Engineering of China (《控制工程》); 2017-11-20; 77-81 *
A method for extrinsic calibration of a pinhole camera and a 3D laser radar; Han Zhengyong et al.; Transducer and Microsystem Technologies (《传感器与微系统》); 2018-04-19; Vol. 37, No. 4; 9-12+16 *

Also Published As

Publication number Publication date
CN111308448A (en) 2020-06-19

Similar Documents

Publication Publication Date Title
CN111308448B (en) External parameter determining method and device for image acquisition equipment and radar
CN109146929B (en) Object identification and registration method based on event-triggered camera and three-dimensional laser radar fusion system
CN111429533B (en) Camera lens distortion parameter estimation device and method
CN112927306B (en) Calibration method and device of shooting device and terminal equipment
CN114037992A (en) Instrument reading identification method and device, electronic equipment and storage medium
CN112200848A (en) Depth camera vision enhancement method and system under low-illumination weak-contrast complex environment
CN116228780A (en) Silicon wafer defect detection method and system based on computer vision
CN114821497A (en) Method, device and equipment for determining position of target object and storage medium
CN111062341B (en) Video image area classification method, device, equipment and storage medium
CN110557622B (en) Depth information acquisition method and device based on structured light, equipment and medium
Barua et al. An Efficient Method of Lane Detection and Tracking for Highway Safety
CN113450335B (en) Road edge detection method, road edge detection device and road surface construction vehicle
JP6492603B2 (en) Image processing apparatus, system, image processing method, and program
WO2019052320A1 (en) Monitoring method, apparatus and system, electronic device, and computer readable storage medium
CN114724119A (en) Lane line extraction method, lane line detection apparatus, and storage medium
CN114166132A (en) Vehicle height snapshot measuring method and device
CN113052886A (en) Method for acquiring depth information of double TOF cameras by adopting binocular principle
CN111435080B (en) Water level measuring method, device and system
CN113095324A (en) Classification and distance measurement method and system for cone barrel
TWI493475B (en) Method for determining convex polygon object in image
CN115018788B (en) Overhead line abnormality detection method and system based on intelligent robot
Kostrin et al. Application of an Automated Optoelectronic System for Determining Position of an Object
CN114926528A (en) Leveling method, device and system for image sensor and storage medium
CN114742898A (en) Combined calibration method and system for laser radar and camera
CN117611652A (en) Grounding device mounting bolt size measurement method and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant