CN113256740A - Calibration method of radar and camera, electronic device and storage medium - Google Patents

Calibration method of radar and camera, electronic device and storage medium

Info

Publication number
CN113256740A
Authority
CN
China
Prior art keywords
point
cloud data
point cloud
calibration
radar
Prior art date
Legal status
Pending
Application number
CN202110722650.7A
Other languages
Chinese (zh)
Inventor
蔡军
孔健
苗占东
马可
黄毅
陈士荣
施子凡
Current Assignee
Ecarx Hubei Tech Co Ltd
Original Assignee
Hubei Ecarx Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hubei Ecarx Technology Co Ltd filed Critical Hubei Ecarx Technology Co Ltd
Priority to CN202110722650.7A
Publication of CN113256740A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • G06T2207/30208Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the invention provides a calibration method for a radar and a camera, an electronic device and a storage medium, relates to the technical field of automatic driving, and can improve the calibration efficiency of the radar and the camera. The technical scheme of the embodiment of the invention comprises the following steps: setting a calibration plate in a common visual area of a vehicle-mounted radar and camera; while the vehicle drives toward the calibration plate, the vehicle-mounted radar collects multiple frames of point cloud data and the vehicle-mounted camera collects multiple frames of image data; determining target image data matched with each frame of point cloud data according to the timestamp of each frame of point cloud data in the multiple frames of point cloud data and the timestamp of each frame of image data in the multiple frames of image data; then, for each frame of point cloud data, determining a first calibration point in the point cloud data and a second calibration point in the target image data matched with the point cloud data, to obtain a calibration point group corresponding to the point cloud data; and determining a coordinate conversion relation between the coordinate system of the radar and the coordinate system of the camera according to the calibration point groups corresponding to the frames of point cloud data.

Description

Calibration method of radar and camera, electronic device and storage medium
Technical Field
The present invention relates to the field of automatic driving technologies, and in particular, to a calibration method for a radar and a camera, an electronic device, and a storage medium.
Background
During automatic driving, an intelligent driving vehicle needs to sense rich information about its surrounding environment and take timely safety measures when potential safety hazards exist in that environment. To collect environmental information around the vehicle, two kinds of sensors, a laser radar and a camera, are generally installed on an intelligent driving vehicle. The camera can acquire rich texture and color information from the environment but has difficulty accurately acquiring the distance of objects in the environment; the laser radar can accurately acquire the distance of objects in the environment but has difficulty acquiring texture and color information. Therefore, in order to combine the respective advantages of the camera and the laser radar, the two need to be calibrated to obtain the coordinate conversion relation between the coordinate system of the laser radar and the coordinate system of the camera, so that the data acquired by the camera and the laser radar can be fused.
When calibrating a laser radar and a camera, a calibration plate is generally placed outside the vehicle; while the vehicle is stationary, data of the calibration plate are collected by the laser radar and the camera mounted on the vehicle, and calibration is then performed based on the collected data. To make the calibration result more accurate, the placement of the calibration plate needs to be adjusted many times so as to acquire calibration plate data captured by the camera and the laser radar under a variety of different poses, and calibration is performed according to the acquired data. However, because the position of the calibration plate must be adjusted manually, the calibration efficiency is low.
Disclosure of Invention
The embodiment of the invention aims to provide a calibration method of a radar and a camera, electronic equipment and a storage medium, so as to improve the calibration efficiency of the radar and the camera. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides a method for calibrating a radar and a camera, the method including:
setting a calibration plate in a common visual area of a vehicle-mounted radar and a camera;
in the process that the vehicle drives to the calibration plate, a vehicle-mounted radar collects multi-frame point cloud data, and a vehicle-mounted camera collects multi-frame image data;
determining target image data matched with each frame of point cloud data according to the time stamp of each frame of point cloud data in the multi-frame point cloud data and the time stamp of each frame of image data in the multi-frame image data;
for each frame of point cloud data, determining a first calibration point in the point cloud data, and determining a second calibration point in target image data matched with the point cloud data to obtain a calibration point group corresponding to the point cloud data; the first calibration point is a light spot corresponding to the specified position of the calibration plate in the point cloud data, and the second calibration point is a pixel point corresponding to the specified position of the calibration plate in the target image data;
and determining a coordinate conversion relation between the coordinate system of the radar and the coordinate system of the camera according to the calibration point group corresponding to each frame of point cloud data.
Optionally, the determining, according to the timestamp of each frame of point cloud data in the multiple frames of point cloud data and the timestamp of each frame of image data in the multiple frames of image data, target image data matched with each frame of point cloud data includes:
performing the following operations for each frame of point cloud data in the plurality of frames of point cloud data:
searching image data with the minimum difference value with the time stamp of the point cloud data from the multi-frame image data;
and if the difference value between the time stamp of the point cloud data and the time stamp of the searched image data is smaller than a preset difference value, using the searched image data as target image data matched with the point cloud data.
Optionally, before the determining the first calibration point in the point cloud data, the method further includes:
acquiring motion information corresponding to the point cloud data, wherein the motion information comprises: the displacement and the speed of the vehicle during the process in which the radar collects the point cloud data;
according to the motion information, distortion compensation is carried out on the point cloud data to obtain the coordinate of each light spot in the point cloud data under a specified coordinate system, wherein the specified coordinate system is a radar coordinate system where the first collected light spot in the point cloud data is located;
and for each light spot, replacing the original coordinates of the light spot by the coordinates of the light spot in the specified coordinate system.
Optionally, the performing distortion compensation on the point cloud data according to the motion information to obtain coordinates of each light point in the point cloud data in a specified coordinate system includes:
obtaining the relative scanning time of each light spot in the process of scanning the point cloud data by the radar based on the obtained radar beam to which each light spot belongs;
interpolating between the acquired motion information and the adjacent motion information to obtain the displacement and the speed at each moment in a time period between the moment corresponding to the acquired motion information and the moment corresponding to the adjacent motion information;
for each light spot included in the point cloud data, determining a compensation transformation matrix of the coordinate system where the light spot is located relative to the specified coordinate system according to the relative scanning time of the light spot in the process of scanning the point cloud data by the radar and the displacement and the speed corresponding to the relative scanning time;
and multiplying the coordinates of the light spot by the determined compensation transformation matrix to obtain the coordinates of the light spot in the specified coordinate system.
Optionally, the calibration plate is a checkerboard calibration plate, and the designated position is a vertex of the checkerboard calibration plate;
the determining a second calibration point in the target image data matched with the point cloud data includes:
extracting coordinates of designated corner point pairs in a checkerboard pattern included in the target image data, wherein each corner point pair is positioned on the same straight line as one vertex of the checkerboard calibration plate;
and determining, according to the coordinates of each designated corner point pair, a pixel point of a vertex located on a straight line with the designated corner point pair in the target image data as the second calibration point.
Optionally, the determining, according to the coordinates of each designated corner pair, a pixel point of a vertex of the target image data located on a straight line with the designated corner pair as the second calibration point includes:
for each designated corner point pair, determining, according to the coordinates of the pair, the distance between the two corner points included in the pair and the included angle between the connecting line of the two corner points and the lower bottom edge of the checkerboard pattern;
and determining a pixel point of a vertex located on a straight line with the designated corner point pair as the second calibration point according to the distance between the two corner points included in the pair and the included angle corresponding to the pair.
Optionally, the calibration plate is rectangular, and the designated position of the calibration plate is the vertex of the calibration plate;
the determining a first index point in the point cloud data includes:
identifying a target point cluster consisting of light spots related to the calibration plate in the point cloud data;
performing linear fitting on the light spots at the boundary in the target point cluster to obtain a plurality of fitting straight lines;
and taking the intersection point between the fitting straight lines as the first calibration point.
Optionally, the identifying a target point cluster composed of light spots related to the calibration plate in the point cloud data includes:
identifying a point cluster consisting of light spots belonging to the same plane in the point cloud data;
and according to the size of the plane corresponding to each point cluster, taking the point cluster whose plane size meets the size condition of the calibration plate as the target point cluster.
Optionally, before the identifying a target point cluster composed of light points related to the calibration plate in the point cloud data, the method further includes:
filtering the light spots in the point cloud data according to the position of each light spot in the point cloud data to obtain the light spots whose positions meet the filtering condition;
wherein the filtering condition includes any one or more of the following conditions: the horizontal distance between the position corresponding to the light spot and the radar is not more than a preset horizontal distance, the vertical distance between the position corresponding to the light spot and the radar is not more than a preset vertical distance, the height distance between the position corresponding to the light spot and the radar is not more than a preset height distance, and the angle between the position where the radar scans the light spot and the initial scanning position of the radar belongs to a preset angle range.
Optionally, after the coordinate conversion relation between the coordinate system of the radar and the coordinate system of the camera is determined according to the calibration point group corresponding to each frame of point cloud data, the method further includes:
for each frame of point cloud data collected by the radar, determining an error between a first calibration point in the point cloud data and a second calibration point in target image data corresponding to the point cloud data according to the coordinate conversion relation;
judging whether the error corresponding to each frame of point cloud data meets an error limiting condition;
and if the errors corresponding to the frames of point cloud data do not meet the error limiting condition, returning to the step of setting the calibration plate in the common visual area of the vehicle-mounted radar and the vehicle-mounted camera until the error corresponding to each frame of point cloud data meets the error limiting condition.
Optionally, after the coordinate conversion relation between the coordinate system of the radar and the coordinate system of the camera is determined according to the calibration point group corresponding to each frame of point cloud data, the method further includes:
selecting a preset number of frames of point cloud data from the point cloud data collected by the radar;
and for each selected frame of point cloud data, determining light projection points of each light spot in the point cloud data on the target image data corresponding to the point cloud data according to the coordinate conversion relation, and displaying each light projection point on the target image data corresponding to the point cloud data in a visualization module.
Optionally, the determining, for each frame of point cloud data, a first calibration point in the point cloud data and a second calibration point in the target image data matched with the point cloud data includes:
the main thread puts each pair of matched point cloud data and target image data into a work queue;
based on the respective working states of a plurality of parallel processing working threads, each working thread acquires matched point cloud data and target image data from the working queue according to the sequential arrangement of the matched point cloud data and the target image data in the working queue, extracts a first calibration point from the acquired point cloud data, and extracts a second calibration point from the acquired target image data.
In a second aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with one another through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing the steps of any radar and camera calibration method when executing the program stored in the memory.
In a third aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of any one of the above radar and camera calibration methods are implemented.
In a fourth aspect, an embodiment of the present invention further provides a computer program product containing instructions, which when run on a computer, causes the computer to execute any one of the above radar and camera calibration methods.
With the calibration method for a radar and a camera, the electronic device and the storage medium provided by the embodiments of the invention, multiple frames of point cloud data are collected by the vehicle-mounted radar and multiple frames of image data are collected by the vehicle-mounted camera while the vehicle drives toward the calibration plate, and calibration is performed based on the obtained data. The positions of the camera and the radar change continuously while the vehicle is moving, whereas the position of the calibration plate does not change; that is, the relative positions of the radar and the camera with respect to the calibration plate change continuously. This is equivalent to obtaining data acquired by the camera and the radar under a variety of different poses, and since the positions of the camera and the radar change with the vehicle's driving position, no manual adjustment is needed, thereby improving calibration efficiency.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention; those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of a calibration method for a radar and a camera according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for distortion compensation of point cloud data according to an embodiment of the present invention;
FIG. 3 is a flowchart of a method for determining a second calibration point in target image data according to an embodiment of the present invention;
FIG. 4 is an exemplary diagram of a calibration board in target image data according to an embodiment of the present invention;
FIG. 5 is an exemplary diagram of a calibration board in another target image data provided by an embodiment of the invention;
fig. 6 is a flowchart of a method for determining a first calibration point in point cloud data according to an embodiment of the present invention;
FIG. 7 is an exemplary diagram of a light point associated with a calibration plate in point cloud data according to an embodiment of the present invention;
FIG. 8 is an exemplary diagram of multithreading according to an embodiment of the present invention;
FIG. 9 is an exemplary diagram of a radar and camera mounting location provided by an embodiment of the present invention;
FIG. 10 is a flowchart of another method for calibrating a radar and a camera according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments given herein without creative effort fall within the protection scope of the present invention.
In order to improve the calibration efficiency of the radar and the camera, the embodiment of the invention provides a calibration method of the radar and the camera, which can be applied to a vehicle-mounted industrial personal computer, wherein the vehicle where the vehicle-mounted industrial personal computer is located is provided with the radar and the camera. The radar and the camera may be installed at a vehicle roof or a vehicle head, etc., so that the radar and the camera can scan or photograph a calibration panel located outside the vehicle. The radar in the embodiments of the present invention is a multiline radar, such as a laser radar and/or a millimeter wave radar.
As shown in fig. 1, the calibration method for a radar and a camera provided in the embodiment of the present invention includes the following steps:
s101, a calibration board is arranged in a common visual area of a vehicle-mounted radar and a vehicle-mounted camera.
In the embodiment of the invention, the common view area of the vehicle-mounted radar and camera represents: the overlapping area of the scene scanned by the radar and the scene photographed by the camera. The common viewing area of the radar and the camera may be disposed in front of the vehicle, with the calibration plate disposed in front of the vehicle. For example, the calibration plate may be mounted on an outboard calibration bracket, located 20 meters (m) directly in front of the vehicle.
Optionally, the height of the calibration board may be adjusted so that the calibration board is located in the common view area of the radar and the camera.
S102, in the process that the vehicle drives to the calibration plate, a vehicle-mounted radar collects multi-frame point cloud data, and a vehicle-mounted camera collects multi-frame image data.
Optionally, in order to improve the calibration accuracy, the vehicle can be set to drive toward the calibration plate at a constant speed. For example, the constant driving speed may be any value within 0-10 km/h (kilometers per hour).
S103, determining target image data matched with each frame of point cloud data according to the time stamp of each frame of point cloud data in the multi-frame point cloud data and the time stamp of each frame of image data in the multi-frame image data.
In the embodiment of the invention, the frequency at which the camera captures image data is greater than the frequency at which the radar scans point cloud data. For example, the camera captures image data at 20 Hertz (Hz), meaning that the camera captures 20 frames of image data per second, while the radar scans point cloud data at 10 Hz, meaning that the radar scans 10 frames of point cloud data per second.
Over the same period, the camera captures more frames of image data than the radar scans frames of point cloud data. For each frame of point cloud data, image data whose timestamp differs from the timestamp of the point cloud data by less than a preset difference can be used as the target image data matched with the point cloud data. For example, the preset difference is 25 milliseconds (ms).
And S104, aiming at each frame of point cloud data, determining a first calibration point in the point cloud data, and determining a second calibration point in the target image data matched with the point cloud data to obtain a calibration point group corresponding to the point cloud data.
The first calibration point is a light spot corresponding to the designated position of the calibration plate in the point cloud data, and the second calibration point is a pixel point corresponding to the designated position of the calibration plate in the target image data.
Optionally, there may be a plurality of first calibration points in the point cloud data and a plurality of second calibration points in the target image data, where the calibration point group includes a plurality of pairs of calibration points, and each pair of calibration points includes a first calibration point and a second calibration point. The physical positions corresponding to each pair of calibration points are the same, and each pair of calibration points correspond to each other and are used for calculating the coordinate conversion relation between the coordinate system of the radar and the coordinate system of the camera.
And S105, determining a coordinate conversion relation between the coordinate system of the radar and the coordinate system of the camera according to the calibration point group corresponding to each frame of point cloud data.
Alternatively, the coordinate transformation relation may be a rotation matrix and a translation vector from the coordinate system of the radar to the coordinate system of the camera, or the coordinate transformation relation may be a rotation matrix and a translation vector from the coordinate system of the camera to the coordinate system of the radar.
In one embodiment, the cv::solvePnPRansac() function in the opencv library may be called to perform the calibration calculation. The opencv library is a cross-platform computer vision and machine learning software library, and the cv::solvePnPRansac() function is used for the calibration calculation, namely determining the coordinate conversion relation between the coordinate systems.
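A minimal sketch of this calibration step, assuming the matched first calibration points (radar coordinates) and second calibration points (pixel coordinates) have been gathered in corresponding order; the function and variable names are illustrative, not from the patent.

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Sketch: estimate the radar->camera transform from matched calibration points.
// radarPoints: first calibration points in the radar coordinate system (3D)
// pixelPoints: second calibration points in the image (2D), same order
// K, distCoeffs: camera intrinsic matrix and distortion coefficients
bool estimateRadarToCamera(const std::vector<cv::Point3f>& radarPoints,
                           const std::vector<cv::Point2f>& pixelPoints,
                           const cv::Mat& K, const cv::Mat& distCoeffs,
                           cv::Mat& R, cv::Mat& t) {
    cv::Mat rvec, tvec;
    // RANSAC-based PnP rejects outlier point pairs automatically;
    // at least four point pairs are required.
    if (!cv::solvePnPRansac(radarPoints, pixelPoints, K, distCoeffs, rvec, tvec))
        return false;
    cv::Rodrigues(rvec, R);  // rotation vector -> 3x3 rotation matrix
    t = tvec;                // translation vector, radar frame -> camera frame
    return true;
}
```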
With the calibration method for a radar and a camera provided by the embodiment of the invention, multiple frames of point cloud data are collected by the vehicle-mounted radar and multiple frames of image data are collected by the vehicle-mounted camera while the vehicle drives toward the calibration plate, and calibration is performed based on the obtained data. The positions of the camera and the radar change continuously while the vehicle is moving, whereas the position of the calibration plate does not change; that is, the relative positions of the radar and the camera with respect to the calibration plate change continuously. This is equivalent to obtaining data acquired by the camera and the radar under a variety of different poses, and since the positions of the camera and the radar change with the vehicle's driving position, no manual adjustment is needed, thereby improving calibration efficiency.
In this embodiment of the present invention, in step S103, the manner of determining the target image data matched with each frame of point cloud data may be implemented as follows: executing the following steps for each frame of point cloud data in the plurality of frames of point cloud data: and searching image data with the minimum difference value with the time stamp of the point cloud data from the multi-frame image data. And if the difference value between the time stamp of the point cloud data and the time stamp of the searched image data is smaller than a preset difference value, using the searched image data as target image data matched with the point cloud data.
For example, the preset difference is 25ms, the time stamp of the point cloud data 1 is 8:00:17.96, the time stamp of the image data 1 is 8:01:13.25, the time stamp of the image data 2 is 8:00:59.15, and the time stamp of the image data 3 is 8:00: 17.96. The image data with the smallest difference from the time stamp of the point cloud data 1 is the image data 3, and the time stamp difference between the image data 3 and the point cloud data 1 is 0ms <25ms, so that the target image data matched with the point cloud data 1 is the image data 3.
Optionally, if the difference between the timestamp of the point cloud data and the timestamp of the searched image data is not less than the preset difference, it is determined that the point cloud data does not have matched target image data.
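A minimal sketch of this matching rule, assuming every frame carries a millisecond timestamp; the types and names are illustrative.

```cpp
#include <cstdint>
#include <cstdlib>
#include <optional>
#include <vector>

struct Frame { int64_t stampMs; /* payload omitted */ };

// Find the image whose timestamp is closest to the point cloud's timestamp;
// accept the match only when the difference is below the preset threshold
// (e.g. 25 ms). Returns the index of the matched image, or empty.
std::optional<size_t> matchImage(const Frame& cloud,
                                 const std::vector<Frame>& images,
                                 int64_t maxDiffMs = 25) {
    std::optional<size_t> best;
    int64_t bestDiff = maxDiffMs;
    for (size_t i = 0; i < images.size(); ++i) {
        int64_t diff = std::llabs(images[i].stampMs - cloud.stampMs);
        if (diff < bestDiff) { bestDiff = diff; best = i; }
    }
    return best;  // empty when no image is within the threshold
}
```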
Under the condition that the camera has the function of setting the timestamp for the image data, before the point cloud data and the image data are acquired in the S102, the time source of the radar and the time source of the camera can be set to be the same time source through the time synchronization module of the vehicle-mounted industrial personal computer, namely, the time synchronization of the radar and the camera is realized, and the image data and the point cloud data can be conveniently and accurately matched.
Or, in the case that the camera does not have the function of setting the time stamp for the image data, the time synchronization module of the vehicle-mounted industrial personal computer may set the time stamp for the image data shot by the camera. Before the point cloud data and the image data are acquired in the S102, the time source of the radar and the time source of the time synchronization module can be set to be the same time source through the time synchronization module of the vehicle-mounted industrial personal computer, namely, the time synchronization of the radar and the time synchronization module is realized, and the image data and the point cloud data can be matched conveniently and accurately.
In the embodiment of the invention, the radar rotates 360 degrees during scanning to obtain one frame of point cloud data, and the vehicle keeps driving while the radar rotates. Therefore, the light spots in one frame of point cloud data are obtained at different times, and the radar scans each light spot from a different position; that is, the light spots are located in different coordinate systems.
Therefore, before the first calibration point in the point cloud data is determined in S104, distortion compensation may be performed on the point cloud data to unify the light points in each frame of point cloud data into the same coordinate system. Referring to fig. 2, the process of performing distortion compensation on each frame of point cloud data acquired by the radar in S102 by the vehicle-mounted industrial personal computer includes the following steps:
s201, obtaining motion information corresponding to the point cloud data. The motion information includes: the displacement and the speed of the vehicle during the process in which the radar collects the point cloud data.
In one embodiment, the speed may be acquired by an Inertial Measurement Unit (IMU) in the vehicle, and the travel speed of the vehicle is multiplied by the time consumed by the radar to scan a frame of point cloud data, so as to obtain the travel displacement of the vehicle during the process of acquiring the point cloud data by the radar.
Each piece of motion information is tagged with the timestamp at which the IMU collected the vehicle's driving speed. The motion information whose timestamp differs least from the timestamp of the point cloud data may be used as the motion information corresponding to the point cloud data. Optionally, the motion information may also include the angular velocity collected by the IMU.
Before S201, a time source of the radar and a time source of the IMU can be set to be the same time source through a time synchronization module of the vehicle-mounted industrial personal computer, namely, the time synchronization of the radar and the IMU is realized, and the point cloud data and the motion information can be matched conveniently and accurately.
S202, according to the motion information, distortion compensation is carried out on the point cloud data, and coordinates of each light spot in the point cloud data under a specified coordinate system are obtained. Wherein, the appointed coordinate system is a radar coordinate system where the first collected light spot in the point cloud data is located.
S203, for each light spot, replacing the original coordinates of the light spot with the coordinates of the light spot in the designated coordinate system.
The embodiment of the invention can perform distortion compensation on the point cloud data, reduce the influence of the motion of the vehicle on the accuracy of the position of the light spot, and further improve the accuracy of the position of the first calibration point in the determined point cloud data, namely improve the accuracy of calibration.
For the above S202, according to the motion information, distortion compensation is performed on the point cloud data to obtain coordinates of each light spot in the point cloud data in the specified coordinate system, which may be implemented as the following four steps:
step one, obtaining the relative scanning time of each light spot in the process of scanning the point cloud data by the radar based on the obtained radar beam to which each light spot belongs.
In one embodiment, for each light spot, the radar beam to which the light spot belongs may be determined according to the coordinates of the light spot (the radar transmits multiple beams, and each light spot is collected by one of the beams). Then, based on the coordinates of the light spot and the radar beam to which it belongs, the rotation angle of the radar's position when scanning the light spot relative to its position when scanning the first light spot in the point cloud data is calculated. Finally, based on the total time the radar takes to scan one frame of point cloud data and the calculated rotation angle, the relative scanning time of the light spot in the process of scanning the point cloud data is obtained.
In an embodiment of the present invention, the relative scan time of one spot can be understood as: the time difference between the time the radar scans the spot relative to the time the radar scans the first spot in the point cloud data.
And secondly, interpolating between the acquired motion information and the adjacent motion information to obtain the displacement and the speed at each moment in the time period between the moment corresponding to the acquired motion information and the moment corresponding to the adjacent motion information.
In the embodiment of the present invention, the motion information is discrete, that is, one piece of motion information is obtained at intervals. And the motion information of the vehicle at each moment is not necessarily identical. In order to improve the accuracy of distortion compensation, it is necessary to obtain the displacement and speed of the vehicle when the radar scans each light point in the point cloud data.
In one embodiment, the motion information adjacent to the motion information acquired in S201 may be determined based on the timestamp of each piece of motion information. Then, interpolation is performed between the motion information acquired in S201 and the determined adjacent motion information to obtain the displacement and velocity at each moment in the time period between the moment corresponding to the motion information acquired in S201 and the moments corresponding to the adjacent motion information. This is equivalent to acquiring the continuous displacement and driving speed of the vehicle during that time period. The moment corresponding to a piece of motion information is determined based on its timestamp.
For example, the timestamp of the motion information B acquired in S201 is 005, and the motion information adjacent to the motion information is: motion information a with a time stamp of 004 and motion information C with a time stamp of 006. Interpolation is performed between the motion information a and the motion information B, and interpolation is performed between the motion information B and the motion information C, thereby obtaining continuous displacement and velocity in the period from the time stamp 004 to the time stamp 006.
Alternatively, the number of adjacent pieces of motion information to select may be determined based on the acquisition frequency of the motion information. When performing step two, a preset number of pieces of motion information adjacent to the motion information acquired in S201 may be selected: the higher the acquisition frequency of the motion information, the larger the preset number; the lower the acquisition frequency, the smaller the preset number. This ensures that the time period over which the radar scans the point cloud data falls within the time period between the moment corresponding to the motion information acquired in S201 and the moments corresponding to the adjacent motion information.
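A minimal sketch of the interpolation, assuming linear interpolation between two timestamped motion samples; the structure and field names are illustrative.

```cpp
struct MotionSample {
    double stamp;   // seconds
    double vx, vy;  // velocity (m/s)
    double dx, dy;  // accumulated displacement (m)
};

// Linearly interpolate vehicle motion at time t between samples a and b,
// where a.stamp <= t <= b.stamp.
MotionSample interpolate(const MotionSample& a, const MotionSample& b, double t) {
    const double r = (t - a.stamp) / (b.stamp - a.stamp);
    return { t,
             a.vx + r * (b.vx - a.vx), a.vy + r * (b.vy - a.vy),
             a.dx + r * (b.dx - a.dx), a.dy + r * (b.dy - a.dy) };
}
```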
And thirdly, determining a compensation transformation matrix of the coordinate system where the light spot is located relative to the specified coordinate system according to the relative scanning time of the light spot in the process of scanning the point cloud data by the radar and the displacement and the speed corresponding to the relative scanning time for each light spot included in the point cloud data.
For example, assume that step two obtains the displacement and velocity at each moment in the [100,200] time period, i.e., the continuous displacement and velocity, and that the timestamp of the first light spot of the point cloud data is 120. If the relative scanning time of a light spot is 20, the displacement and velocity used for that light spot are those corresponding to moment 120+20=140 in the [100,200] time period.
And step four, multiplying the coordinates of the light spot by the determined compensation transformation matrix to obtain the coordinates of the light spot in the specified coordinate system.
According to the embodiment of the invention, the point cloud data can be subjected to distortion compensation based on the displacement and the speed of the vehicle running when each light spot is scanned by the radar, so that the accuracy of the distortion compensation of the point cloud data is improved.
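A minimal sketch of steps three and four, assuming planar motion (translation plus yaw) recovered from the interpolated motion information; Eigen, which PCL already depends on, is used to build and apply the compensation transformation matrix. All names are illustrative.

```cpp
#include <Eigen/Dense>
#include <cmath>

// Build the compensation transformation of the coordinate system where the
// light spot is located relative to the specified coordinate system (the
// radar pose at the first collected light spot), then map the spot into it.
Eigen::Vector3d compensate(const Eigen::Vector3d& point,
                           double dx, double dy,  // displacement since first spot (m)
                           double yaw) {          // heading change since first spot (rad)
    Eigen::Matrix4d T = Eigen::Matrix4d::Identity();
    T(0, 0) =  std::cos(yaw); T(0, 1) = -std::sin(yaw);
    T(1, 0) =  std::sin(yaw); T(1, 1) =  std::cos(yaw);
    T(0, 3) = dx;
    T(1, 3) = dy;
    // "Multiplying the coordinates of the light spot by the compensation matrix."
    Eigen::Vector4d p(point.x(), point.y(), point.z(), 1.0);
    return (T * p).head<3>();
}
```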
In the embodiment of the present invention, the calibration board may be a checkerboard calibration board, and the designated position is a vertex of the checkerboard calibration board. When the thickness is not considered, the checkerboard calibration plate is rectangular, and the vertexes of the calibration plate are four vertexes of the rectangle.
Based on this, referring to fig. 3, the manner of determining the second calibration point in the target image data in S104 includes the following two steps:
s301, the coordinates of the designated corner point pair in the checkerboard pattern included in the target image data are extracted. Wherein each corner point pair is positioned on a straight line with one vertex of the chessboard pattern calibration plate.
In the embodiment of the present invention, before S301, the target image data may be further preprocessed. Wherein the preprocessing process comprises format conversion and distortion removal.
The format conversion process comprises: obtaining the target image data in sensor_msgs::Image format from the camera driving module of the vehicle-mounted industrial personal computer, and opening up in memory a storage space in cv::Mat format not smaller than the obtained data size. The target image data is then converted into cv::Mat format through a Graphics Processing Unit (GPU) of the vehicle-mounted industrial personal computer, and the resulting cv::Mat is stored in the storage space opened up in the memory. The camera driving module is used for driving the camera to collect image data.
Converting the target image data into cv::Mat format makes it convenient to process the target image data with the opencv library subsequently.
The distortion removal process comprises: acquiring the target image data in cv::Mat format from the storage space, performing distortion removal on the target image data based on the intrinsic parameter matrix of the camera, and converting the distortion-removed target image data into a gray-scale image.
In S301, the coordinates of each corner point in the checkerboard pattern included in the target image data may be extracted by calling an Application Programming Interface (API) specified by the opencv library, and then the coordinates of the specified corner point pair may be obtained from the extracted coordinates of the corner point.
With reference to fig. 4, the corner points of the checkerboard pattern are the vertices of the black/white squares other than the vertices located at the edges of the checkerboard pattern. The coordinates of each corner point obtained through the API carry indexes, so the coordinates of the corner points with specified indexes, i.e., the coordinates of the designated corner point pairs, can be obtained.
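A minimal sketch of the preprocessing and corner extraction; cv::findChessboardCorners() and cv::cornerSubPix() are one way to realize the opencv API call described above, and the 9x6 inner-corner pattern size is an illustrative assumption.

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Undistort, convert to grayscale, and extract checkerboard corner coordinates.
bool extractCorners(const cv::Mat& image, const cv::Mat& K, const cv::Mat& distCoeffs,
                    std::vector<cv::Point2f>& corners) {
    cv::Mat undistorted, gray;
    cv::undistort(image, undistorted, K, distCoeffs);     // distortion removal
    cv::cvtColor(undistorted, gray, cv::COLOR_BGR2GRAY);  // gray-scale image
    const cv::Size patternSize(9, 6);                     // inner corners per row/column (assumed)
    if (!cv::findChessboardCorners(gray, patternSize, corners))
        return false;
    // Refine the corner locations to sub-pixel accuracy.
    cv::cornerSubPix(gray, corners, cv::Size(11, 11), cv::Size(-1, -1),
                     cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.01));
    return true;
}
```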
S302, according to the coordinates of each designated corner pair, determining a pixel point of a vertex on a straight line with the designated corner pair in the target image data as a second calibration point.
In one embodiment, for each designated corner point pair, the distance between the two corner points included in the pair and the included angle between their connecting line and the lower bottom edge of the checkerboard pattern may be determined according to the coordinates of the pair. A pixel point of the vertex located on a straight line with the designated corner point pair is then determined as the second calibration point according to that distance and included angle.
As shown in fig. 4, a designated corner point pair is P1 and P2, and the vertex located on a straight line with the pair is P. The vertical distance between P and P1 is H, the lateral distance between P and P1 is W, and the straight-line distance between P and P1 is S; the vertical distance between P1 and P2 is h, the lateral distance between P1 and P2 is w, and the straight-line distance between P1 and P2 is s. The included angle between the connecting line of P1 and P2 and the lower bottom edge of the checkerboard pattern, i.e., the angle between s and w, is

θ = arctan(h / w)

From the physical positions of P1, P2 and P on the calibration plate, a linear proportionality k = S / s can be obtained, where

S = √(H² + W²) and s = √(h² + w²)

According to the coordinates (u₁, v₁) of P1 on the target image data and the coordinates (u₂, v₂) of P2 on the target image data, the distance between P1 and P2 on the target image data is calculated as

s′ = √((u₁ − u₂)² + (v₁ − v₂)²)

and the included angle θ is calculated from the coordinates of P1 and P2 on the target image data. Then, the coordinates (u_P, v_P) of P on the target image data are calculated by formula (1):

u_P = u₁ + k · s′ · cos θ
v_P = v₁ + k · s′ · sin θ        (1)

where u_P is the abscissa of P on the target image data, v_P is the ordinate of P on the target image data, u₁ is the abscissa of P1 on the target image data, v₁ is the ordinate of P1 on the target image data, k is the linear proportionality, and s′ is the distance between P1 and P2 on the target image data; the signs of cos θ and sin θ are taken along the direction from P2 to P1, so that the offset extends beyond P1 toward the board vertex.
Similarly, the other three vertices of the calibration plate in the target image data, other than the point P, may be determined in the above manner. For example, as shown in FIG. 5, pixel 4 is determined using pixel 2 and pixel 3; pixel 5 is determined using pixel 1 and pixel 6; pixel 9 is determined using pixel 7 and pixel 8; and pixel 10 is determined using pixel 11 and pixel 12.
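The extrapolation of formula (1) can also be written compactly in vector form; below is a minimal sketch assuming OpenCV point types, with illustrative names. Offsetting P1 by k times the P2→P1 direction is equivalent to the k·s′·cos θ and k·s′·sin θ offsets above.

```cpp
#include <opencv2/core.hpp>

// Extrapolate the calibration-plate vertex P collinear with corners p1 and p2,
// where k = S / s is the known physical ratio of |P - P1| to |P1 - P2|.
cv::Point2f extrapolateVertex(const cv::Point2f& p1, const cv::Point2f& p2, float k) {
    // Offset p1 along the P2 -> P1 direction by k times the corner spacing.
    return p1 + k * (p1 - p2);
}
```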
Based on the checkerboard calibration plate, referring to fig. 6, the manner of extracting the first calibration points corresponding to the four vertices of the checkerboard calibration plate from the point cloud data in S104 may include the following steps:
s601, identifying a target point cluster formed by light spots related to the calibration plate in the point cloud data. Wherein, the light spot related to the calibration plate is the light spot obtained by scanning the calibration plate by the radar.
The format of the point cloud data acquired by the vehicle-mounted industrial personal computer from its radar driving module is velodyne_msgs. In order to facilitate the subsequent extraction of the first calibration point using the Point Cloud Library (PCL), the format of the point cloud data may be converted into pcl::PointCloud<pcl::PointXYZ> before S601, where pcl::PointCloud<pcl::PointXYZ> is a data format of the PCL library.
Then, the light spots in the point cloud data are filtered using the PCL library: the light spots are filtered according to the position of each light spot in the point cloud data to obtain the light spots whose positions meet the filtering condition. The filtering condition includes any one or more of the following conditions: the horizontal distance between the position corresponding to the light spot and the radar is not more than a preset horizontal distance, the vertical distance between the position corresponding to the light spot and the radar is not more than a preset vertical distance, the height distance between the position corresponding to the light spot and the radar is not more than a preset height distance, and the angle between the position where the radar scans the light spot and the initial scanning position of the radar belongs to a preset angle range.
Because the coordinates of the light spot are three-dimensional coordinates, the horizontal, vertical and height distances between the physical position corresponding to the light spot and the radar can be reflected, and therefore the light spot which is too close to or too far away from the radar can be filtered according to the coordinates of the light spot.
The angle condition constrains the rotation angle of the radar when it scans the light spot. For example, if the preset angle range is [0°,180°], the light spots scanned when the rotation angle of the radar is 180° or more are filtered out, and the light spots scanned when the rotation angle is within 180° are retained.
In the embodiment of the present invention, the light spots in the point cloud data are filtered before S601, which can reduce the number of light spots on which the first calibration point is determined, reduce the amount of calculation for determining the first calibration point, and improve the efficiency of extracting the first calibration point.
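A minimal sketch of the distance-based filtering using PCL's pass-through filter; the axis mapping and numeric limits are illustrative assumptions, and the angle condition would be checked separately from each light spot's azimuth.

```cpp
#include <pcl/filters/passthrough.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <string>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

// Apply one pass-through filter on a named coordinate field.
static Cloud::Ptr passField(const Cloud::Ptr& in, const std::string& field,
                            float lo, float hi) {
    Cloud::Ptr out(new Cloud);
    pcl::PassThrough<pcl::PointXYZ> pass;
    pass.setInputCloud(in);
    pass.setFilterFieldName(field);
    pass.setFilterLimits(lo, hi);
    pass.filter(*out);
    return out;
}

// Keep only light spots whose distances from the radar fall inside preset
// limits (example values, one condition per axis).
Cloud::Ptr filterByPosition(const Cloud::Ptr& in) {
    Cloud::Ptr c = passField(in, "x", 0.0f, 30.0f);  // preset horizontal distance
    c = passField(c, "y", -10.0f, 10.0f);            // preset vertical distance
    c = passField(c, "z", -2.0f, 5.0f);              // preset height distance
    return c;
}
```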
In one embodiment, S601 may be implemented as: identifying point clusters consisting of light spots belonging to the same plane in the point cloud data, and then, according to the size of the plane corresponding to each point cluster, taking the point cluster whose plane size meets the size condition of the calibration plate as the target point cluster.
Optionally, a region growing algorithm in PCL may be used to determine the target point cluster. The principle is as follows: the angles between the normals of the light spots are compared, and adjacent light spots whose normal angles meet a preset smoothness constraint are grouped into one cluster.
The dimensional condition of the calibration plate may be preset, for example, the dimensional condition of the calibration plate includes a length and a width. The size of the plane to which the cluster of points corresponds can be determined from the coordinates of the spots located at the edges of the cluster of points.
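A minimal sketch of this clustering step using PCL's region-growing segmentation (the algorithm family named above); the normal-estimation parameters and thresholds are illustrative assumptions, and the size check is left as a comment.

```cpp
#include <pcl/features/normal_3d.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/region_growing.h>
#include <cmath>
#include <vector>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

// Cluster light spots lying on smooth (locally planar) surfaces.
std::vector<pcl::PointIndices> clusterPlanes(const Cloud::Ptr& cloud) {
    pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
    pcl::PointCloud<pcl::Normal>::Ptr normals(new pcl::PointCloud<pcl::Normal>);

    pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
    ne.setInputCloud(cloud);
    ne.setSearchMethod(tree);
    ne.setKSearch(30);  // neighbors used for normal estimation (assumed)
    ne.compute(*normals);

    pcl::RegionGrowing<pcl::PointXYZ, pcl::Normal> reg;
    reg.setInputCloud(cloud);
    reg.setInputNormals(normals);
    reg.setSearchMethod(tree);
    reg.setSmoothnessThreshold(3.0f / 180.0f * static_cast<float>(M_PI)); // smoothness constraint
    reg.setCurvatureThreshold(1.0f);

    std::vector<pcl::PointIndices> clusters;
    reg.extract(clusters);
    // The cluster whose bounding size matches the calibration plate's known
    // length and width would then be selected as the target point cluster.
    return clusters;
}
```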
S602, performing straight line fitting on the light spots at the boundary in the target point cluster to obtain a plurality of fitting straight lines.
Prior to S602, the target point cluster may be projected from three-dimensional space onto a two-dimensional plane; that is, only the height distance and the horizontal distance between the light spots in the target point cluster and the radar are considered, while the vertical distance (i.e., the depth distance) between the light spots and the radar is ignored. When S602 is executed, a point cloud random sample consensus segmentation algorithm is then used to perform straight-line fitting on the light spots at the boundary of the target point cluster on the two-dimensional plane, obtaining a plurality of fitted straight lines.
According to the embodiment of the invention, the target point cluster is projected to the two-dimensional plane, so that the linear fitting is more convenient to perform in the two-dimensional plane compared with the three-dimensional plane, and the calculation amount of the linear fitting is reduced.
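A minimal sketch of the boundary fitting using PCL's RANSAC segmentation with a line model; fitting the four edges by repeatedly removing inliers is an assumed workflow, and the distance threshold is illustrative.

```cpp
#include <pcl/ModelCoefficients.h>
#include <pcl/filters/extract_indices.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/sample_consensus/method_types.h>
#include <pcl/sample_consensus/model_types.h>
#include <pcl/segmentation/sac_segmentation.h>
#include <vector>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

// Fit up to four boundary lines with RANSAC, removing each line's inliers
// before fitting the next edge.
std::vector<pcl::ModelCoefficients> fitBoundaryLines(Cloud::Ptr edgePoints) {
    std::vector<pcl::ModelCoefficients> lines;
    pcl::SACSegmentation<pcl::PointXYZ> seg;
    seg.setModelType(pcl::SACMODEL_LINE);
    seg.setMethodType(pcl::SAC_RANSAC);
    seg.setDistanceThreshold(0.02);  // inlier distance in meters (assumed)

    for (int i = 0; i < 4 && edgePoints->size() > 2; ++i) {
        pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
        pcl::ModelCoefficients coeffs;
        seg.setInputCloud(edgePoints);
        seg.segment(*inliers, coeffs);
        if (inliers->indices.empty()) break;
        lines.push_back(coeffs);

        // Remove this line's inliers so the next iteration fits another edge.
        pcl::ExtractIndices<pcl::PointXYZ> extract;
        extract.setInputCloud(edgePoints);
        extract.setIndices(inliers);
        extract.setNegative(true);
        Cloud::Ptr rest(new Cloud);
        extract.filter(*rest);
        edgePoints = rest;
    }
    return lines;
}
```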
And S603, taking the intersection point between the fitting straight lines as a first calibration point.
For example, the two line equations are shown in formula (2):

a₁·x + b₁·y + c₁ = 0
a₂·x + b₂·y + c₂ = 0        (2)

where a₁, b₁, c₁, a₂, b₂ and c₂ are all constants. The intersection point of the two lines can be obtained by solving the system of linear equations in two variables shown in formula (2).
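A minimal sketch of solving the system in formula (2) for a line intersection by Cramer's rule; names are illustrative.

```cpp
#include <cmath>
#include <optional>
#include <utility>

// Solve a1*x + b1*y + c1 = 0 and a2*x + b2*y + c2 = 0 for (x, y).
std::optional<std::pair<double, double>> intersect(double a1, double b1, double c1,
                                                   double a2, double b2, double c2) {
    const double det = a1 * b2 - a2 * b1;
    if (std::fabs(det) < 1e-9) return std::nullopt;  // (nearly) parallel lines
    return std::make_pair((b1 * c2 - b2 * c1) / det,
                          (a2 * c1 - a1 * c2) / det);
}
```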
Illustratively, as shown in fig. 7, the black dots in fig. 7 are light points in the point cloud data associated with the calibration plate, the four straight lines are L1, L2, L3, and L4, and the four intersection points of the four straight lines (the intersection points in the circle of the dotted line in fig. 7) are the first calibration points.
Because point cloud data cannot capture the texture and color information of an object but can capture its position information, the first calibration point is determined based on the positions of the light spots related to the calibration plate, which makes the extraction of the first calibration point more accurate.
After S105, the embodiment of the present invention may further perform automatic quality inspection on the currently calculated coordinate transformation relationship, and if the quality inspection fails, the coordinate transformation relationship may also be updated, where the process may be implemented as:
and aiming at each frame of point cloud data acquired by a radar, determining an error between a first calibration point in the point cloud data and a second calibration point in target image data corresponding to the point cloud data according to a coordinate conversion relation. And judging whether the error corresponding to the cloud data of each point meets the error limiting condition. And if the error corresponding to each point of cloud data does not meet the error limiting condition, returning to the step S101 until the error corresponding to each point of cloud data meets the error limiting condition.
In one embodiment, each first calibration point may be projected into the target image data according to the coordinate conversion relation to obtain a light projection point, and the coordinate difference between the light projection point and the corresponding second calibration point is then calculated. When the sum of the calculated differences is smaller than a preset threshold, or when the sum of the squares of the calculated differences is smaller than a preset threshold, it is determined that the error corresponding to each frame of point cloud data meets the error limiting condition.
In the embodiment of the present invention, if the coordinate transformation relationship cannot be calculated in S105, that is, an error between the first calibration point and the corresponding second calibration point is large, which results in a calibration calculation failure, the process may return to S101, that is, the calibration plate is reset, the point cloud data and the image data are obtained, and the calibration calculation is performed again.
The embodiment of the invention can recalculate the coordinate conversion relationship when the calculated coordinate conversion relationship has larger error, thereby improving the accuracy of determining the coordinate conversion relationship between the coordinate system of the camera and the coordinate system of the radar.
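A minimal sketch of this quality check, reprojecting the first calibration points with opencv's cv::projectPoints() and testing a squared-pixel-error sum; the threshold value is an illustrative assumption.

```cpp
#include <opencv2/calib3d.hpp>
#include <vector>

// Returns true when one frame's reprojection error satisfies the
// error-limiting condition.
bool checkFrameError(const std::vector<cv::Point3f>& firstPoints,   // radar frame
                     const std::vector<cv::Point2f>& secondPoints,  // image pixels
                     const cv::Mat& rvec, const cv::Mat& tvec,
                     const cv::Mat& K, const cv::Mat& distCoeffs,
                     double thresholdPerPoint = 4.0) {              // assumed, px^2
    std::vector<cv::Point2f> projected;
    cv::projectPoints(firstPoints, rvec, tvec, K, distCoeffs, projected);
    double sumSq = 0.0;
    for (size_t i = 0; i < projected.size(); ++i) {
        const cv::Point2f d = projected[i] - secondPoints[i];
        sumSq += d.x * d.x + d.y * d.y;  // squared pixel distance per point pair
    }
    return sumSq < thresholdPerPoint * projected.size();
}
```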
After the coordinate conversion relation is successfully obtained in S105, the embodiment of the present invention may further display the calibration result for manual quality inspection, including: selecting a preset number of frames of point cloud data from the point cloud data collected by the radar; then, for each selected frame of point cloud data, determining the light projection points of each light spot in the point cloud data on the target image data corresponding to the point cloud data according to the coordinate conversion relation, and displaying each light projection point on the corresponding target image data in a visualization module.
Optionally, the manner of selecting the point cloud data may be selected randomly, may also be selected in sequence, and may also be selected at certain intervals, which is not specifically limited in this embodiment of the present invention.
The visualization module in the embodiment of the invention may be a vehicle-mounted display screen controlled by the vehicle-mounted industrial personal computer, which projects the point cloud data onto the target image data through the cv::projectPoints() function in the opencv library. The vehicle-mounted industrial personal computer then sends the target image data containing the light projection points to the vehicle-mounted display screen, so that the screen displays the target image data containing the light projection points. An inspector can view the error between the light projection points and the pixel points in the image data on the vehicle-mounted display screen. If the error is large, the inspector can control the vehicle-mounted industrial personal computer to return to S101, i.e., reset the calibration plate and obtain point cloud data and image data again, so as to re-determine the coordinate conversion relation.
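A minimal sketch of preparing the visualized image: the light spots are projected with cv::projectPoints() and drawn onto the matched target image; the drawing style is an illustrative choice.

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Overlay the projected light points on the target image for manual inspection.
cv::Mat overlayProjection(const cv::Mat& image,
                          const std::vector<cv::Point3f>& cloudPoints,
                          const cv::Mat& rvec, const cv::Mat& tvec,
                          const cv::Mat& K, const cv::Mat& distCoeffs) {
    std::vector<cv::Point2f> projected;
    cv::projectPoints(cloudPoints, rvec, tvec, K, distCoeffs, projected);
    cv::Mat canvas = image.clone();
    for (const auto& p : projected)
        cv::circle(canvas, p, 2, cv::Scalar(0, 0, 255), -1);  // red dot per light point
    return canvas;  // sent to the vehicle-mounted display screen
}
```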
The calibration process of the embodiment of the invention for the radar and the camera requires little manual intervention and is not limited by the site, so efficient, stable, and automatic calibration can be achieved.
Compared with the approach of first acquiring point cloud data and image data offline and then calibrating the radar and the camera, the method and the device can calibrate the radar and the camera in real time while the point cloud data and the image data are being acquired, thereby improving calibration efficiency.
If only one radar and one camera are calibrated at a time, calibration efficiency is low. The embodiment of the invention can calibrate one radar against multiple cameras, or one camera against multiple radars. Calibration of one radar with multiple cameras is described below; calibration of one camera with multiple radars proceeds in a similar manner.
In the embodiment of the present invention, the vehicle is equipped with a plurality of cameras, and each camera acquires image data while the vehicle drives toward the calibration board during the execution of the above S102.
When S103 is executed, for each frame of point cloud data, target image data matching the point cloud data is determined from the image data acquired by each camera. Each frame of point cloud data thus matches multiple frames of target image data, collected by different cameras.
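A minimal sketch of this per-camera timestamp matching (the nearest-timestamp criterion of claim 2) might look as follows; the microsecond timestamps and all names are illustrative assumptions:

```cpp
#include <cstdint>
#include <cstdlib>
#include <optional>
#include <vector>

struct Frame { int64_t stampUs; int index; };    // one frame with its timestamp (microseconds assumed)

// For one frame of point cloud data and one camera's image stream, returns
// the index of the image whose timestamp differs least from the point cloud
// timestamp, accepted only if the difference is below the preset value.
std::optional<int> matchTargetImage(int64_t cloudStampUs,
                                    const std::vector<Frame>& images,
                                    int64_t presetDiffUs) {
    std::optional<int> best;
    int64_t bestDiff = presetDiffUs;             // anything not below this is rejected
    for (const auto& img : images) {
        const int64_t diff = std::llabs(img.stampUs - cloudStampUs);
        if (diff < bestDiff) { bestDiff = diff; best = img.index; }
    }
    return best;                                 // std::nullopt when no image is close enough
}
```

With multiple cameras, this would simply be run once per camera's image stream, yielding one frame of target image data per camera for each point cloud frame.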
When the above S104 is executed, for each frame of point cloud data, a calibration point group is determined for each frame of target image data matched with the point cloud data, yielding a plurality of calibration point groups for the point cloud data. Each calibration point group corresponding to one frame of point cloud data comprises a first calibration point and a second calibration point belonging to the same frame of target image data.
For example, if the vehicle is mounted with a camera A and a camera B, point cloud data 1 may match target image data 1 among the image data collected by camera A, and target image data 2 among the image data collected by camera B. The first calibration point in point cloud data 1 and the second calibration point in target image data 1 form one calibration point group; the first calibration point in point cloud data 1 and the second calibration point in target image data 2 form another.
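One plausible way to hold these groups, sketched under the assumption of one storage bucket per camera (the struct and field names are invented for illustration):

```cpp
#include <opencv2/core.hpp>
#include <vector>

// One calibration point group: the first calibration point from a frame of
// point cloud data and the second calibration point from one matched frame
// of target image data.
struct CalibrationPointGroup {
    cv::Point3f firstPoint;   // light spot at the designated position, radar coordinate system
    cv::Point2f secondPoint;  // pixel point at the designated position, image coordinates
};

// Groups bucketed per camera: in the example above, camera A's and camera B's
// buckets would each receive one group derived from point cloud data 1.
std::vector<std::vector<CalibrationPointGroup>> groupsPerCamera;
```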
Alternatively, referring to fig. 8, the above S104 may be implemented as follows: the main thread of the vehicle-mounted industrial personal computer puts each pair of matched point cloud data and target image data into a work queue. Based on the working states of a plurality of parallel worker threads, each worker thread acquires a pair of matched point cloud data and target image data from the work queue in the order in which the pairs were placed in the queue, extracts a first calibration point from the acquired point cloud data, and extracts a second calibration point from the acquired target image data.
In the embodiment of the invention, the working state of a worker thread represents its load. Optionally, a worker thread in the idle state preferentially obtains matched point cloud data and target image data from the work queue, one pair at a time.
In the embodiment of the present invention, the method for extracting the first calibration point from the point cloud data and the method for extracting the second calibration point from the target image data are as described above, and the details are not repeated here.
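The main-thread/worker-thread arrangement could be sketched in C++ as below; the queue protocol and all names are assumptions made for the sketch, and the extraction calls are stand-ins for the methods described earlier:

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <utility>

struct MatchedPair { /* one frame of point cloud data + its matched target image data */ };

std::queue<MatchedPair> workQueue;   // pairs kept in the order the main thread enqueued them
std::mutex queueMutex;
std::condition_variable queueCv;
bool producingDone = false;

// Main thread: put each pair of matched point cloud data and target image
// data into the work queue as soon as the match is made.
void enqueue(MatchedPair pair) {
    { std::lock_guard<std::mutex> lk(queueMutex); workQueue.push(std::move(pair)); }
    queueCv.notify_one();
}

// Main thread: signal that no further pairs will arrive.
void finishProducing() {
    { std::lock_guard<std::mutex> lk(queueMutex); producingDone = true; }
    queueCv.notify_all();
}

// Worker thread: an idle thread takes the pair at the front of the queue and
// extracts the first calibration point from the point cloud data and the
// second calibration point from the target image data.
void worker() {
    for (;;) {
        MatchedPair pair;
        {
            std::unique_lock<std::mutex> lk(queueMutex);
            queueCv.wait(lk, [] { return !workQueue.empty() || producingDone; });
            if (workQueue.empty()) return;        // producer finished, queue drained
            pair = std::move(workQueue.front());
            workQueue.pop();
        }
        // extractFirstCalibrationPoint(pair); extractSecondCalibrationPoint(pair);
        // (hypothetical names for the extraction methods described above)
    }
}
```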
When S105 is executed, the coordinate conversion relationships between the coordinate system of the radar and the coordinate systems of the respective cameras are determined in parallel by multithreading, according to the calibration point groups between each frame of point cloud data and the target image data acquired by each camera.
In one embodiment, after the worker threads extract the calibration points in S104, each second calibration point extracted from target image data collected by the same camera, together with its corresponding first calibration point, is stored in the same storage area of the memory. When S105 is executed, the parallel worker threads acquire calibration point groups from the different storage areas and perform the calibration calculation; that is, each worker thread obtains the coordinate conversion relationship between the coordinate system of the radar and the coordinate system of one camera.
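A sketch of the parallel per-camera solve is given below. The patent does not name the solver; cv::solvePnP is used here purely as an assumed stand-in for the calibration calculation, and each camera is given its own intrinsics:

```cpp
#include <future>
#include <opencv2/calib3d.hpp>
#include <vector>

struct Extrinsics { cv::Mat rvec, tvec; };   // radar-to-camera coordinate conversion relationship

// Solve one camera's transform from the calibration point groups stored in
// that camera's storage area (3D first calibration points, 2D second ones).
Extrinsics solveForCamera(const std::vector<cv::Point3f>& firstPoints,
                          const std::vector<cv::Point2f>& secondPoints,
                          const cv::Mat& K, const cv::Mat& dist) {
    Extrinsics e;
    cv::solvePnP(firstPoints, secondPoints, K, dist, e.rvec, e.tvec);  // assumed solver
    return e;
}

// Launch one task per camera so the transforms are computed in parallel,
// mirroring the one-thread-per-camera calibration described above.
std::vector<Extrinsics> solveAll(const std::vector<std::vector<cv::Point3f>>& firstPerCam,
                                 const std::vector<std::vector<cv::Point2f>>& secondPerCam,
                                 const std::vector<cv::Mat>& K,
                                 const std::vector<cv::Mat>& dist) {
    std::vector<std::future<Extrinsics>> tasks;
    for (size_t c = 0; c < firstPerCam.size(); ++c)
        tasks.push_back(std::async(std::launch::async, solveForCamera,
                                   std::cref(firstPerCam[c]), std::cref(secondPerCam[c]),
                                   std::cref(K[c]), std::cref(dist[c])));
    std::vector<Extrinsics> results;
    for (auto& t : tasks) results.push_back(t.get());
    return results;
}
```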
In this way, calibration points can be extracted and the calibration solved in real time while the point cloud data and the image data are being acquired, reducing the time consumed by calibration. Moreover, the embodiment of the invention can calibrate multiple groups of devices simultaneously, further improving calibration efficiency.
The overall process of the embodiment of the present invention is described below with a specific example:
the vehicle roof in this example carries one radar and three cameras. Referring to fig. 9, 1 is the left camera, 2 the middle camera, 3 the right camera, 5 the camera mounting table, 4 the radar, and 6 the radar mounting table. The Field of View (FOV) of the left and right cameras is 60°, and the FOV of the middle camera is 120°. The vehicle-mounted industrial personal computer is communicatively connected with a tablet (pad) or the in-vehicle head unit; a designated application (APP) is installed on the pad or head unit, which receives control instructions sent by the user through the APP and then controls the industrial personal computer to perform the calibration.
Further, the vehicle-mounted industrial personal computer may be integrated into the driving computer or the head unit.
The calibration process for the radar and 3 cameras is shown in fig. 10:
s1001, a calibration plate is set in the common visual area of the vehicle-mounted radar and camera. The vehicle-mounted industrial personal computer controls the in-vehicle visualization module to start and to display the point cloud data scanned by the radar and the image data shot by the camera. If the calibration plate is not completely within the common visual area of the radar and the camera, the position of the calibration plate is adjusted until it is.
Here, the calibration board being completely in the common visual area of the radar and the camera means that the calibration board is completely within the field of view of the radar and completely within the field of view of the camera.
S1002, the pad or head unit receives a calibration start instruction sent by the user and notifies the vehicle-mounted industrial personal computer to start calibration.
S1003, the vehicle-mounted industrial personal computer controls the vehicle to drive at a constant speed toward the calibration board, notifies the radar driver module to control the vehicle-mounted radar to collect multiple frames of point cloud data, and notifies the camera driver module to control the three vehicle-mounted cameras to each collect multiple frames of image data.
For example, the vehicle travels toward the calibration plate at a constant speed of 10 km/h.
S1004, for each frame of point cloud data, the vehicle-mounted industrial personal computer determines the target image data matching the point cloud data from the image data acquired by each of the three cameras, obtaining three frames of target image data matched with the point cloud data.
S1005, for each pair of matched point cloud data and target image data, the vehicle-mounted industrial personal computer extracts a first calibration point and a second calibration point respectively, obtaining three calibration point groups for each frame of point cloud data.
Among the three calibration point groups corresponding to each frame of point cloud data: one group comprises the first calibration point in the point cloud data and the second calibration point in target image data 1, matched with the point cloud data among the image data collected by the left camera; one group comprises the first calibration point and the second calibration point in target image data 2, matched among the image data collected by the middle camera; and one group comprises the first calibration point and the second calibration point in target image data 3, matched among the image data collected by the right camera.
S1006, the vehicle-mounted industrial personal computer determines the coordinate conversion relationships between the coordinate system of the radar and the coordinate systems of the three cameras respectively, according to the three calibration point groups corresponding to each frame of point cloud data.
S1007, the vehicle-mounted industrial personal computer performs quality inspection on the three coordinate conversion relationships and determines from the result whether the quality inspection passes. If it fails, the process returns to S1001; if it passes, S1008 is executed.
The quality inspection in S1007 on the three coordinate conversion relationships may be manual or automatic; both processes are as described above and are not repeated here.
S1008, the vehicle-mounted industrial personal computer obtains the coordinate conversion relationships between the coordinate system of the radar and the coordinate systems of the three cameras respectively.
The embodiment of the invention also provides a vehicle-mounted industrial personal computer. As shown in fig. 11, it comprises a processor 1101, a communication interface 1102, a memory 1103 and a communication bus 1104, wherein the processor 1101, the communication interface 1102 and the memory 1103 communicate with each other through the communication bus 1104;
a memory 1103 for storing a computer program;
the processor 1101 is configured to implement the method steps in the above-described method embodiments when executing the program stored in the memory 1103.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In another embodiment of the present invention, a computer-readable storage medium is further provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any of the above radar and camera calibration methods.
In a further embodiment of the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the above described radar and camera calibration methods.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between the entities or actions. Also, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the embodiment of the electronic device, since it is substantially similar to the embodiment of the method, the description is simple, and for the relevant points, reference may be made to part of the description of the embodiment of the method.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (14)

1. A calibration method for radar and a camera is characterized by comprising the following steps:
setting a calibration plate in a common visual area of a vehicle-mounted radar and a camera;
in the process that the vehicle drives to the calibration plate, a vehicle-mounted radar collects multi-frame point cloud data, and a vehicle-mounted camera collects multi-frame image data;
determining target image data matched with each frame of point cloud data according to the time stamp of each frame of point cloud data in the multi-frame point cloud data and the time stamp of each frame of image data in the multi-frame image data;
aiming at each frame of point cloud data, determining a first calibration point in the point cloud data, and determining a second calibration point in target image data matched with the point cloud data to obtain a calibration point group corresponding to the point cloud data; the first calibration point is a light spot corresponding to the specified position of the calibration plate in the point cloud data, and the second calibration point is a pixel point corresponding to the specified position of the calibration plate in the target image data;
and determining a coordinate conversion relation between the coordinate system of the radar and the coordinate system of the camera according to the calibration point group corresponding to each point cloud data.
2. The method of claim 1, wherein determining the target image data matching each frame of point cloud data according to the timestamp of each frame of point cloud data in the plurality of frames of point cloud data and the timestamp of each frame of image data in the plurality of frames of image data comprises:
performing the following operations for each frame of point cloud data in the plurality of frames of point cloud data:
searching the multi-frame image data for the image data whose time stamp has the minimum difference from the time stamp of the point cloud data;
and if the difference value between the time stamp of the point cloud data and the time stamp of the searched image data is smaller than a preset difference value, using the searched image data as target image data matched with the point cloud data.
3. The method of claim 1, wherein prior to said determining the first index point in the point cloud data, the method further comprises:
acquiring motion information corresponding to the point cloud data, wherein the motion information comprises: the displacement and the speed of the vehicle during the process in which the radar collects the point cloud data;
according to the motion information, distortion compensation is carried out on the point cloud data to obtain the coordinate of each light spot in the point cloud data under a specified coordinate system, wherein the specified coordinate system is a radar coordinate system where the first collected light spot in the point cloud data is located;
and for each light spot, replacing the original coordinates of the light spot by the coordinates of the light spot in the specified coordinate system.
4. The method of claim 3, wherein performing distortion compensation on the point cloud data according to the motion information to obtain coordinates of each light point in the point cloud data in a specified coordinate system comprises:
obtaining, based on the radar beam to which each light spot belongs, the relative scanning time of each light spot during the radar's scan of the point cloud data;
interpolating between the acquired motion information and the adjacent motion information to obtain the displacement and the speed at each moment in the time period between the moment corresponding to the acquired motion information and the moment corresponding to the adjacent motion information;
aiming at each light spot included in the point cloud data, determining a compensation transformation matrix of a coordinate system where the light spot is located relative to the specified coordinate system according to the relative scanning time of the light spot in the process of scanning the point cloud data by the radar and the displacement and the speed corresponding to the relative scanning time;
and multiplying the coordinates of the light spot by the determined compensation transformation matrix to obtain the coordinates of the light spot in the specified coordinate system.
5. The method of claim 1, wherein said calibration board is a checkerboard calibration board, said designated location being a vertex of said checkerboard calibration board;
the determining a second index point in the target image data matching the point cloud data includes:
extracting coordinates of designated corner point pairs in the checkerboard pattern included in the target image data, wherein each corner point pair is collinear with one vertex of the checkerboard calibration plate;
and determining, as second calibration points according to the coordinates of each designated corner point pair, the pixel points of the vertexes of the target image data that are collinear with the designated corner point pairs.
6. The method according to claim 5, wherein determining, as the second calibration point according to the coordinates of each designated corner point pair, a pixel point of a vertex collinear with the designated corner point pair comprises:
for each designated corner point pair, determining, according to its coordinates, the distance between the two corner points of the pair and the angle between the line connecting the two corner points and the bottom edge of the checkerboard pattern;
and determining, as the second calibration point, the pixel point of the vertex collinear with the designated corner point pair, according to the distance between the two corner points of the pair and the corresponding angle.
7. The method of any of claims 1-6, wherein the calibration plate is rectangular and the designated location of the calibration plate is the apex of the calibration plate;
the determining a first index point in the point cloud data includes:
identifying a target point cluster consisting of light spots related to the calibration plate in the point cloud data;
performing linear fitting on the light spots at the boundary in the target point cluster to obtain a plurality of fitting straight lines;
and taking the intersection point between the fitting straight lines as the first calibration point.
8. The method of claim 7, wherein identifying a cluster of target points in the point cloud data that is comprised of light points associated with the calibration plate comprises:
identifying a point cluster consisting of light spots belonging to the same plane in the point cloud data;
and according to the size of the plane corresponding to each point cluster, taking the point cluster whose plane size meets the size condition of the calibration plate as the target point cluster.
9. The method of claim 7, wherein prior to said identifying a cluster of target points in the point cloud data comprised of light points associated with the calibration plate, the method further comprises:
filtering the light spots in the point cloud data according to the position of each light spot in the point cloud data to obtain the light spots of which the positions meet the filtering condition;
wherein the filtering conditions include any one or more of the following: the horizontal distance between the position corresponding to the light spot and the radar is not more than a preset horizontal distance; the vertical distance between the position corresponding to the light spot and the radar is not more than a preset vertical distance; the height distance between the position corresponding to the light spot and the radar is not more than a preset height distance; and the angle between the position at which the radar scans the light spot and the initial scanning position of the radar falls within a preset angle range.
10. The method according to claim 1, wherein after determining the coordinate transformation relationship between the coordinate system of the radar and the coordinate system of the camera according to the calibration point group corresponding to each point cloud data, the method further comprises:
aiming at each frame of point cloud data collected by the radar, determining an error between a first calibration point in the point cloud data and a second calibration point in target image data corresponding to the point cloud data according to the coordinate conversion relation;
judging whether the error corresponding to each frame of point cloud data meets an error limiting condition;
and if the error corresponding to each frame of point cloud data does not meet the error limiting condition, returning to the step of setting the calibration plate in the common visual area of the vehicle-mounted radar and camera, until the error corresponding to each frame of point cloud data meets the error limiting condition.
11. The method according to claim 1, wherein after determining the coordinate transformation relationship between the coordinate system of the radar and the coordinate system of the camera according to the calibration point group corresponding to each point cloud data, the method further comprises:
selecting a preset number of frames of point cloud data from the point cloud data collected by the radar;
and for each selected frame of point cloud data, determining the projection point of each light spot in the point cloud data on the target image data corresponding to the point cloud data according to the coordinate conversion relationship, and displaying each projection point on the target image data corresponding to the point cloud data in a visualization module.
12. The method of claim 1, wherein determining a first index point in each frame of point cloud data and determining a second index point in target image data that matches the point cloud data comprises:
the main thread puts each pair of matched point cloud data and target image data into a work queue;
based on the respective working states of a plurality of parallel processing working threads, each working thread acquires matched point cloud data and target image data from the working queue according to the sequential arrangement of the matched point cloud data and the target image data in the working queue, extracts a first calibration point from the acquired point cloud data, and extracts a second calibration point from the acquired target image data.
13. An electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1-12 when executing a program stored in the memory.
14. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of the claims 1-12.
CN202110722650.7A 2021-06-29 2021-06-29 Calibration method of radar and camera, electronic device and storage medium Pending CN113256740A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110722650.7A CN113256740A (en) 2021-06-29 2021-06-29 Calibration method of radar and camera, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110722650.7A CN113256740A (en) 2021-06-29 2021-06-29 Calibration method of radar and camera, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN113256740A true CN113256740A (en) 2021-08-13

Family

ID=77190084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110722650.7A Pending CN113256740A (en) 2021-06-29 2021-06-29 Calibration method of radar and camera, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113256740A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109949371A (en) * 2019-03-18 2019-06-28 北京智行者科技有限公司 A kind of scaling method for laser radar and camera data
CN109975792A (en) * 2019-04-24 2019-07-05 福州大学 Method based on Multi-sensor Fusion correction multi-line laser radar point cloud motion distortion
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN111427020A (en) * 2020-06-11 2020-07-17 交通运输部公路科学研究所 Combined calibration method, device and system for environmental information data acquisition equipment
CN111640158A (en) * 2020-06-11 2020-09-08 武汉斌果科技有限公司 End-to-end camera based on corresponding mask and laser radar external reference calibration method
CN112230240A (en) * 2020-09-30 2021-01-15 深兰人工智能(深圳)有限公司 Space-time synchronization system, device and readable medium for laser radar and camera data
CN112669393A (en) * 2020-12-31 2021-04-16 中国矿业大学 Laser radar and camera combined calibration method

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113792797A (en) * 2021-09-16 2021-12-14 智道网联科技(北京)有限公司 Point cloud data screening method and storage medium
CN113792797B (en) * 2021-09-16 2024-04-26 智道网联科技(北京)有限公司 Point cloud data screening method and storage medium
CN116071431A (en) * 2021-11-03 2023-05-05 北京三快在线科技有限公司 Calibration method and device, storage medium and electronic equipment
CN114333418A (en) * 2021-12-30 2022-04-12 深兰人工智能(深圳)有限公司 Data processing method for automatic driving and related device
CN114333418B (en) * 2021-12-30 2022-11-01 深兰人工智能(深圳)有限公司 Data processing method for automatic driving and related device
WO2023193690A1 (en) * 2022-04-06 2023-10-12 华为技术有限公司 Sensor calibration method and apparatus
WO2023213083A1 (en) * 2022-05-05 2023-11-09 北京京东乾石科技有限公司 Object detection method and apparatus and driverless car
CN115359130A (en) * 2022-10-19 2022-11-18 北京格镭信息科技有限公司 Radar and camera combined calibration method and device, electronic equipment and storage medium
CN115909272A (en) * 2022-11-09 2023-04-04 杭州枕石智能科技有限公司 Method for acquiring obstacle position information, terminal device and computer medium

Similar Documents

Publication Publication Date Title
CN113256740A (en) Calibration method of radar and camera, electronic device and storage medium
US11731762B2 (en) Crisscross boustrophedonic flight patterns for UAV scanning and imaging
CN111179358B (en) Calibration method, device, equipment and storage medium
Fawzy 3D laser scanning and close-range photogrammetry for buildings documentation: A hybrid technique towards a better accuracy
US20180247121A1 (en) Systems and methods for surface and subsurface damage assessments, patch scans, and visualization
US9207069B2 (en) Device for generating a three-dimensional model based on point cloud data
JP2001524228A (en) Machine vision calibration target and method for determining position and orientation of target in image
US20170140537A1 (en) System and method for scoring clutter for use in 3d point cloud matching in a vision system
CN111080662A (en) Lane line extraction method and device and computer equipment
CN111192331A (en) External parameter calibration method and device for laser radar and camera
CN112967344A (en) Method, apparatus, storage medium, and program product for camera external reference calibration
CN112036359B (en) Method for obtaining topological information of lane line, electronic device and storage medium
CN113945937A (en) Precision detection method, device and storage medium
US11544839B2 (en) System, apparatus and method for facilitating inspection of a target object
CN111709995A (en) Position calibration method between laser radar and camera
Li et al. Triangulation-based edge measurement using polyview optics
CN116386373A (en) Vehicle positioning method and device, storage medium and electronic equipment
CN116203976A (en) Indoor inspection method and device for transformer substation, unmanned aerial vehicle and storage medium
Laureshyn et al. Automated video analysis as a tool for analysing road user behaviour
JP7363545B2 (en) Calibration judgment result presentation device, calibration judgment result presentation method and program
CN113593026A (en) Lane line marking auxiliary map generation method and device and computer equipment
WO2021111613A1 (en) Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program
JP2018041169A (en) Information processing device and control method and program thereof
JP2961140B2 (en) Image processing method
CN115909183B (en) Monitoring system and monitoring method for external environment of fuel gas delivery

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220328

Address after: 430051 No. b1336, chuanggu startup area, taizihu cultural Digital Creative Industry Park, No. 18, Shenlong Avenue, Wuhan Economic and Technological Development Zone, Hubei Province

Applicant after: Yikatong (Hubei) Technology Co.,Ltd.

Address before: 430056 building B (qdxx-f7b), No.7 building, qiedixiexin science and Technology Innovation Park, South taizihu innovation Valley, Wuhan Economic and Technological Development Zone, Hubei Province

Applicant before: HUBEI ECARX TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right