CN114966578A - Radar external parameter calibration method and device based on shooting equipment and computer equipment


Info

Publication number: CN114966578A
Application number: CN202210559839.3A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 肖梓栋
Assignee: DeepRoute AI Ltd
Legal status: Pending
Application filed by DeepRoute AI Ltd; priority to CN202210559839.3A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to group G01S13/00
    • G01S7/40 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration


Abstract

The application relates to a radar external parameter calibration method and apparatus based on a shooting device, together with a computer device, storage medium and computer program product. The method comprises the following steps: acquiring a captured image of a target shooting device and point cloud data of radars to be calibrated, wherein the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar; performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud; determining a target object image position based on the captured image of the target shooting device; and performing external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud, to obtain a radar external parameter calibration result. With this method, radar external parameter calibration can be achieved even between radars whose fields of view do not overlap.

Description

Radar external parameter calibration method and device based on shooting equipment and computer equipment
Technical Field
The application relates to the technical field of intelligent driving, and in particular to a radar external parameter calibration method and apparatus based on a shooting device, and a computer device.
Background
Radar external parameter calibration refers to calibrating the external parameters (extrinsics) of the radars installed on a vehicle, and is a very important step in the field of intelligent driving: localization and perception depend on accurate external parameters. The external parameters describe the relative pose between radars, including translation and rotation.
In the conventional approach, one of the radars on the vehicle is selected as the primary radar, and the other radars are calibrated against it through their field-of-view overlap with the primary radar. However, a solid-state radar has a relatively small field of view and may share no field-of-view overlap with the other radars, so calibration based on field-of-view overlap becomes difficult: radars without overlapping fields of view cannot be calibrated by the conventional method.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a radar external parameter calibration method and apparatus based on a shooting device, together with a computer-readable storage medium and computer program product, capable of implementing radar external parameter calibration.
In a first aspect, the application provides a radar external parameter calibration method based on a shooting device. The method comprises the following steps:
acquiring a captured image of a target shooting device and point cloud data of radars to be calibrated, wherein the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar;
performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
determining a target object image position based on the captured image of the target shooting device; and
performing external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud, to obtain a radar external parameter calibration result.
In one embodiment, the target shooting device is a single shooting device, and the single shooting device has a field-of-view overlap region with both the first radar and the second radar;
performing external parameter calibration on the first radar and the second radar based on the target object image position, the first target object point cloud and the second target object point cloud to obtain a radar external parameter calibration result comprises:
calibrating the single shooting device and the first radar based on the target object image position and the first target object point cloud, to obtain a first external parameter calibration result;
calibrating the single shooting device and the second radar based on the target object image position and the second target object point cloud, to obtain a second external parameter calibration result; and
obtaining the radar external parameter calibration result based on the first external parameter calibration result and the second external parameter calibration result.
In one embodiment, calibrating the single shooting device and the first radar based on the target object image position and the first target object point cloud to obtain a first external parameter calibration result comprises:
projecting the first target object point cloud into the coordinate system of the captured image based on an obtained external parameter transformation relation, to obtain a point cloud image position matched with the first target object point cloud, wherein the external parameter transformation relation is that between the single shooting device and the first radar; and
calibrating the single shooting device and the first radar according to the point cloud image position and the target object image position, to obtain the first external parameter calibration result.
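The projection step above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the function name, the pinhole intrinsic matrix K and the example numbers are hypothetical, and the external parameter transformation relation is taken as a 4 x 4 homogeneous matrix mapping radar-frame points into the camera frame.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project an N x 3 radar point cloud into pixel coordinates.

    T_cam_lidar: 4 x 4 extrinsic transform (radar frame -> camera frame).
    K: 3 x 3 pinhole intrinsic matrix. Points behind the camera are dropped.
    """
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])   # N x 4 homogeneous
    cam = (T_cam_lidar @ homo.T).T[:, :3]               # N x 3 in camera frame
    cam = cam[cam[:, 2] > 0]                            # keep points in front
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3]                     # perspective divide

# Toy check: with identity extrinsics, a point on the optical axis lands
# at the principal point (cx, cy).
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
uv = project_points(np.array([[0.0, 0.0, 5.0]]), np.eye(4), K)
```

Comparing the projected point cloud image position against the detected target object image position then gives the residual that the calibration minimizes.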
In one embodiment, the target shooting device comprises a first shooting device having a field-of-view overlap region with the first radar, a second shooting device having a field-of-view overlap region with the second radar, and a third shooting device having a field-of-view overlap region with each of the first shooting device and the second shooting device;
determining the target object image position based on the captured image of the target shooting device comprises:
determining, based on the captured image of each shooting device, the target object image position matched with that shooting device.
In one embodiment, performing external parameter calibration on the first radar and the second radar based on the target object image position, the first target object point cloud and the second target object point cloud to obtain a radar external parameter calibration result comprises:
calibrating the first radar and the first shooting device according to the first target object point cloud and the target object image position matched with the first shooting device, to obtain a third external parameter calibration result;
calibrating the second radar and the second shooting device according to the second target object point cloud and the target object image position matched with the second shooting device, to obtain a fourth external parameter calibration result;
calibrating the first shooting device, the second shooting device and the third shooting device based on the target object image position matched with each shooting device, to obtain a shooting device calibration result; and
obtaining the radar external parameter calibration result according to the third external parameter calibration result, the fourth external parameter calibration result and the shooting device calibration result.
In one embodiment, calibrating the first shooting device, the second shooting device and the third shooting device based on the target object image position matched with each shooting device to obtain the shooting device calibration result comprises:
calibrating the first shooting device and the third shooting device based on the target object image positions matched with the first shooting device and the third shooting device respectively, to obtain a first device calibration result;
calibrating the second shooting device and the third shooting device based on the target object image positions matched with the second shooting device and the third shooting device respectively, to obtain a second device calibration result; and
obtaining the shooting device calibration result according to the first device calibration result and the second device calibration result.
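Combining the third and fourth external parameter calibration results with the shooting device calibration result, as described above, amounts to composing rigid transforms along the path first radar -> first device -> third device -> second device -> second radar. A minimal sketch, assuming (this convention is illustrative, not stated in the patent) that each result is a 4 x 4 homogeneous transform T_a_b mapping points from frame b into frame a:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4 x 4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def chain_radar_extrinsics(T_c1_r1, T_c2_r2, T_c3_c1, T_c3_c2):
    """Chain the per-pair calibration results into the radar-to-radar extrinsic.

    T_c1_r1, T_c2_r2: radar-to-camera results (third and fourth results).
    T_c3_c1, T_c3_c2: camera-to-camera results via the third device.
    The composition follows r1 -> c1 -> c3 -> c2 -> r2, so the return value
    maps radar-1 points into the radar-2 frame even though the two radars
    share no field-of-view overlap.
    """
    return np.linalg.inv(T_c2_r2) @ np.linalg.inv(T_c3_c2) @ T_c3_c1 @ T_c1_r1
```

The third shooting device serves only as a bridge: it never sees a radar, but links the two cameras when their mutual external parameters are unknown.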
In a second aspect, the application further provides a radar external parameter calibration apparatus based on a shooting device. The apparatus comprises:
an acquisition module, used for acquiring a captured image of a target shooting device and point cloud data of radars to be calibrated, wherein the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar;
a target detection module, used for performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and on the point cloud data of the second radar to obtain a second target object point cloud;
a processing module, used for determining a target object image position based on the captured image of the target shooting device; and
a calibration module, used for performing external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud, to obtain a radar external parameter calibration result.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor which, when executing the computer program, implements the following steps:
acquiring a captured image of a target shooting device and point cloud data of radars to be calibrated, wherein the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar;
performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
determining a target object image position based on the captured image of the target shooting device; and
performing external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud, to obtain a radar external parameter calibration result.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the following steps:
acquiring a captured image of a target shooting device and point cloud data of radars to be calibrated, wherein the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar;
performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
determining a target object image position based on the captured image of the target shooting device; and
performing external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud, to obtain a radar external parameter calibration result.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, performs the following steps:
acquiring a captured image of a target shooting device and point cloud data of radars to be calibrated, wherein the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar;
performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
determining a target object image position based on the captured image of the target shooting device; and
performing external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud, to obtain a radar external parameter calibration result.
With the radar external parameter calibration method, apparatus, computer device, storage medium and computer program product based on a shooting device described above, a captured image of the target shooting device and point cloud data of the radars to be calibrated are acquired, and target detection is performed separately on the point cloud data of the first radar and of the second radar, so that a target object is detected in each, yielding a first target object point cloud and a second target object point cloud. The target object image position is determined from the captured image of the target shooting device, i.e. the target object is also detected in the image. The first radar and the second radar can then be externally calibrated, with the target object as the common reference, from the target object image position, the first target object point cloud matched with the first radar and the second target object point cloud matched with the second radar, yielding the radar external parameter calibration result. Throughout the process, radar external parameter calibration is achieved by using a target shooting device whose field of view overlaps that of the radars to be calibrated, and by taking a target object captured at the same moment as the reference.
Drawings
FIG. 1 is a diagram of an application environment of a radar external parameter calibration method based on a shooting device in one embodiment;
FIG. 2 is a schematic flow chart of a radar external parameter calibration method based on a shooting device in one embodiment;
FIG. 3 is a schematic diagram of radars to be calibrated in one embodiment;
FIG. 4 is a schematic diagram of point cloud data of radars to be calibrated in one embodiment;
FIG. 5 is a schematic illustration of field-of-view overlap regions in one embodiment;
FIG. 6 is a schematic diagram of a first target object point cloud projected into the coordinate system of a captured image in one embodiment;
FIG. 7 is a schematic view of field-of-view overlap regions in another embodiment;
FIG. 8 is a schematic flow chart of a radar external parameter calibration method based on a shooting device in another embodiment;
FIG. 9 is a schematic flow chart of a radar external parameter calibration method based on a shooting device in yet another embodiment;
FIG. 10 is a schematic flow chart of a radar external parameter calibration method based on a shooting device in yet another embodiment;
FIG. 11 is a diagram of an application environment of a radar external parameter calibration method based on a shooting device in another embodiment;
FIG. 12 is a block diagram of a radar external parameter calibration apparatus based on a shooting device in one embodiment;
FIG. 13 is a diagram of the internal structure of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The radar external parameter calibration method based on a shooting device provided by the embodiments of the application can be applied in the application environment shown in fig. 1. A server 102 obtains a captured image of a target shooting device 104 and point cloud data of the radars to be calibrated, where the target shooting device 104 has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar 106 and a second radar 108. The server performs target detection separately on the point cloud data of the first radar 106 and of the second radar 108, obtaining a first target object point cloud matched with the first radar 106 and a second target object point cloud matched with the second radar 108, determines the target object image position based on the captured image of the target shooting device 104, and performs external parameter calibration on the first radar 106 and the second radar 108 according to the target object image position, the first target object point cloud and the second target object point cloud, obtaining a radar external parameter calibration result. The server 102 may be implemented as a stand-alone server or as a server cluster composed of a plurality of servers.
In one embodiment, as shown in fig. 2, a radar external parameter calibration method based on a shooting device is provided. Taking its application to the server 102 in fig. 1 as an example, the method includes the following steps:
Step 202, acquiring a captured image of a target shooting device and point cloud data of radars to be calibrated, wherein the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar.
The target shooting device is a shooting device whose field of view overlaps that of the radars to be calibrated; for example, it may be a camera whose field of view overlaps the fields of view of the radars to be calibrated. The captured image is an image, taken by the target shooting device, that contains the target object. The target object is an object that lies simultaneously in the field of view of the target shooting device and of the radars to be calibrated and that can be used for external parameter calibration between camera and radar. For example, the target object may be a calibration board. In applications such as machine vision, image measurement, photogrammetry and three-dimensional reconstruction, a calibration board is used, together with a geometric model of camera imaging, to correct lens distortion, determine the conversion between physical size and pixels, and establish the relation between the three-dimensional position of a point on the surface of an object in space and its corresponding point in the image. As further examples, the target object may be a vehicle or a building that can be used for calibration.
The radars to be calibrated are the radars on a vehicle whose external parameters need to be calibrated; for example, they may be solid-state radars on the vehicle. The point cloud data of a radar to be calibrated is a point cloud frame collected by that radar. It should be noted that the radars to be calibrated usually include at least two radars, i.e. a first radar and a second radar, whose fields of view do not overlap, or overlap only slightly. For example, as shown in fig. 3, the first radar and the second radar may be a forward laser radar and a backward laser radar mounted on the vehicle with no field-of-view overlap; the corresponding point cloud data may be as shown in fig. 4, where the first area 402 is the point cloud data of the first radar and the second area 404 is the point cloud data of the second radar. As can be seen from fig. 4, the point cloud data of the two radars have no overlap region.
Specifically, when performing radar external parameter calibration for the radars to be calibrated on a vehicle, the server acquires a captured image of the target shooting device and the point cloud data of the radars to be calibrated, where the target shooting device has a field-of-view overlap region with the radars to be calibrated, and the radars to be calibrated comprise a first radar and a second radar. The target shooting device may be placed on the same vehicle, or on a nearby vehicle, for example to the left or right of the vehicle carrying the radars to be calibrated; this embodiment does not limit its placement, as long as the field of view overlaps.
Step 204, performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud.
Target detection here means detecting the target object in the point cloud data. The first target object point cloud is the point cloud of the target object in the first radar's data; the second target object point cloud is the point cloud of the target object in the second radar's data.
Specifically, after obtaining the point cloud data of the first radar and the second radar, the server performs target detection on each separately, obtaining the first target object point cloud and the second target object point cloud. This embodiment does not limit the detection method, as long as target detection can be achieved. For example, one method is to compute, from the point information of each point in the point cloud (its coordinates and reflection intensity), a normal vector for each point, determine from the normal vectors the plane each point lies on, and identify the target object from the points lying in one plane. As another example, when the target object is a calibration board whose corner markers are highly reflective, one may read the reflection intensity of each point from its point information, locate the corner points of the calibration board from the intensities, and determine the target object from those corner points.
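The intensity-based route mentioned above can be sketched as follows. This is a minimal sketch under stated assumptions: the function name and the intensity threshold are illustrative, and a real pipeline would add clustering and plane fitting on top of the raw intensity filter.

```python
import numpy as np

def detect_board_points(points, intensity, intensity_thresh=0.8):
    """Pick out high-reflectivity returns, e.g. from the reflective corner
    markers of a calibration board, from one frame of point cloud data.

    points: N x 3 array of radar coordinates.
    intensity: length-N reflectivities, assumed normalised to [0, 1].
    Returns the candidate board points and their centroid.
    """
    mask = intensity >= intensity_thresh
    board = points[mask]
    centroid = board.mean(axis=0) if board.size else None
    return board, centroid
```

Running the same routine on each radar's frame yields the first and second target object point clouds used in the following steps.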
Step 206, determining the target object image position based on the acquired image of the target shooting device.
The target object image position refers to the position of the target object in the acquired image. For example, the target object image position may specifically refer to a set of pixel coordinates of the target object in the captured image.
Specifically, after acquiring the captured image of the target shooting device, the server performs target detection on the captured image to determine the target object image position. This embodiment does not limit the detection method, as long as target detection can be achieved. For example, when the target object is a vehicle or a building, a target detection network may be used. As another example, when the target object is a calibration board, corner detection may be performed on the captured image to find the corners of the calibration board, and the position of the target object in the image is then determined from those corners; the corners of a standard calibration board are usually two-dimensional-code corners, so corner detection can be implemented by recognizing the two-dimensional code.
Step 208, performing external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud, to obtain a radar external parameter calibration result.
Specifically, if the target shooting device is a single shooting device whose field of view overlaps both the first radar and the second radar, the server can calibrate the single shooting device against the first radar and against the second radar, using the target object image position together with the first and second target object point clouds, and then obtain the radar external parameter calibration result from the two external parameter calibration results. If instead the target shooting device comprises at least a first shooting device, which has a field-of-view overlap region with the first radar, and a second shooting device, which has a field-of-view overlap region with the second radar, then the server calibrates the first radar against the first shooting device and the second radar against the second shooting device, and obtains the radar external parameter calibration result based on the two calibration results.
Further, in the latter case, if the external parameters between the first shooting device and the second shooting device are known, the server can obtain the radar external parameter calibration result from the two calibration results together with those external parameters. Known here means that the first and second shooting devices have been externally calibrated against each other in advance.
Further, if the external parameters between the first shooting device and the second shooting device are unknown, the target shooting device further comprises a third shooting device whose field of view overlaps those of the first and second shooting devices. In this case, the server additionally calibrates the first, second and third shooting devices using the target object image positions, and obtains the radar external parameter calibration result by combining the two radar-to-device calibration results with the shooting device calibration result.
With the radar external parameter calibration method based on a shooting device described above, a captured image of the target shooting device and point cloud data of the radars to be calibrated are acquired, and target detection is performed separately on the point cloud data of the first radar and of the second radar, so that the target object is detected in each, yielding a first target object point cloud and a second target object point cloud. The target object image position is determined from the captured image, i.e. the target object is also detected in the image. The first radar and the second radar are then externally calibrated, with the target object as the common reference, from the target object image position, the first target object point cloud matched with the first radar and the second target object point cloud matched with the second radar, yielding the radar external parameter calibration result. Throughout the process, radar external parameter calibration is achieved by using a target shooting device whose field of view overlaps that of the radars to be calibrated and taking a target object captured at the same moment as the reference.
In one embodiment, the target shooting device is a single shooting device, and the single shooting device and the first radar and the second radar have field-of-view overlapping areas;
performing external reference calibration on the first radar and the second radar based on the target object image position, the first target object point cloud and the second target object point cloud, and obtaining a radar external reference calibration result comprises the following steps:
calibrating single shooting equipment and a first radar based on the target object image position and the first target object point cloud to obtain a first external reference calibration result;
calibrating the single shooting device and the second radar based on the target object image position and the second target object point cloud to obtain a second external reference calibration result;
and obtaining a radar external reference calibration result based on the first external reference calibration result and the second external reference calibration result.
The target shooting device can be a single shooting device; in this case, the single shooting device has a field-of-view overlapping region with both the first radar and the second radar. For example, the field-of-view overlapping regions may be as shown in fig. 5, where 502 is the field-of-view region of the first radar and 504 is the field-of-view region of the second radar; as can be seen from fig. 5, there is no field-of-view overlapping region between the first radar and the second radar. 506 is the field-of-view region of the single shooting device, which has a field-of-view overlapping region with both the first radar and the second radar.
The first external reference calibration result refers to an external reference transformation relation between single shooting equipment and the first radar, and comprises translation and rotation. The second external reference calibration result refers to the external reference transformation relation between the single shooting device and the second radar, and comprises translation and rotation. The radar external reference calibration result refers to the external reference transformation relation between the first radar and the second radar, and comprises translation and rotation.
Specifically, when the target shooting device is a single shooting device, the server calibrates the single shooting device and the first radar based on the obtained external reference transformation relationship between the single shooting device and the first radar, the target object image position and the first target object point cloud, obtaining a first external reference calibration result, and calibrates the single shooting device and the second radar based on the obtained external reference transformation relationship between the single shooting device and the second radar, the target object image position and the second target object point cloud, obtaining a second external reference calibration result.
Specifically, since the first external reference calibration result is an external reference transformation relationship between the single shooting device and the first radar, and the second external reference calibration result is an external reference transformation relationship between the single shooting device and the second radar, the server may determine the external reference transformation relationship between the first radar and the second radar based on the first external reference calibration result and the second external reference calibration result by using the single shooting device as an intermediary, so as to obtain the radar external reference calibration result.
In this embodiment, by determining a first external reference calibration result between the single shooting device and the first radar and then determining a second external reference calibration result between the single shooting device and the second radar, the external reference calibration between the first radar and the second radar can be achieved based on the first external reference calibration result and the second external reference calibration result by using the single shooting device as an intermediary, and a radar external reference calibration result is obtained.
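The "intermediary" composition described above can be sketched as a chain of rigid transforms. The following is a hypothetical illustration, not the patent's implementation; the rotation values, translation values and variable names are all invented, and each extrinsic is represented as a standard 4x4 homogeneous transform (rotation plus translation, as defined above).

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# First external reference calibration result: first radar -> single shooting device
T_cam_from_radar1 = make_transform(np.eye(3), [0.2, 0.0, -0.1])
# Second external reference calibration result: second radar -> single shooting device
T_cam_from_radar2 = make_transform(np.eye(3), [-0.3, 0.1, -0.1])

# The single shooting device acts as intermediary: radar 1 -> camera -> radar 2
T_radar2_from_radar1 = np.linalg.inv(T_cam_from_radar2) @ T_cam_from_radar1

print(T_radar2_from_radar1[:3, 3])  # relative translation between the two radars
```

The same composition holds for general rotations; identity rotations are used here only to keep the arithmetic readable.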
In one embodiment, calibrating the single shooting device and the first radar based on the target object image position and the first target object point cloud, and obtaining the first external reference calibration result includes:
based on the obtained external parameter transformation relation, projecting the first target object point cloud to a coordinate system where the collected image is located to obtain a point cloud image position matched with the first target object point cloud, wherein the external parameter transformation relation is the external parameter transformation relation between a single shooting device and a first radar;
and calibrating the single shooting device and the first radar according to the position of the point cloud image and the position of the target object image to obtain a first external reference calibration result.
The external parameter transformation relation refers to an external parameter initial value between the single shooting device and the first radar, can be obtained based on a position relation between the single shooting device and the first radar, and is a rough external parameter transformation relation. The point cloud image position matched with the first target object point cloud is the position of the first target object point cloud under the coordinate system of the acquired image. For example, the point cloud image position matched with the first target object point cloud may specifically refer to a pixel coordinate set of the first target object point cloud in a coordinate system where the acquired image is located, and each pixel coordinate in the pixel coordinate set is matched with each point data in the first target object point cloud.
Specifically, when calibrating the single shooting device and the first radar, the server projects the first target object point cloud to the coordinate system of the captured image based on the obtained external reference transformation relationship between the single shooting device and the first radar, obtaining the point cloud image position matched with the first target object point cloud. Since the obtained external reference transformation relationship is rough, as shown in fig. 6, after the first target object point cloud is projected to the coordinate system of the captured image, it does not completely overlap with the target object image position in the captured image. This residual error is the installation precision error between the single shooting device and the first radar, and the aim of calibration is to eliminate it.
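The projection step can be sketched with a standard pinhole model. This is an illustrative sketch only: the intrinsic matrix values, the identity extrinsic guess and the point coordinates are invented, and the camera intrinsics are assumed to be already accurately calibrated, as the text assumes.

```python
import numpy as np

def project_points(points_lidar, T_cam_from_lidar, K):
    """Project Nx3 radar points into pixel coordinates using an extrinsic guess."""
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])   # Nx4 homogeneous points
    pts_cam = (T_cam_from_lidar @ homo.T).T[:, :3]      # transform into camera frame
    uv = (K @ pts_cam.T).T                              # pinhole projection
    return uv[:, :2] / uv[:, 2:3]                       # perspective divide

# Illustrative intrinsics (assumed already calibrated)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
T_rough = np.eye(4)  # rough initial extrinsic; identity purely for illustration
cloud = np.array([[0.0, 0.0, 5.0], [1.0, 0.5, 10.0]])
print(project_points(cloud, T_rough, K))  # point cloud image position in pixels
```

Each row of the output is the pixel coordinate matched with one point of the first target object point cloud, i.e. the "pixel coordinate set" described above.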
Specifically, in order to eliminate this error, the server continuously performs optimization iterations on the obtained external reference transformation relationship according to the point cloud image position and the target object image position, correcting the relationship until the two positions approximately overlap, which yields the first external reference calibration result. Further, in each optimization iteration the server determines a position error based on the point cloud image position and the target object image position, optimizes the external reference transformation relationship, re-projects the first target object point cloud to the coordinate system of the captured image based on the optimized relationship to obtain a new point cloud image position, and finally determines a new position error from the new point cloud image position and the target object image position.
The preset iteration stop condition may be set as needed, and this embodiment is not specifically limited herein. For example, the preset iteration stop condition may specifically be that the iteration number reaches an iteration number threshold, and the iteration number threshold may be set as required. For another example, the preset iteration stop condition may specifically be that the latest position error satisfies an error threshold, and the error threshold may be set as required.
Taking as an example the iteration stop condition that the latest position error satisfies the error threshold, consider calibrating the single shooting device and the first radar based on the target object image position and the first target object point cloud to obtain the first external reference calibration result. Assume the first target object point cloud is A, the obtained external reference transformation relationship is B (the camera internal reference transformation is coupled into it, and the internal reference of the single shooting device is assumed to have been accurately calibrated), and the target object image position is C. Ideally A*B = C should hold, but because the external reference transformation relationship B is not accurate enough, the actually obtained projection is A*B = C', that is, the position error equals C - C' = C - A*B. Through continuous optimization iteration, the external parameters in B are corrected so that the error finally approaches the error threshold, convergence is reached, and the calibration is completed.
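The project-compare-optimize loop with the A, B, C notation used in the text can be sketched as follows. This is a deliberately simplified illustration, not the patent's method: a real calibration would refine the full rotation and translation, typically with a nonlinear least-squares solver, whereas here only two translation components of B are refined by numerical gradient descent, and all numeric values are invented.

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],   # intrinsics, assumed accurately calibrated
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def project(points, T):
    """Project Nx3 points through extrinsic T and intrinsics K to pixels."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    cam = (T @ homo.T).T[:, :3]
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]

def mse(points, T, target_uv):
    """Mean squared pixel error between the projection and the target image position."""
    d = project(points, T) - target_uv
    return float(np.mean(d ** 2))

# A = first target object point cloud (invented points)
cloud = np.array([[0.0, 0.0, 5.0], [1.0, 0.5, 10.0], [-1.0, 0.2, 8.0]])

# C = target object image position, generated here from a ground-truth extrinsic
T_true = np.eye(4)
T_true[:3, 3] = [0.1, -0.05, 0.0]
target_uv = project(cloud, T_true)

# B = rough initial extrinsic; iterate until the position error converges
T = np.eye(4)
for _ in range(300):
    err = mse(cloud, T, target_uv)
    if err < 1e-6:                       # preset stop condition: error threshold
        break
    grads = []
    for axis in (0, 1):                  # numerical gradient over x/y translation
        Tp = T.copy()
        Tp[axis, 3] += 1e-6
        grads.append((mse(cloud, Tp, target_uv) - err) / 1e-6)
    T[0, 3] -= 2e-5 * grads[0]           # gradient step corrects the extrinsic B
    T[1, 3] -= 2e-5 * grads[1]

print(T[:3, 3])  # converges toward the ground-truth translation [0.1, -0.05, 0.0]
```

The loop mirrors the text: project A through the current B, measure the position error against C, correct B, and stop once the error falls below the threshold.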
In this embodiment, the point cloud image position matched with the point cloud of the first target object can be obtained by projecting the point cloud of the first target object to the coordinate system where the collected image is located based on the obtained external reference transformation relationship, and thus calibration of a single shooting device and a first radar can be realized according to the point cloud image position and the target object image position.
In one embodiment, the target photographing apparatus includes a first photographing apparatus having a field of view overlapping region with a first radar, a second photographing apparatus having a field of view overlapping region with a second radar, and a third photographing apparatus having a field of view overlapping region with the first photographing apparatus and the second photographing apparatus, respectively;
determining the target object image location based on the captured image of the target capture device includes:
and respectively determining the position of the target object image matched with each shooting device based on the acquired image of each shooting device.
When the target shooting device comprises at least three shooting devices, it includes a first shooting device having a field-of-view overlapping region with the first radar, a second shooting device having a field-of-view overlapping region with the second radar, and a third shooting device having a field-of-view overlapping region with each of the first and second shooting devices. For example, the field-of-view overlapping regions may be as shown in fig. 7, where 702 is the field-of-view region of the first radar and 704 is the field-of-view region of the second radar; as can be seen from fig. 7, there is no field-of-view overlapping region between the first radar and the second radar. 706 is the field-of-view region of the first shooting device, which has a field-of-view overlapping region with the first radar; 708 is the field-of-view region of the second shooting device, which has a field-of-view overlapping region with the second radar; and 710 is the field-of-view region of the third shooting device, which has a field-of-view overlapping region with each of the first and second shooting devices.
Specifically, when the target shooting device comprises at least three shooting devices, each shooting device has a corresponding captured image. When determining the target object image position, the server determines, based on the captured image of each shooting device, the target object image position matched with that shooting device, that is, the position of the target object in that captured image.
In this embodiment, when the target shooting device includes at least three shooting devices, the target object image position matched with each shooting device can be respectively determined based on the collected image of each shooting device.
In one embodiment, the external reference calibration of the first radar and the second radar is performed based on the target object image position, the first target object point cloud and the second target object point cloud, and obtaining a radar external reference calibration result includes:
calibrating the first radar and the first shooting equipment according to the first target object point cloud and the target object image position matched with the first shooting equipment to obtain a third external reference calibration result;
calibrating the second radar and the second shooting equipment according to the second target object point cloud and the target object image position matched with the second shooting equipment to obtain a fourth external reference calibration result;
calibrating the first shooting equipment, the second shooting equipment and the third shooting equipment based on the position of the target object image matched with each shooting equipment to obtain a shooting equipment calibration result;
and obtaining a radar external reference calibration result according to the third external reference calibration result, the fourth external reference calibration result and the shooting equipment calibration result.
The third external reference calibration result refers to an external reference transformation relation between the first radar and the first shooting device, and comprises translation and rotation. The fourth external reference calibration result refers to the external reference transformation relation between the second radar and the second shooting device, and comprises translation and rotation. The shooting equipment calibration result refers to the external parameter transformation relation between the first shooting equipment and the third shooting equipment and between the second shooting equipment and the third shooting equipment.
Specifically, when the target shooting device comprises a first shooting device, a second shooting device and a third shooting device, a field-of-view overlapping region exists between the first shooting device and the first radar and between the second shooting device and the second radar. The server can therefore calibrate the first radar and the first shooting device according to the first target object point cloud and the target object image position matched with the first shooting device, obtaining a third external reference calibration result, and calibrate the second radar and the second shooting device according to the second target object point cloud and the target object image position matched with the second shooting device, obtaining a fourth external reference calibration result.
Specifically, because the third shooting device has a field-of-view overlapping region with each of the first and second shooting devices, the server can calibrate the first, second and third shooting devices based on the target object image position matched with each shooting device, obtaining a shooting device calibration result. After obtaining the third external reference calibration result, the fourth external reference calibration result and the shooting device calibration result, the server can take the first, second and third shooting devices as intermediaries: it determines the calibration relationship between the first radar and the second shooting device based on the third external reference calibration result and the shooting device calibration result, and then determines the external reference transformation relationship between the first radar and the second radar through parameter transformation, based on that calibration relationship and the fourth external reference calibration result, obtaining the radar external reference calibration result.
For example, in this embodiment, assume the first shooting device is camera A, the second shooting device is camera B, the third shooting device is camera C, the first radar is radar A, and the second radar is radar B. Calibrating the first radar and the first shooting device gives the third external reference calibration result, i.e. the calibration relationship radar A -> camera A; calibrating the second radar and the second shooting device gives the fourth external reference calibration result, i.e. the calibration relationship radar B -> camera B; and calibrating the first, second and third shooting devices gives the shooting device calibration result, i.e. the calibration relationship camera A -> camera C -> camera B. According to the third external reference calibration result (radar A -> camera A) and the shooting device calibration result (camera A -> camera C -> camera B), the calibration relationship between the first radar and the second shooting device can be obtained, namely radar A -> camera A -> camera C -> camera B = radar A -> camera B. Then, based on the calibration relationship radar A -> camera B and the calibration relationship radar B -> camera B, the calibration relationship between radar A and radar B can be obtained, i.e. the radar external reference calibration result, completing the calibration of radar A and radar B.
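The radar A / camera A / camera C / camera B / radar B chain in this example can be sketched as a composition of homogeneous transforms. This is a hypothetical illustration only; all translation values are invented, and rotation-free transforms are used purely to keep the arithmetic readable.

```python
import numpy as np

def T(translation):
    """Rotation-free 4x4 homogeneous transform; enough to show the chaining."""
    M = np.eye(4)
    M[:3, 3] = translation
    return M

T_camA_from_radarA = T([0.1, 0.0, 0.2])   # third external reference calibration result
T_camB_from_radarB = T([-0.1, 0.0, 0.2])  # fourth external reference calibration result
T_camC_from_camA = T([0.5, 0.0, 0.0])     # first device calibration result
T_camB_from_camC = T([0.5, 0.0, 0.0])     # second device calibration result

# shooting device calibration result: camera A -> camera C -> camera B
T_camB_from_camA = T_camB_from_camC @ T_camC_from_camA
# radar A -> camera B, then close the chain with radar B -> camera B
T_camB_from_radarA = T_camB_from_camA @ T_camA_from_radarA
T_radarB_from_radarA = np.linalg.inv(T_camB_from_radarB) @ T_camB_from_radarA

print(T_radarB_from_radarA[:3, 3])  # relative translation radar A -> radar B
```

Each matrix product corresponds to one arrow in the chain radar A -> camera A -> camera C -> camera B, closed by the inverse of radar B -> camera B.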
In this embodiment, the first radar and the first shooting device are calibrated to obtain the third external reference calibration result, the second radar and the second shooting device are calibrated to obtain the fourth external reference calibration result, and the first, second and third shooting devices are calibrated to obtain the shooting device calibration result; combining these three results yields the radar external reference calibration result.
In one embodiment, calibrating the first shooting device, the second shooting device and the third shooting device based on the target object image position matched with each shooting device, and obtaining the shooting device calibration result comprises:
calibrating the first shooting device and the third shooting device based on the target object image positions matched with the first shooting device and the third shooting device respectively to obtain a first device calibration result;
calibrating the second shooting device and the third shooting device based on the target object image positions respectively matched with the second shooting device and the third shooting device to obtain a second device calibration result;
and obtaining a calibration result of the shooting equipment according to the calibration result of the first equipment and the calibration result of the second equipment.
The first device calibration result refers to a calibration relation between the first shooting device and the third shooting device. The second apparatus calibration result refers to a calibration relationship between the second photographing apparatus and the third photographing apparatus.
Specifically, the server may calibrate the first shooting device and the third shooting device based on the target object image positions respectively matched with the first shooting device and the third shooting device to obtain a first device calibration result, calibrate the second shooting device and the third shooting device based on the target object image positions respectively matched with the second shooting device and the third shooting device to obtain a second device calibration result, and finally obtain a shooting device calibration result according to the first device calibration result and the second device calibration result.
For example, in this embodiment, assume the first shooting device is camera A, the second shooting device is camera B, and the third shooting device is camera C. The server calibrates the first and third shooting devices to obtain a first device calibration result (the calibration relationship camera A -> camera C), calibrates the second and third shooting devices to obtain a second device calibration result (the calibration relationship camera C -> camera B), and obtains the shooting device calibration result (the calibration relationship camera A -> camera C -> camera B) from the first and second device calibration results.
Specifically, when calibrating the first and third shooting devices, the server back-projects the target object acquired by the first shooting device into the coordinate system of the captured image of the third shooting device, based on the acquired initial calibration relationship between the first and third shooting devices and the target object image position matched with the first shooting device. It then compares whether the projected target object image position overlaps with the target object image position matched with the third shooting device, and continuously optimizes and iterates the initial calibration relationship until a preset stop condition is reached and the projected target object image position overlaps or nearly overlaps with the target object image position matched with the third shooting device, completing the calibration. The preset stop condition may be set as required; for example, it may be that the number of optimization iterations reaches an iteration count threshold, which may be set as required. As another example, it may be that the image position error between the two image positions is smaller than a preset image position error threshold, which may be set as required; the image position error may specifically be a Euclidean distance, a Mahalanobis distance, or the like, which is not specifically limited in this embodiment.
In this embodiment, the first and third shooting devices are calibrated first to obtain the first device calibration result, then the second and third shooting devices are calibrated to obtain the second device calibration result, and the shooting device calibration result is obtained from the two device calibration results.
In one embodiment, the target shooting device comprises a first shooting device having a field of view overlapping region with the first radar, and a second shooting device having a field of view overlapping region with the second radar, and in the case that shooting device external reference between the first shooting device and the second shooting device is known, external reference calibration is performed on the first radar and the second radar based on the target object image position, the first target object point cloud, and the second target object point cloud, and obtaining the radar external reference calibration result comprises:
calibrating the first radar and the first shooting equipment according to the first target object point cloud and the target object image position matched with the first shooting equipment to obtain a third external reference calibration result;
calibrating the second radar and the second shooting equipment according to the second target object point cloud and the target object image position matched with the second shooting equipment to obtain a fourth external reference calibration result;
and obtaining a radar external reference calibration result based on the shooting equipment external reference, the third external reference calibration result and the fourth external reference calibration result.
The shooting device external reference is an external reference obtained by calibrating the first shooting device and the second shooting device, and represents the external reference transformation relation between the first shooting device and the second shooting device.
Specifically, if the shooting device external reference between the first and second shooting devices is known, the two devices have been calibrated in advance. In that case the server only needs to calibrate the first radar and the first shooting device to obtain a third external reference calibration result, and calibrate the second radar and the second shooting device to obtain a fourth external reference calibration result; the radar external reference calibration result can then be obtained based on the shooting device external reference, the third external reference calibration result and the fourth external reference calibration result. This embodiment does not limit the manner of calibrating the first and second shooting devices, as long as calibration can be achieved; the manner may be the same as that of calibrating the first and third shooting devices in the above embodiment, which is not repeated here.
In this embodiment, in the case that the camera external parameter between the first camera and the second camera is known, the radar external parameter calibration can be implemented based on the third external parameter calibration result, the fourth external parameter calibration result, and the camera external parameter.
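With the shooting device external reference already known, the chain shortens to a single composition, mirroring the single-camera case. A hypothetical, rotation-free sketch with invented values:

```python
import numpy as np

def T(translation):
    M = np.eye(4)
    M[:3, 3] = translation  # rotation-free transform, for illustration only
    return M

T_camA_from_radarA = T([0.1, 0.0, 0.2])   # third external reference calibration result
T_camB_from_radarB = T([-0.1, 0.0, 0.2])  # fourth external reference calibration result
T_camB_from_camA = T([1.0, 0.0, 0.0])     # known shooting device external reference

# radar A -> camera A -> camera B, closed with the inverse of radar B -> camera B
T_radarB_from_radarA = (np.linalg.inv(T_camB_from_radarB)
                        @ T_camB_from_camA
                        @ T_camA_from_radarA)
print(T_radarB_from_radarA[:3, 3])
```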
In an embodiment, when the target shooting device is a single shooting device, and the single shooting device and the first radar and the second radar both have a field-of-view overlapping region, as shown in fig. 8, a flowchart is used to describe the shooting device-based radar external reference calibration method of the present application, where the shooting device-based radar external reference calibration method specifically includes the following steps:
step 802, acquiring an acquired image of a single shooting device and point cloud data of a radar to be calibrated, wherein the single shooting device and the radar to be calibrated have a field-of-view overlapping area, and the radar to be calibrated comprises a first radar and a second radar;
step 804, performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
step 806, determining the position of the target object image based on the acquired image of the single shooting device;
step 808, projecting the first target object point cloud to a coordinate system where the collected image is located based on the obtained external parameter transformation relation to obtain a point cloud image position matched with the first target object point cloud, wherein the external parameter transformation relation is the external parameter transformation relation between a single shooting device and a first radar;
step 810, calibrating the single shooting device and the first radar according to the position of the point cloud image and the position of the target object image to obtain a first external reference calibration result;
step 812, calibrating the single shooting device and the second radar based on the target object image position and the second target object point cloud to obtain a second external reference calibration result;
and 814, obtaining a radar external reference calibration result based on the first external reference calibration result and the second external reference calibration result.
In an embodiment, when the target shooting device includes a first shooting device having a field of view overlapping region with a first radar, a second shooting device having a field of view overlapping region with a second radar, and a third shooting device having a field of view overlapping region with the first shooting device and the second shooting device, respectively, as shown in fig. 9, a shooting device-based radar external reference calibration method of the present application is described by a flowchart, and specifically includes the following steps:
step 902, acquiring an acquired image of a target shooting device and point cloud data of a radar to be calibrated, wherein the target shooting device and the radar to be calibrated have a field-of-view overlapping region, and the radar to be calibrated comprises a first radar and a second radar;
step 904, performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
step 906, respectively determining the position of the target object image matched with each shooting device based on the acquired image of each shooting device;
step 908, calibrating the first radar and the first shooting device according to the first target object point cloud and the target object image position matched with the first shooting device to obtain a third external reference calibration result;
step 910, calibrating a second radar and a second shooting device according to the second target object point cloud and the target object image position matched with the second shooting device to obtain a fourth external reference calibration result;
step 912, calibrating the first shooting device and the third shooting device based on the target object image positions matched with the first shooting device and the third shooting device respectively to obtain a first device calibration result;
step 914, calibrating the second shooting device and the third shooting device based on the target object image positions respectively matched with the second shooting device and the third shooting device to obtain a second device calibration result;
step 916, obtaining a calibration result of the shooting device according to the calibration result of the first device and the calibration result of the second device;
and 918, obtaining a radar external reference calibration result according to the third external reference calibration result, the fourth external reference calibration result and the shooting equipment calibration result.
In an embodiment, when the target shooting device includes a first shooting device having a field of view overlapping region with a first radar, a second shooting device having a field of view overlapping region with a second radar, and the shooting device external reference between the first shooting device and the second shooting device is known, as shown in fig. 10, a shooting device-based radar external reference calibration method according to the present application is described by a flowchart, and specifically includes the following steps:
step 1002, acquiring an acquired image of a target shooting device and point cloud data of a radar to be calibrated, wherein the target shooting device and the radar to be calibrated have a field-of-view overlapping area, and the radar to be calibrated comprises a first radar and a second radar;
step 1004, performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
step 1006, respectively determining the position of the target object image matched with each shooting device based on the collected image of each shooting device;
step 1008, calibrating the first radar and the first shooting device according to the first target object point cloud and the target object image position matched with the first shooting device to obtain a third external reference calibration result;
step 1010, calibrating a second radar and a second shooting device according to the second target object point cloud and the target object image position matched with the second shooting device to obtain a fourth external reference calibration result;
and step 1012, obtaining a radar external reference calibration result based on the shooting equipment external reference, the third external reference calibration result and the fourth external reference calibration result.

In an embodiment, the radar external reference calibration method based on the shooting device provided by the embodiments of the present application may be applied to an application environment as shown in fig. 11.
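Steps 1008 and 1010 each reduce to a perspective-n-point problem: estimating a camera pose relative to a radar from 3D target positions in the point cloud and their 2D detections in the image. The patent does not name a solver; the sketch below uses a plain linear (DLT) formulation with known intrinsics purely for illustration. The function name and the noise-free setting are assumptions, and a production system would typically use a robust solver such as OpenCV's `solvePnP` instead.

```python
import numpy as np

def pnp_dlt(pts3d: np.ndarray, pts2d: np.ndarray, K: np.ndarray):
    """Linear PnP: recover rotation R and translation t with pixel ~ K (R X + t)
    from n >= 6 non-coplanar 3D-2D correspondences and known intrinsics K."""
    n = pts3d.shape[0]
    # Work in normalized image coordinates x = K^-1 [u, v, 1]^T.
    xn = (np.linalg.inv(K) @ np.hstack([pts2d, np.ones((n, 1))]).T).T
    A = []
    for (X, Y, Z), (x, y, _) in zip(pts3d, xn):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z, -x])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)           # [R|t] up to scale and sign
    U, S, Vt2 = np.linalg.svd(P[:, :3])
    R = U @ Vt2                        # nearest orthogonal matrix
    scale = S.mean()
    if np.linalg.det(R) < 0:           # resolve the global sign ambiguity
        R, scale = -R, -scale
    return R, P[:, 3] / scale
```

With exact correspondences the minimizing singular vector recovers the true pose; with real detections, the linear estimate would normally seed a nonlinear reprojection-error refinement.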
Assume two vehicles are stopped on a road. Vehicle A is a target vehicle equipped with a laser radar A (first radar) and a laser radar B (second radar), with no field-of-view overlapping region between the two radars, and vehicle B is an autonomous driving vehicle equipped with a camera (target shooting device). Vehicle A can then use the image information from vehicle B to perform online calibration and inspection; that is, the external reference calibration of laser radar A and laser radar B on vehicle A is realized by the shooting-device-based radar external reference calibration method provided by the embodiments of the present application. In this case, the server can acquire the image captured by the target shooting device together with the point cloud data of laser radar A and laser radar B, perform target detection on the point cloud data of each radar to obtain a first target object point cloud matched with laser radar A and a second target object point cloud matched with laser radar B, determine the target object image position based on the image acquired by the camera, and perform external reference calibration on laser radar A and laser radar B according to the target object image position, the first target object point cloud and the second target object point cloud to obtain a radar external reference calibration result.
By updating the calibration parameters online using images collected by the cameras of surrounding vehicles during driving, the calibration parameters of the vehicle can always be kept in the most accurate state; even if a sensor is displaced by vibration, the calibration parameters can be quickly corrected during driving.
It should be understood that, although the steps in the flowcharts of the above embodiments are displayed in sequence as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides a shooting-device-based radar external reference calibration apparatus for implementing the shooting-device-based radar external reference calibration method above. The implementation scheme for solving the problem provided by the apparatus is similar to that described for the method; therefore, for specific limitations in one or more embodiments of the shooting-device-based radar external reference calibration apparatus provided below, reference may be made to the limitations of the shooting-device-based radar external reference calibration method above, and details are not repeated here.
In one embodiment, as shown in fig. 12, there is provided a radar external reference calibration apparatus based on a shooting device, including: an acquisition module 1202, a target detection module 1204, a processing module 1206, and a calibration module 1208, wherein:
the acquisition module 1202 is configured to acquire an acquired image of a target shooting device and point cloud data of a radar to be calibrated, where the target shooting device and the radar to be calibrated have a field-of-view overlapping region, and the radar to be calibrated includes a first radar and a second radar;
a target detection module 1204, configured to perform target detection on the point cloud data of the first radar to obtain a first target object point cloud, and perform target detection on the point cloud data of the second radar to obtain a second target object point cloud;
a processing module 1206 for determining a target object image position based on the captured image of the target capture device;
and the calibration module 1208 is configured to perform external reference calibration on the first radar and the second radar according to the target object image position, the first target object point cloud, and the second target object point cloud, so as to obtain a radar external reference calibration result.
With the radar external reference calibration apparatus based on the shooting device, an acquired image of the target shooting device and point cloud data of the radar to be calibrated are obtained, and target detection is performed on the point cloud data of the first radar and the second radar respectively, so that a target object is detected from the point cloud data to obtain a first target object point cloud and a second target object point cloud. The target object image position is determined based on the acquired image of the target shooting device; that is, the target object is detected from the acquired image. Taking the target object as a common reference, external reference calibration is then performed on the first radar and the second radar according to the target object image position in the acquired image, the first target object point cloud matched with the first radar and the second target object point cloud matched with the second radar, so as to obtain a radar external reference calibration result. Throughout the process, radar external reference calibration can be achieved by using a target shooting device that has a field-of-view overlapping region with the radar to be calibrated and taking the target object captured at the same moment as the reference.
In one embodiment, the target shooting device is a single shooting device, the first radar and the second radar have field-of-view overlapping regions, the calibration module is further configured to calibrate the single shooting device and the first radar based on the target object image position and the first target object point cloud to obtain a first external reference calibration result, calibrate the single shooting device and the second radar based on the target object image position and the second target object point cloud to obtain a second external reference calibration result, and obtain a radar external reference calibration result based on the first external reference calibration result and the second external reference calibration result.
In an embodiment, the calibration module is further configured to project the first target object point cloud to a coordinate system where the acquired image is located based on the obtained external reference transformation relationship, to obtain a point cloud image position where the first target object point cloud matches, where the external reference transformation relationship is an external reference transformation relationship between the single shooting device and the first radar, and calibrate the single shooting device and the first radar according to the point cloud image position and the target object image position, to obtain a first external reference calibration result.
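The projection described here — mapping the first target object point cloud into the coordinate system of the acquired image via the external parameter transformation relationship — can be sketched as a standard pinhole projection. The 4x4 extrinsic plus 3x3 intrinsic representation below is an assumption for illustration; the patent only specifies the projection abstractly.

```python
import numpy as np

def project_point_cloud(points_radar: np.ndarray, T_cam_radar: np.ndarray,
                        K: np.ndarray) -> np.ndarray:
    """Project radar-frame points (n, 3) into pixel coordinates.

    T_cam_radar: 4x4 extrinsic mapping the radar frame into the camera frame.
    K: 3x3 camera intrinsic matrix. Points behind the camera are dropped.
    """
    n = points_radar.shape[0]
    homog = np.hstack([points_radar, np.ones((n, 1))])
    cam = (T_cam_radar @ homog.T).T[:, :3]   # radar frame -> camera frame
    cam = cam[cam[:, 2] > 0]                 # keep points in front of the camera
    pix = (K @ cam.T).T                      # pinhole projection
    return pix[:, :2] / pix[:, 2:3]
```

The resulting point cloud image positions can then be compared against the target object image position to evaluate or refine a candidate external parameter transformation relationship.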
In one embodiment, the target shooting device comprises a first shooting device with a field of view overlapping area with the first radar, a second shooting device with a field of view overlapping area with the second radar, and a third shooting device with a field of view overlapping area with the first shooting device and the second shooting device respectively, and the processing module is further used for determining the image position of the target object matched with each shooting device respectively based on the acquired image of each shooting device.
In one embodiment, the calibration module is further configured to: calibrate the first radar and the first shooting device according to the first target object point cloud and the target object image position matched with the first shooting device to obtain a third external reference calibration result; calibrate the second radar and the second shooting device according to the second target object point cloud and the target object image position matched with the second shooting device to obtain a fourth external reference calibration result; calibrate the first shooting device, the second shooting device and the third shooting device based on the target object image position matched with each shooting device to obtain a shooting device calibration result; and obtain a radar external reference calibration result according to the third external reference calibration result, the fourth external reference calibration result and the shooting device calibration result.
In an embodiment, the calibration module is further configured to calibrate the first shooting device and the third shooting device based on the target object image positions respectively matched with the first shooting device and the third shooting device to obtain a first device calibration result, calibrate the second shooting device and the third shooting device based on the target object image positions respectively matched with the second shooting device and the third shooting device to obtain a second device calibration result, and obtain a shooting device calibration result according to the first device calibration result and the second device calibration result.
In one embodiment, the target shooting device comprises a first shooting device having a field of view overlapping region with the first radar, and a second shooting device having a field of view overlapping region with the second radar, and the calibration module is further configured to calibrate the first radar and the first shooting device according to the first target object point cloud and the target object image position matched with the first shooting device to obtain a third external reference calibration result, calibrate the second radar and the second shooting device according to the second target object point cloud and the target object image position matched with the second shooting device to obtain a fourth external reference calibration result, and obtain a radar external reference calibration result based on the shooting device external reference, the third external reference calibration result, and the fourth external reference calibration result.
The respective modules in the shooting-device-based radar external reference calibration apparatus may be wholly or partially implemented by software, hardware, or a combination thereof. Each of the above modules may be embedded in hardware in, or independent of, a processor in the computer device, or may be stored in software in a memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 13. The computer device includes a processor, a memory, an Input/Output interface (I/O for short), and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer equipment is used for storing data such as collected images of the target shooting equipment, point cloud data of the radar to be calibrated and the like. The input/output interface of the computer device is used for exchanging information between the processor and an external device. The communication interface of the computer device is used for connecting and communicating with an external terminal through a network. The computer program is executed by a processor to realize a radar external reference calibration method based on shooting equipment.
Those skilled in the art will appreciate that the architecture shown in fig. 13 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, in which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In one embodiment, a computer program product or computer program is provided that includes computer instructions stored in a computer-readable storage medium. The computer instructions are read by a processor of a computer device from a computer-readable storage medium, and the computer instructions are executed by the processor to cause the computer device to perform the steps in the above-mentioned method embodiments.
It should be noted that the data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with relevant laws and regulations and standards of relevant countries and regions.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by a computer program instructing relevant hardware; the program may be stored in a non-volatile computer-readable storage medium, and when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, etc., without limitation.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered to fall within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and improvements can be made without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A radar external reference calibration method based on shooting equipment is characterized by comprising the following steps:
acquiring an acquired image of target shooting equipment and point cloud data of a radar to be calibrated, wherein the target shooting equipment and the radar to be calibrated have a field-of-view overlapping region, and the radar to be calibrated comprises a first radar and a second radar;
performing target detection on the point cloud data of the first radar to obtain a first target object point cloud, and performing target detection on the point cloud data of the second radar to obtain a second target object point cloud;
determining a target object image position based on the acquired image of the target shooting device;
and carrying out external reference calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud to obtain a radar external reference calibration result.
2. The method of claim 1, wherein the target capture device is a single capture device having an overlapping region of fields of view with both the first radar and the second radar;
the external reference calibration of the first radar and the second radar is performed based on the target object image position, the first target object point cloud and the second target object point cloud, and obtaining a radar external reference calibration result comprises:
calibrating the single shooting device and the first radar based on the target object image position and the first target object point cloud to obtain a first external reference calibration result;
calibrating the single shooting device and the second radar based on the target object image position and the second target object point cloud to obtain a second external reference calibration result;
and obtaining a radar external reference calibration result based on the first external reference calibration result and the second external reference calibration result.
3. The method of claim 2, wherein calibrating the single camera and the first radar based on the target object image position and the first target object point cloud to obtain a first external reference calibration result comprises:
based on the obtained external parameter transformation relation, projecting the first target object point cloud to a coordinate system where the collected image is located to obtain a point cloud image position matched with the first target object point cloud, wherein the external parameter transformation relation is the external parameter transformation relation between the single shooting device and the first radar;
and calibrating the single shooting device and the first radar according to the position of the point cloud image and the position of the target object image to obtain a first external reference calibration result.
4. The method according to claim 1, wherein the target photographing apparatus includes a first photographing apparatus having a field of view overlapping region with the first radar, a second photographing apparatus having a field of view overlapping region with the second radar, and a third photographing apparatus having a field of view overlapping region with the first photographing apparatus and the second photographing apparatus, respectively;
the determining a target object image position based on the captured image of the target capturing device comprises:
and respectively determining the position of the target object image matched with each shooting device based on the acquired image of each shooting device.
5. The method of claim 4, wherein the performing external reference calibration on the first radar and the second radar based on the target object image position, the first target object point cloud, and the second target object point cloud, and obtaining a radar external reference calibration result comprises:
calibrating the first radar and the first shooting equipment according to the first target object point cloud and the target object image position matched with the first shooting equipment to obtain a third external reference calibration result;
calibrating the second radar and the second shooting equipment according to the second target object point cloud and the target object image position matched with the second shooting equipment to obtain a fourth external reference calibration result;
calibrating the first shooting device, the second shooting device and the third shooting device based on the target object image position matched with each shooting device to obtain shooting device calibration results;
and obtaining a radar external reference calibration result according to the third external reference calibration result, the fourth external reference calibration result and the shooting equipment calibration result.
6. The method according to claim 5, wherein the calibrating the first shooting device, the second shooting device and the third shooting device based on the target object image position matched with each shooting device comprises:
calibrating the first shooting device and the third shooting device based on the positions of the target object images respectively matched with the first shooting device and the third shooting device to obtain a first device calibration result;
calibrating the second shooting device and the third shooting device based on the target object image positions matched with the second shooting device and the third shooting device respectively to obtain a second device calibration result;
and obtaining a calibration result of the shooting equipment according to the first equipment calibration result and the second equipment calibration result.
7. A radar external reference calibration device based on shooting equipment is characterized by comprising:
the system comprises an acquisition module, a calibration module and a calibration module, wherein the acquisition module is used for acquiring an acquired image of target shooting equipment and point cloud data of a radar to be calibrated, the target shooting equipment and the radar to be calibrated have a field-of-view overlapping region, and the radar to be calibrated comprises a first radar and a second radar;
the target detection module is used for carrying out target detection on the point cloud data of the first radar to obtain a first target object point cloud and carrying out target detection on the point cloud data of the second radar to obtain a second target object point cloud;
the processing module is used for determining the position of a target object image based on the acquired image of the target shooting equipment;
and the calibration module is used for carrying out external parameter calibration on the first radar and the second radar according to the target object image position, the first target object point cloud and the second target object point cloud to obtain a radar external parameter calibration result.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 6 when executed by a processor.
CN202210559839.3A 2022-05-23 2022-05-23 Radar external parameter calibration method and device based on shooting equipment and computer equipment Pending CN114966578A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210559839.3A CN114966578A (en) 2022-05-23 2022-05-23 Radar external parameter calibration method and device based on shooting equipment and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210559839.3A CN114966578A (en) 2022-05-23 2022-05-23 Radar external parameter calibration method and device based on shooting equipment and computer equipment

Publications (1)

Publication Number Publication Date
CN114966578A true CN114966578A (en) 2022-08-30

Family

ID=82984720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210559839.3A Pending CN114966578A (en) 2022-05-23 2022-05-23 Radar external parameter calibration method and device based on shooting equipment and computer equipment

Country Status (1)

Country Link
CN (1) CN114966578A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116148823A (en) * 2023-04-12 2023-05-23 北京集度科技有限公司 External parameter calibration method, device, vehicle and computer program product
CN116148823B (en) * 2023-04-12 2023-09-19 北京集度科技有限公司 External parameter calibration method, device, vehicle and computer program product

Similar Documents

Publication Publication Date Title
CN113643378B (en) Active rigid body pose positioning method in multi-camera environment and related equipment
CN111127422B (en) Image labeling method, device, system and host
US20210041236A1 (en) Method and system for calibration of structural parameters and construction of affine coordinate system of vision measurement system
US9747680B2 (en) Inspection apparatus, method, and computer program product for machine vision inspection
CN108364253B (en) Vehicle damage assessment method and system and electronic equipment
WO2021063128A1 (en) Method for determining pose of active rigid body in single-camera environment, and related apparatus
CN113689578B (en) Human body data set generation method and device
US20080297502A1 (en) Method and System for Detecting and Evaluating 3D Changes from Images and a 3D Reference Model
CN114646932B (en) Radar external parameter calibration method and device based on external radar and computer equipment
CN111915681B (en) External parameter calibration method, device, storage medium and equipment for multi-group 3D camera group
Perdigoto et al. Calibration of mirror position and extrinsic parameters in axial non-central catadioptric systems
CN114966576A (en) Radar external reference calibration method and device based on prior map and computer equipment
CN114966578A (en) Radar external parameter calibration method and device based on shooting equipment and computer equipment
CN109827607B (en) Calibration method and device for line structured light welding seam tracking sensor
CN112241984A (en) Binocular vision sensor calibration method and device, computer equipment and storage medium
CN110232715B (en) Method, device and system for self calibration of multi-depth camera
CN113884188A (en) Temperature detection method and device and electronic equipment
CN109242894B (en) Image alignment method and system based on mobile least square method
CN111145268A (en) Video registration method and device
KR20210084339A (en) Image association method, system and device
CN112581537B (en) Binocular camera external parameter checking method, binocular camera external parameter checking device, computer equipment and storage medium
CN117549825B (en) Calibration method and device for car lamp control angle, computer equipment and storage medium
Fasogbon et al. Automatic feature extraction for wide-angle and fish-eye camera calibration
CN113781583B (en) Camera self-calibration method, device, equipment and medium
Marko et al. Automatic Stereo Camera Calibration in Real-World Environments Without Defined Calibration Objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination