CN111383279A - External parameter calibration method and device and electronic equipment

Info

Publication number
CN111383279A
Authority
CN
China
Prior art keywords
point cloud
cloud data
calibration
reference object
plane
Prior art date
Legal status
Granted
Application number
CN201811636001.XA
Other languages
Chinese (zh)
Other versions
CN111383279B (en)
Inventor
李方震
孙伟健
王兵
王刚
Current Assignee
Wuzhou Online E Commerce Beijing Co ltd
Original Assignee
Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd
Priority to CN201811636001.XA
Publication of CN111383279A
Application granted
Publication of CN111383279B
Status: Active

Classifications

    • G06T 7/00: Image analysis (G: Physics; G06: Computing, calculating or counting; G06T: Image data processing or generation, in general)
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/10028: Range image; depth image; 3D point clouds (indexing scheme: image acquisition modality)
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The embodiments of the present application disclose an external parameter calibration method and apparatus, and an electronic device. The method comprises the following steps: acquiring image data collected by a camera device for a target calibration reference object and point cloud data collected by a radar device for the same reference object, where the target calibration reference object comprises a plurality of plane bodies that have preset colors and form preset angles; processing the point cloud data to determine a plurality of point cloud data sets, each with a different specific position feature; projecting the point cloud data in the sets into the image collected by the camera device; and determining a calibration result according to the degree of overlap between the point cloud data in the sets and their projection results in the image. Through the embodiments of the present application, a high-precision external parameter calibration result can be obtained simply and conveniently.

Description

External parameter calibration method and device and electronic equipment
Technical Field
The present application relates to the technical field of multi-sensor fusion for environment perception, and in particular to an external parameter calibration method and apparatus, and an electronic device.
Background
Environment perception is a core technology in industries such as robotics, autonomous driving, intelligent manufacturing, intelligent monitoring, and intelligent transportation. Common sensors for environment perception include lidars, cameras, and millimeter-wave radars, with lidars and cameras being the most widely used. Because their operating principles differ, each sensor type has its own advantages and disadvantages, and data from several sensors usually need to be fused to achieve better environment perception. For example, in the autonomous driving industry, a lidar and a camera device may need to be mounted on the same roadside device, and road-condition information in the environment is perceived by fusing the data of the two sensors. To fuse data from different sensors, their data must first be expressed in a unified coordinate system; that is, the rotation-translation relationship between the coordinate systems of the different sensors is required. In practice, the sensors are mounted at some nominal angle and translation distance, but the values realized after installation rarely equal the nominal ones exactly. For example, the nominal angle between the camera and the lidar may be 30 degrees and the nominal distance 50 cm, while after installation they may actually be 31 degrees and 50.3 cm. Some means is therefore needed to calibrate the actual rotation-translation relationship; otherwise, the fused environment-perception result will also contain errors.
Therefore, how to obtain, simply and conveniently, a calibration result for the rotation-translation relationship between the camera device coordinate system and the radar device coordinate system has become a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The present application provides an external parameter calibration method and apparatus, and an electronic device, with which a high-precision external parameter calibration result can be obtained simply and conveniently.
The application provides the following scheme:
An external parameter calibration method, comprising the following steps:
acquiring image data collected by a camera device for a target calibration reference object and point cloud data collected by a radar device for the same reference object, where the target calibration reference object comprises a plurality of plane bodies that have preset colors and form preset angles;
processing the point cloud data to determine a plurality of point cloud data sets with different specific position features;
projecting the point cloud data in the sets into the image collected by the camera device; and
determining a calibration result according to the degree of overlap between the point cloud data in the sets and their projection results in the image collected by the camera device.
A calibration reference object, wherein:
the calibration reference object comprises a plurality of plane bodies that have preset colors and form preset angles, and is used for calibrating the rotation-translation relationship between the coordinate systems of a camera device and a radar device installed in the same environment perception system.
An environment perception system, comprising:
a camera device, configured to collect image data of a target environment;
a radar device, configured to collect point cloud data of the target environment; and
a data processing device, configured to pre-store a calibration result for the conversion relationship between the camera device coordinate system and the radar device coordinate system, and to fuse the data collected by the camera device and the radar device according to the calibration result to obtain a perception result of the target environment, where the calibration result is obtained through a target calibration reference object comprising a plurality of plane bodies that have preset colors and form preset angles.
An environment perception method, comprising:
pre-storing a calibration result for the conversion relationship between a camera device coordinate system and a radar device coordinate system, where the calibration result is obtained through a target calibration reference object comprising a plurality of plane bodies that have preset colors and form preset angles; and
fusing the data collected by the camera device and the radar device according to the calibration result to obtain a perception result of the target environment.
An external parameter calibration apparatus, comprising:
a data obtaining unit, configured to obtain image data collected by a camera device for a target calibration reference object and point cloud data collected by a radar device for the same reference object, where the target calibration reference object comprises a plurality of plane bodies that have preset colors and form preset angles;
a point cloud data set determining unit, configured to process the point cloud data and determine a plurality of point cloud data sets with different specific position features;
a projection unit, configured to project the point cloud data in the sets into the image collected by the camera device; and
a calibration result determining unit, configured to determine a calibration result according to the degree of overlap between the point cloud data in the sets and their projection results in the image collected by the camera device.
An environment perception apparatus, comprising:
a storage unit, configured to pre-store a calibration result for the conversion relationship between a camera device coordinate system and a radar device coordinate system, where the calibration result is obtained through a target calibration reference object comprising a plurality of plane bodies that have preset colors and form preset angles; and
a perception unit, configured to fuse the data collected by the camera device and the radar device according to the calibration result to obtain a perception result of the target environment.
An electronic device, comprising:
one or more processors; and
a memory associated with the one or more processors, storing program instructions that, when read and executed by the one or more processors, cause the device to perform the following operations:
acquiring image data collected by a camera device for a target calibration reference object and point cloud data collected by a radar device for the same reference object, where the target calibration reference object comprises a plurality of plane bodies that have preset colors and form preset angles;
processing the point cloud data to determine a plurality of point cloud data sets with different specific position features;
projecting the point cloud data in the sets into the image collected by the camera device; and
determining a calibration result according to the degree of overlap between the point cloud data in the sets and their projection results in the image collected by the camera device.
An electronic device, comprising:
one or more processors; and
a memory associated with the one or more processors, storing program instructions that, when read and executed by the one or more processors, cause the device to perform the following operations:
pre-storing a calibration result for the conversion relationship between a camera device coordinate system and a radar device coordinate system, where the calibration result is obtained through a target calibration reference object comprising a plurality of plane bodies that have preset colors and form preset angles; and
fusing the data collected by the camera device and the radar device according to the calibration result to obtain a perception result of the target environment.
According to the specific embodiments provided herein, the present application discloses the following technical effects:
according to the embodiment of the application, a calibration reference object can be formed by adopting a plane body with a preset color and a preset angle, then data acquisition is carried out on the calibration reference object through camera equipment and radar equipment, and then a plurality of point cloud data sets with different specific position characteristics can be determined from point cloud data and projected into images acquired by the camera equipment respectively, and then a calibration result can be determined according to the overlapping degree between the point cloud data in the sets and the projection result in the images acquired by the camera equipment. By the mode, the calibration reference object is an object in a three-dimensional space instead of a two-dimensional calibration plate, so that the radar equipment can obtain higher data acquisition precision without a sensor with higher precision. In addition, because a plurality of planes have a preset angle and a priori information different from the preset angle in color, the position characteristics of the specific point cloud data can be obtained from the point cloud data acquired by the radar equipment, for example, whether the specific point cloud data is located on a certain plane, whether the specific point cloud data is located on a certain characteristic straight line, and the like, and further, according to the projection of the position characteristic information and the specific point cloud data in the image acquired by the camera equipment, the overlapping degree between the two accords with a preset condition by gradually adjusting the angle, the translation distance, and the like, so that a corresponding calibration result can be obtained. In addition, because the calibration reference object is provided with a plurality of planes, the characteristic planes of a plurality of angles can be extracted at one time, so that in the whole calibration process, the camera equipment and the radar equipment only need to execute one-time data acquisition operation to obtain a satisfactory calibration result, multiple acquisition is not needed, and multiple operation is not needed, so that the efficiency can be improved.
In addition, in a preferred implementation, the features of the image collected by the camera device after a gray-gradient inverse depth transformation can be matched against the laser point cloud data. Compared with using image edge features directly, this makes the matching optimization more stable and makes it easier to reach the globally optimal solution.
Moreover, using the same set of calibration data, the final calibration result can be solved in two steps, coarse calibration followed by fine calibration, which yields higher calibration precision than the traditional scheme of computing the calibration result in a single pass.
Of course, implementing any product of the present application does not necessarily require achieving all of the above advantages at the same time.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present application, and those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a calibration system provided by an embodiment of the present application;
FIG. 2 is a flow chart of a first method provided by an embodiment of the present application;
FIG. 3 is a schematic view of a calibration reference object provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a first application system provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a second application system provided by an embodiment of the present application;
FIG. 6 is a flow chart of a second method provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a first apparatus provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a second apparatus provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of an electronic device provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The inventors found that, to obtain the rotation-translation relationship between the camera device and radar device coordinate systems, the camera's extrinsic parameters can be calibrated using the lidar coordinate system as the reference coordinate system. Existing calibration schemes include calibration based on a calibration plate and calibration based on a calibration workshop. In one implementation of the calibration-plate approach, a planar calibration plate serves as the calibration reference object, and the camera and the lidar photograph or scan the plate simultaneously. The corner-point coordinates (or other feature-point coordinates) of the plate are extracted from the camera image, the corresponding three-dimensional coordinates of each corner point are extracted from the lidar point cloud, and the camera-lidar extrinsic parameters are computed from the 2D-3D correspondences by solving the PnP (Perspective-n-Point) pose-estimation problem. Alternatively, a checkerboard calibration board can be used: the grid corner points are extracted from the camera image and back-projected from 2D into the camera's 3D space using the camera intrinsic calibration; meanwhile, the 3D corner coordinates of the checkerboard are extracted from the laser point cloud, and the lidar-camera extrinsic parameters are solved with methods such as ICP (Iterative Closest Point).
The drawback of the calibration-plate scheme is that the plate is a two-dimensional planar object, and lidar measurements of a single planar object are not highly precise. Using such lidar data to calibrate the camera device's extrinsic parameters therefore introduces errors, and the calibration precision is low.
The calibration-workshop approach is essentially the same as the calibration-plate approach: multiple calibration plates are hung on the walls of a room, and the camera device collects images of the plates distributed over many angles to reduce calibration error. The data-collection process is complex and time-consuming, and the calibration result is not stable.
Yet another scheme for improving calibration precision is to introduce a third, higher-precision sensor, calibrate the extrinsic parameters of the camera device and of the lidar device against this third sensor separately, and then derive the rotation-translation relationship between the camera and lidar coordinate systems from the two calibration results. However, this scheme is still cumbersome.
In view of the problems with the above schemes, the embodiments of the present application provide a scheme that obtains a higher-precision calibration result more conveniently. First, the calibration reference object is improved, and a specific calibration method is then provided on the basis of the improved reference object. The calibration reference object can be composed of a plurality of plane bodies that have preset colors and form preset angles. Because there are several planes with preset colors, the boundary lines between the planes are easy to identify, and the prior information that the planes form known angles with one another makes it convenient to determine, from the point cloud data collected by the radar device, the position features of individual points, such as which plane a point lies on, or whether it lies on a certain line (a boundary line between two planes, or an edge of a plane). This position-feature information can then be used to project the points into the image collected by the camera device, and the calibration is completed according to the degree of overlap between the point cloud data and the corresponding projections. In this way, high-precision external parameter calibration can be achieved without collecting data repeatedly at different angles and without the help of a high-precision third sensor.
In a specific implementation, from the perspective of the system architecture, the object to be calibrated may be the extrinsic parameters of a camera device that must cooperate with a radar device to perform environment perception in a given scene. The camera device and the radar device are therefore usually installed in advance according to certain rules. For example, in a scenario where a roadside device collects road-condition information, the camera device and the radar device may be installed on the same roadside device, at some angle and some translation distance from each other, and the purpose of calibration is to identify the actual angle and translation distance between the two. When performing calibration, as shown in FIG. 1, the camera device 101 and the radar device 102 may be placed at the angle and distance expected for the actual installation, and data are collected for a calibration reference object. A processing device 103 (for example, a computer running an application program dedicated to calibration) may also be provided; after the camera device and the radar device have collected a set of data about the calibration reference object, the data are transmitted to the processing device, which performs the calibration operation and obtains the calibration result. The calibration result may be a set of values describing the angles (in multiple directions) and distances (in multiple directions) of the camera device coordinate system relative to the radar device coordinate system. After calibration is completed, the camera device and radar device can be installed in the actual application scene at the same angle and translation distance used during calibration, and the calibration result can be provided to the data-fusion processing system of that scene. The data-fusion processing system then fuses the data actually collected by the camera device and the radar device according to the calibration result; that is, the data collected by the camera device are rotated and translated according to the calibration result, fused with the data collected by the radar device, and environment perception is performed on the fused result.
The following describes in detail specific implementations provided in embodiments of the present application.
Example one
This embodiment provides an external parameter calibration method, described mainly from the perspective of the processing device shown in FIG. 1. The method may be executed by a computer program running on the processing device, or by a processing module implemented in hardware inside the processing device. Specifically, referring to FIG. 2, the method may include:
S210: acquiring image data collected by a camera device for a target calibration reference object and point cloud data collected by a radar device for the same reference object, where the target calibration reference object comprises a plurality of plane bodies that have preset colors and form preset angles.
the calibration in the embodiment of the present application is specifically performed to obtain a conversion relationship between a coordinate system of a radar device (e.g., lidar) and a coordinate system of a camera device, where an internal parameter of the camera device is known, and the conversion relationship can be represented by a rotational euler angle and a translation distance of the coordinate system of the camera device relative to the coordinate system of the radar device. That is, it is necessary to specify how much the coordinate system of the camera device is rotated in which direction and how much translation is necessary in each translation direction, and to be able to overlap the coordinate system of the radar device. Only under the condition that the coordinate systems of the two systems are overlapped, the data actually acquired by the two systems can be fused, and the perception of a specific environment space is realized.
Specifically, before calibration some preparation may be carried out in advance: the camera device and the radar device are aimed at the calibration reference object so that the radar device can scan every plane and every plane surface is clearly visible to the camera device. The radar device and the camera device are then started, and data are collected for the calibration reference object, yielding one set of data from each device. This set of data may specifically include: the point cloud data collected during one full rotation of the radar device (a radar device generally contains a rotating part, and as it rotates, the emitted laser beams collect 360-degree point cloud data), and one frame of image data extracted from the camera device's footage of the calibration reference object.
In the embodiments of the present application, the calibration reference object consists of a plurality of plane bodies that have preset colors and form preset angles. In a specific implementation, the angles, colors, and so on may be chosen according to actual requirements; in a preferred implementation, a plurality of mutually orthogonal plane bodies may be used as the calibration reference object, that is, the angle between the planes is 90 degrees. Regarding the colors, in one approach each plane body has a single color and different plane bodies have different colors. Alternatively, each plane body has a single color and any two intersecting plane bodies have different colors; that is, it suffices that the boundary between different planes can easily be distinguished by color. For example, as shown in FIG. 3, the calibration reference object may consist of three mutually orthogonal planes, each painted one of the three primary colors: red, green, and blue. Note that the calibration reference object shown in FIG. 3 has three planes in total; it is not a full rectangular parallelepiped but corresponds to a rectangular parallelepiped cut in half along a diagonal, with the inner surfaces of the three remaining planes painted red, green, and blue, respectively. Of course, in practical applications more planes may be used. In general, each plane may have a single color, with different planes differing in color, so that the individual planes can be identified from the collected data. Moreover, because the planes are known a priori to be mutually orthogonal, the position features of a specific point or pixel are also easy to identify, for example, whether it lies on one of the planes, on a boundary line between two of the planes, or on an edge of one of the planes.
S220: processing the point cloud data to determine a plurality of point cloud data sets with different specific position features.
After the set of data collected by the radar device and the camera device has been obtained, the calibration process can be executed with this data. First, the point cloud data are processed to determine a plurality of point cloud data sets with different specific position features.
The data collected by the lidar cover one full rotation, but the calibration reference object occupies only a limited scanning angle; only the region where the reference object is located needs attention, and the rest of the data is invalid. Therefore, an ROI (region of interest) can be roughly delimited to extract the points falling on the calibration reference object, after which the specific position features of individual points are determined.
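A minimal sketch of this ROI step (the box bounds are assumptions that would, in practice, be read once from a visualization of the scan):

```python
import numpy as np

def crop_roi(points, x_range, y_range, z_range):
    # Keep only points inside an axis-aligned box around the reference object.
    m = ((points[:, 0] >= x_range[0]) & (points[:, 0] <= x_range[1]) &
         (points[:, 1] >= y_range[0]) & (points[:, 1] <= y_range[1]) &
         (points[:, 2] >= z_range[0]) & (points[:, 2] <= z_range[1]))
    return points[m]

# e.g. roi_points = crop_roi(scan, (2.0, 5.0), (-1.5, 1.5), (-0.5, 2.0))
```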
The specific position features can take several forms. One case is, for a given point, whether it lies on a particular plane of the calibration reference object. Assuming the reference object comprises three planes A, B, and C, then after the radar device has collected the point cloud data, specific processing operations can identify which points lie on plane A, which on plane B, and which on plane C. That is, the point cloud data may be processed to determine the points lying on each of the different plane bodies, producing a plurality of first point cloud data sets corresponding to the different plane bodies. These feature data can be projected into the image data collected by the camera device, and a calibration result can then be obtained from the degree of coincidence of the projection results. Because the specific position feature here is the plane on which the points lie, during projection the point cloud data in the first point cloud data sets can be projected directly into the three channels of the color original image collected by the camera device.
In one embodiment, the plane equations corresponding to the different plane bodies are obtained by fitting the point cloud data collected by the radar device for the calibration reference object (for example, with the RANSAC algorithm), and the point cloud data belonging to the different plane bodies are segmented accordingly. Of course, the point cloud collected by the lidar may contain very many densely spaced points; matching all of them would be computationally very expensive and is unnecessary. Therefore, in an alternative embodiment, the point cloud data within each plane body may be sampled at preset intervals, according to the plane sizes given by the plane equations, to obtain the first point cloud data sets corresponding to the different plane bodies. For example, with a tri-orthogonal-plane calibration reference object, uniform sampling at an interval la on each plane yields the point sets PS = { p_n^i | i ∈ {0, 1, 2} }, where i indexes the three plane bodies and n denotes the number of points sampled on each plane.
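A minimal RANSAC plane-segmentation sketch in the spirit of this step, reusing roi_points from the ROI sketch above (thresholds, iteration counts, the sequential fit-and-remove strategy, and the stride-based subsampling that stands in for interval sampling are illustrative assumptions):

```python
import numpy as np

def fit_plane_ransac(points, n_iters=500, dist_thresh=0.02, seed=0):
    # Fit one plane (n, d) with ||n|| = 1 by RANSAC; return it with inlier mask.
    rng = np.random.default_rng(seed)
    best_plane, best_inliers = None, None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                          # degenerate (collinear) sample
        n = n / norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_plane, best_inliers = np.append(n, d), inliers
    return best_plane, best_inliers

# Segment the three planes sequentially: fit, remove inliers, repeat;
# then subsample each plane's inliers to keep the matching cheap.
planes, plane_sets = [], []
remaining = roi_points
for _ in range(3):
    plane, inliers = fit_plane_ransac(remaining)
    planes.append(plane)
    pts = remaining[inliers]
    plane_sets.append(pts[:: max(1, len(pts) // 200)])
    remaining = remaining[~inliers]
```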
Note that the fitted plane equations may contain errors; for example, the angles between the fitted planes may not exactly equal the actual angles between the planes of the calibration reference object. To further improve calibration accuracy, the plane equations can be corrected after fitting, using the prior information about the angles between the plane bodies. For example, if the planes of the reference object are mutually orthogonal, the parameters of the plane equations may be adjusted by rotating them in angle steps θ, using the orthogonality prior, until the normals of the plane equations are mutually orthogonal and the summed distance from the points on each plane to their plane is minimal, thereby calibrating the orthogonal plane equations.
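The patent describes an iterative angle-step (θ) adjustment; as a compact alternative that enforces the same orthogonality constraint, the three fitted normals can be snapped to the nearest orthonormal frame via SVD and the plane offsets refit, for example:

```python
import numpy as np

def orthogonalize_planes(planes, plane_sets):
    # Snap the three fitted normals to the nearest mutually orthogonal frame
    # (orthogonal Procrustes via SVD), then refit each plane offset d so the
    # summed squared point-to-plane distance of its inliers is minimized.
    # This is a swapped-in alternative to the angle-step search above.
    N = np.stack([p[:3] for p in planes], axis=1)   # columns are the normals
    U, _, Vt = np.linalg.svd(N)
    N_orth = U @ Vt                                  # nearest orthonormal 3x3
    calibrated = []
    for k, pts in enumerate(plane_sets):
        n = N_orth[:, k]
        d = -float(np.mean(pts @ n))                 # least-squares offset
        calibrated.append(np.append(n, d))
    return calibrated
```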
Besides the relationship between a point and the plane it lies on, the specific position features may also include the relationship between a point and a feature line it lies on, where a feature line may be an intersection line between two planes, an edge of a single plane, and so on. Therefore, when determining the plurality of point cloud data sets with different specific position features, the points lying on different feature lines can also be determined, yielding a plurality of second point cloud data sets corresponding to the different feature lines.
There are several ways to determine the points lying on the different feature lines. For example, in one approach, based on the previously fitted plane equations of the different plane bodies, the equation of the intersection line between each pair of planes and the equations of the plane edges are computed, and the points on each line are then sampled at preset intervals to obtain the second point cloud data sets corresponding to the different lines.
For example, with the tri-orthogonal-plane calibration reference object, the pairwise intersection-line equations and the plane edge equations are obtained from the three plane equations and the plane sizes, and uniform sampling at an interval lb on each line generates the point sets PL = { p_m^j | j ∈ {0, 1, 2, …, 9} }, where j indexes the plane edges plus the intersection lines and m denotes the number of points sampled on each line.
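A sketch of deriving and sampling one intersection line from two calibrated plane equations (the sampling range would come from the known plane sizes; names are assumptions):

```python
import numpy as np

def intersection_line(plane_a, plane_b):
    # Line of intersection of two planes as (point_on_line, unit_direction).
    na, da = plane_a[:3], plane_a[3]
    nb, db = plane_b[:3], plane_b[3]
    direction = np.cross(na, nb)
    direction = direction / np.linalg.norm(direction)
    # Point satisfying both plane equations, anchored by direction . x = 0.
    A = np.stack([na, nb, direction])
    x0 = np.linalg.solve(A, np.array([-da, -db, 0.0]))
    return x0, direction

def sample_line(x0, direction, t_min, t_max, step):
    # Uniform samples x0 + t * direction at the interval `step` (i.e. lb).
    t = np.arange(t_min, t_max, step)
    return x0[None, :] + t[:, None] * direction[None, :]
```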
S230: projecting the point cloud data in the collection into an image acquired by the camera device;
after obtaining the specific position feature where the specific point cloud data is located and obtaining a plurality of point cloud data sets (where the point cloud data in the same set have the same position feature, for example, are located on the same plane, or on the same straight line, etc.), the point cloud data in the sets may be projected into the image collected by the camera device.
During projection, either the plane features or the line features of the point cloud data can be used on their own for projection and calibration, or the two can be combined to obtain a more accurate calibration result.
Specifically, if the plane features are used for projection, the point cloud data in the first point cloud data sets are projected into the three channels of the color original image collected by the camera device. Since each pixel is represented by three values, such an image is called a three-channel image. RGB images, for example, are three-channel images; the RGB color model is an industry color standard in which colors are obtained by varying and superimposing the red (R), green (G), and blue (B) channels. In the embodiments of the present application, the different planes of the calibration reference object have different colors, and the pixels on one plane share the same color and hence the same three channel values, so it can be determined which pixels lie on which plane. Therefore, after the plane features of the radar points have been determined and the points projected into the three-channel image, the degree of coincidence between the camera device and radar device coordinate systems can be judged from the degree of overlap between the points and their projection results. If the overlap is insufficient, it can be increased by gradually adjusting the angles, translation distances, and so on; when the overlap reaches its maximum, the calibration result is obtained.
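A sketch of such a plane-feature overlap score, reusing project_points from the pinhole sketch above (the dominant-channel color test and all names are illustrative assumptions):

```python
import numpy as np

def plane_overlap_score(plane_sets, plane_colors, image, pose, K):
    # Count sampled lidar points that project onto pixels of "their" color.
    # pose = (roll, pitch, yaw, x, y, z); plane_colors: one RGB triple per plane.
    roll, pitch, yaw, tx, ty, tz = pose
    t = np.array([tx, ty, tz])
    h, w = image.shape[:2]
    score = 0
    for pts, color in zip(plane_sets, plane_colors):
        uv = np.round(project_points(pts, roll, pitch, yaw, t, K)).astype(int)
        ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
        pix = image[uv[ok, 1], uv[ok, 0]]     # RGB values at the hit pixels
        # Dominant-channel test: a red-plane point should land on a red pixel.
        score += int(np.sum(pix.argmax(axis=1) == int(np.argmax(color))))
    return score
```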
If instead the line features of the point cloud data are used for projection, the image data collected by the camera device are preprocessed in advance to obtain image data carrying feature-line information, and the point cloud data in the second point cloud data sets are then projected into that image.
There are several ways to obtain image data with feature-line information. In one implementation, the color image collected by the camera device is converted to grayscale; within a first preset-size neighborhood of each pixel, the gradients of the gray values are computed, and the maximum gradient within the neighborhood is taken as the gradient value of that pixel, producing a gradient image from which the feature-line information is determined. The gradient image has the same size as the original image collected by the camera device, except that each pixel carries a gray-gradient value instead of three color-channel values. This gray gradient represents the difference in gray level between a pixel and its neighboring pixels. In the embodiments of the present application, pixels on the same plane of the calibration reference object share one color while different planes have different colors; thus, if a pixel is not on the boundary between two planes, its gray gradient with respect to the surrounding pixels is 0, while if it lies on the boundary between two planes or on the edge of a plane, the gray gradient between it and one of its 8 surrounding pixels may be 255 or some other non-zero value. Computing the gray gradients therefore reveals which pixels lie on the boundary lines between two planes or on plane edges, and which boundary or edge each belongs to.
Once the line features in the image data have been obtained, the radar points carrying line features can be projected into the image data with the feature-line information. However, when the gray gradient is used to identify line-feature pixels, the identified boundary lines may be very "narrow", for example, only one pixel wide, which is disadvantageous for the subsequent projection matching. Therefore, in a preferred embodiment, after the gray-gradient image has been computed, inverse depth transformation information may additionally be computed for each pixel over a second preset-size neighborhood (e.g., a 24-pixel neighborhood), producing an inverse depth transformation map from which the feature-line information is determined. The inverse depth transformation spreads out the pixels at the boundary lines between planes, making the boundary lines "wider" and easier to compare against in the subsequent steps. The inverse depth transformation map also has the same size as the original image, and each pixel again carries gradient-like information, but the "lines" are widened relative to the gray-gradient map. For example, where the gray-gradient values of a row of pixels might be (… 0, 0, 0, 255, 0, 0, 0 …), after the transformation they might become (… 0, 0, 50, 100, 255, 100, 50, 0, 0 …); the number of pixels counted as lying near the boundary line increases, so the calibration result can be obtained more efficiently during the projection comparison.
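A minimal numpy sketch of the gradient image and the widening step (the neighborhood-maximum rule follows the description above; the decay profile is an assumption):

```python
import numpy as np

def gray_gradient_image(gray, radius=1):
    # Per-pixel gradient: max absolute gray difference to any pixel in the
    # (2*radius+1)^2 - 1 neighborhood (the 8-neighborhood for radius=1).
    g = gray.astype(np.float32)
    grad = np.zeros_like(g)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            shifted = np.roll(np.roll(g, dy, axis=0), dx, axis=1)
            grad = np.maximum(grad, np.abs(g - shifted))
    return grad

def inverse_depth_transform(grad, radius=2, decay=0.5):
    # Widen the one-pixel edge response: each pixel takes the maximum of its
    # neighbors' gradients attenuated by decay**distance (the 24-neighborhood
    # for radius=2). The decay factor is an illustrative assumption.
    out = grad.copy()
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            dist = max(abs(dx), abs(dy))
            if dist == 0:
                continue
            shifted = np.roll(np.roll(grad, dy, axis=0), dx, axis=1)
            out = np.maximum(out, shifted * decay ** dist)
    return out
```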
S240: and determining a calibration result according to the overlapping degree between the point cloud data in the set and the projection result in the image acquired by the camera equipment.
After the point cloud data in the sets have been projected into the image collected by the camera device, the calibration result is determined from the degree of overlap between the points and their projection results. Specifically, initial values for the rotational Euler angles and the translation vector are chosen first, for example, from the approximate angle and distance at which the camera device and the radar device are installed. The degree of overlap between the points in a given set and the projection result is computed at the initial values; a search is then performed near the initial values with preset angle and translation step sizes, recomputing the degree of overlap at each step. The rotational Euler angles and translation distances at which the degree of overlap meets a preset condition (for example, reaches its maximum) are then taken as the calibration result.
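A sketch of this neighborhood search over the six pose parameters (roll, pitch, yaw, x, y, z); an exhaustive evaluation of a small grid stands in for the greedy strategy described below, and the grid extent is an assumption, since the patent only specifies the step sizes and searching near the initial value:

```python
from itertools import product

def greedy_search(score_fn, init_pose, angle_step, trans_step, n_steps=2):
    # Score every pose on a small 6-D grid around init_pose and keep the best.
    best_pose, best_score = tuple(init_pose), score_fn(tuple(init_pose))
    for d in product(range(-n_steps, n_steps + 1), repeat=6):
        pose = tuple(p + (angle_step if i < 3 else trans_step) * k
                     for i, (p, k) in enumerate(zip(init_pose, d)))
        s = score_fn(pose)
        if s > best_score:
            best_pose, best_score = pose, s
    return best_pose
```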
For example, when the point cloud data in the first point cloud data sets (plane features) are used for projection, a calibration result is determined from the rotational Euler angles and translation vector at which the degree of overlap between those points and their projections in the three-channel image meets the preset condition.
When the point cloud data in the second point cloud data sets (line features) are used for projection, the calibration result is likewise determined from the rotational Euler angles and translation vector at which the degree of overlap between those points and their projections in the image data with feature-line information meets the preset condition. The image data with feature-line information may be the gray-gradient map, the inverse depth transformation map, or the like.
Alternatively, the first point cloud data sets (plane features) and the second point cloud data sets (line features) can be combined for a more accurate result. Specifically, the points in the first point cloud data sets are projected into the three-channel image; then, with the approximate installation angle and translation distance as initial values, a search is performed with preset angle and translation steps, and when the degree of overlap between the points in the first sets and their projections in the three-channel image meets the preset condition, a first calibration result is determined from the corresponding rotational Euler angles and translation vector. The result obtained at this stage may be called the "coarse calibration" result. In a specific implementation, the R, G, and B channel maps are first extracted from the color original image collected by the camera device; then, using the installation prior of the camera device and the radar device, approximate initial values of the rotational Euler angles and translation vector are given, and the points in the first point cloud data sets are projected into the three channel images using the camera pinhole imaging model. A greedy search is performed near the initial values with an angle step of 0.5 degrees and a translation step of 10 cm to find the values at which the overlap between the points in the first sets and the red, green, and blue planes in the image is maximal, yielding the coarse calibration Result1 = { roll1, pitch1, yaw1, x1, y1, z1 }.
Next, the points in the second point cloud data sets are projected into the inverse depth transformation map. With the first calibration result as the initial value, a second calibration result is determined from the rotational Euler angles and translation vector at which the degree of overlap between the points in the second sets and their projections in the image with feature-line information meets the preset condition. Since this further improves the precision of the coarse result, it may be called the "fine calibration" result. For example, starting from the coarse Result1, the points in the second point cloud data sets are projected into the inverse depth transformation map using the camera pinhole imaging model, and a greedy search is performed near Result1 with an angle step of 0.05 degrees and a translation step of 1 cm to find the values at which the projections of the second point sets overlap the feature lines in the inverse depth transformation map maximally, yielding the final fine calibration Result = { roll, pitch, yaw, x, y, z }.
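Putting the two stages together with the sketches above (plane_colors, rgb_image, line_sets, idt_image, and line_overlap_score are hypothetical names; the initial pose echoes the 30-degree / 50 cm installation example from the background section, and all numeric values are illustrative):

```python
import numpy as np

# Coarse stage: plane-color overlap, steps of 0.5 degrees / 10 cm around the
# installation prior (angles in radians).
coarse = greedy_search(
    lambda p: plane_overlap_score(plane_sets, plane_colors, rgb_image, p, K),
    init_pose=(0.0, np.deg2rad(30.0), 0.0, 0.0, 0.5, 0.0),
    angle_step=np.deg2rad(0.5), trans_step=0.10)

# Fine stage: line overlap on the inverse depth transformation map, steps of
# 0.05 degrees / 1 cm around the coarse result. line_overlap_score is a
# hypothetical counterpart that sums the map values at the projected line
# points instead of counting color hits.
fine = greedy_search(
    lambda p: line_overlap_score(line_sets, idt_image, p, K),
    init_pose=coarse,
    angle_step=np.deg2rad(0.05), trans_step=0.01)
```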
In summary, according to the embodiments of the present application, a calibration reference object can be composed of plane bodies that have preset colors and form preset angles; data are collected for this reference object by a camera device and a radar device; a plurality of point cloud data sets with different specific position features are determined from the point cloud data and projected into the image collected by the camera device; and a calibration result is determined according to the degree of overlap between the point cloud data in the sets and their projection results in the image. Because the calibration reference object is a three-dimensional object rather than a two-dimensional calibration plate, the radar device can achieve higher data-collection precision without resorting to a higher-precision sensor. Because the planes carry the prior information of preset angles and distinct colors, the position features of specific point cloud data, such as whether a point lies on a certain plane or on a certain feature line, can be recovered from the data collected by the radar device; by projecting the points into the image collected by the camera device and gradually adjusting the angles and translation distances until the degree of overlap meets a preset condition, the corresponding calibration result is obtained. And because the calibration reference object provides multiple planes, feature planes at multiple angles can be extracted at once, so the camera device and the radar device need to perform only a single data-collection operation in the whole calibration process to obtain a satisfactory result, with no repeated collection or computation, which improves efficiency.
In addition, in a preferred implementation, the features of the image collected by the camera device after a gray-gradient inverse depth transformation can be matched against the laser point cloud data. Compared with using image edge features directly, this makes the matching optimization more stable and makes it easier to reach the globally optimal solution.
Moreover, using the same set of calibration data, the final calibration result can be solved in two steps, coarse calibration followed by fine calibration, which yields higher calibration precision than the traditional scheme of computing the calibration result in a single pass.
Example two
The second embodiment provides a calibration reference object. The calibration reference object comprises a plurality of plane bodies that have preset colors and form preset angles, and is used for calibrating the rotation-translation relationship between the coordinate systems of a camera device and a radar device installed in the same environment perception system.
In a specific implementation, the target calibration reference object may comprise a plurality of mutually orthogonal plane bodies.
In the target calibration reference object, each plane body may have a single color, with different plane bodies having different colors.
Alternatively, each plane body has a single color, and any two intersecting plane bodies have different colors.
Specifically, referring to FIG. 3, there may be three plane bodies, mutually orthogonal, each painted one of the three primary colors: red, green, and blue. Of course, other colors are possible in practical applications, as long as they are easy to distinguish.
Example three
The third embodiment provides an environment perception system from the perspective of a specific application scenario. The environment perception system itself may serve several more specific applications, for example, road-condition perception, or a robot device's perception of its surroundings. Specifically, referring to FIG. 4, the system may include:
a camera device 410, configured to collect image data of a target environment;
a radar device 420, configured to collect point cloud data of the target environment; and
a data processing device 430, configured to pre-store a calibration result for the conversion relationship between the camera device coordinate system and the radar device coordinate system, and to fuse the data collected by the camera device and the radar device according to the calibration result to obtain a perception result of the target environment, where the calibration result is obtained through a target calibration reference object comprising a plurality of plane bodies that have preset colors and form preset angles.
For example, in a specific application scenario, the target environment may be a target road environment; that is, perception of the road environment is achieved by deploying a camera device and a radar device, so as to guide autonomous driving of vehicles or to provide real-time road-condition information to drivers.
In a specific implementation, as shown in FIG. 5, a plurality of roadside units (RSUs) are deployed in the target road environment in a preset arrangement.
In this case, the camera device, the radar device, and the data processing device may be deployed on an RSU, and a wireless communication module may also be deployed on the RSU.
The data processing device may then be further configured to broadcast the perception result of the target road environment via the wireless communication module on the RSU.
Accordingly, traffic participants such as vehicles can receive the road-condition information through their associated terminal devices (mobile terminals, vehicle-mounted terminals, etc.), and autonomous vehicles can make driving decisions from it.
Alternatively, the camera device, radar device, and data processing device may be deployed on a traffic participant in the target road environment. In that case, if the traffic participant is of the autonomous-driving type, the data processing device may be further configured to make driving decisions according to the perception result of the target road environment.
Example four
The fourth embodiment corresponds to the third embodiment and provides an environment perception method from the perspective of the data processing device. Referring to FIG. 6, the method may include:
S610: pre-storing a calibration result for the conversion relationship between a camera device coordinate system and a radar device coordinate system, where the calibration result is obtained through a target calibration reference object comprising a plurality of plane bodies that have preset colors and form preset angles;
S620: fusing the data collected by the camera device and the radar device according to the calibration result to obtain a perception result of the target environment.
For the parts that are not described in detail in the second to fourth embodiments, reference may be made to the description in the first embodiment, which is not described herein again.
Corresponding to the first embodiment, an embodiment of the present application further provides an external reference calibration apparatus. Referring to fig. 7, the apparatus may specifically include:
a data obtaining unit 710, configured to obtain image data acquired by a camera device for a target calibration reference object and point cloud data acquired by a radar device for the target calibration reference object, where the target calibration reference object includes a plurality of plane bodies having preset colors and forming preset angles;
a point cloud data set determining unit 720, configured to perform processing according to the point cloud data, and determine multiple point cloud data sets with different specific location features;
a projection unit 730 for projecting the point cloud data in the set into an image acquired by the camera device;
a calibration result determining unit 740, configured to determine a calibration result according to an overlap between the point cloud data in the set and the projection result in the image acquired by the camera device.
The camera device can simultaneously acquire images of the plurality of plane bodies, and the radar device can scan the plurality of plane bodies.
In one implementation manner, the point cloud data set determining unit may be specifically configured to:
processing the point cloud data to determine point cloud data respectively located on different plane bodies, and determining a plurality of first point cloud data sets corresponding to the different plane bodies;
the projection unit may specifically be configured to:
respectively projecting the point cloud data in the plurality of first point cloud data sets onto the three channels of the original color image acquired by the camera device.
In determining the point cloud data respectively located on different plane bodies, the point cloud data set determining unit may specifically include:
the fitting subunit is used for fitting according to the point cloud data collected by the radar equipment and related to the calibration reference object to obtain plane equations respectively corresponding to a plurality of different plane bodies, and dividing the point cloud data respectively belonging to the different plane bodies;
and the first sampling subunit is used for sampling the point cloud data within each plane body at preset intervals according to the plane size information given by the plane equations, to obtain first point cloud data sets respectively corresponding to the different plane bodies.
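As a non-limiting illustration of the fitting and first sampling subunits, the sketch below peels the three dominant planes off the point cloud with a simple RANSAC loop; interval sampling within each segmented plane would follow. The patent does not prescribe a fitting algorithm, so the RANSAC scheme, thresholds, and names are assumptions of this sketch.

import numpy as np

def fit_plane_ransac(points, n_iters=500, dist_thresh=0.02, rng=None):
    """Fit n.x + d = 0 by RANSAC; return (normal, d, inlier_mask)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    best_inliers, best_model = None, None
    for _ in range(n_iters):
        # Three random points define a candidate plane.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                              # degenerate (collinear) sample
        n = n / norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model[0], best_model[1], best_inliers

def segment_three_planes(points):
    """Segment the three dominant plane bodies one after another."""
    planes, remaining = [], points
    for _ in range(3):
        n, d, inliers = fit_plane_ransac(remaining)
        planes.append((n, d, remaining[inliers]))
        remaining = remaining[~inliers]           # remove the segmented points
    return planes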
In order to further improve the precision, the device can further comprise:
and the calibration unit is used for calibrating the plane equation according to the angle prior information among the plurality of plane bodies after the plane equation is fitted.
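One plausible way to apply such an angle prior, assuming the three plane bodies are nominally mutually orthogonal, is to snap the three fitted normals to the nearest exactly orthonormal triad (a small Procrustes problem solved by SVD). The embodiment only states that the plane equations are calibrated with the angle prior information, so this particular correction is an assumption of the sketch.

import numpy as np

def orthogonalize_normals(n1, n2, n3):
    """Return the orthonormal triad closest (in Frobenius norm) to the fitted normals."""
    N = np.stack([n1, n2, n3], axis=1)   # 3x3 matrix whose columns are the normals
    U, _, Vt = np.linalg.svd(N)
    R = U @ Vt                           # nearest orthogonal matrix to N
    if np.linalg.det(R) < 0:             # avoid returning a reflection
        U[:, -1] *= -1
        R = U @ Vt
    return R[:, 0], R[:, 1], R[:, 2]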
At this time, the calibration result determining unit may specifically be configured to:
searching is carried out near preset initial values of the rotation Euler angles and the translation vector according to preset angle and translation step sizes, and when the degree of overlap between the point cloud data in the first point cloud data sets and their projections in the three-channel image meets a preset condition, a first calibration result is determined from the corresponding rotation Euler angles and translation vector.
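The search just described can be pictured with the sketch below: a small six-dimensional grid is walked around the initial Euler angles and translation, each candidate pose projects the per-plane point sets into the color image, and the pose whose projections best land on the matching color channels wins. The scoring rule (mean intensity of the matching channel at the projected pixels), the Euler convention, and the step sizes are assumptions of this sketch.

import itertools
import numpy as np

def euler_to_R(rx, ry, rz):
    """Rotation matrix from x-y-z Euler angles in radians (sketch convention)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def overlap_score(point_sets, channels, image, K, R, t):
    """Sum over planes of the mean matching-channel intensity under projection."""
    h, w = image.shape[:2]
    score = 0.0
    for pts, ch in zip(point_sets, channels):     # e.g. channels = (0, 1, 2) for R, G, B
        cam = pts @ R.T + t
        cam = cam[cam[:, 2] > 0]
        if len(cam) == 0:
            continue
        uv = cam @ K.T
        uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
        ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
        if ok.any():
            score += image[uv[ok, 1], uv[ok, 0], ch].mean()
    return score

def grid_search(point_sets, channels, image, K, euler0, t0,
                angle_step=np.deg2rad(0.5), trans_step=0.02, half_width=2):
    """Exhaustively score a small 6-D grid around the initial extrinsics."""
    offsets = range(-half_width, half_width + 1)
    best_score, best_R, best_t = -np.inf, None, None
    for da in itertools.product(offsets, repeat=3):        # Euler-angle offsets
        R = euler_to_R(*(np.asarray(euler0) + np.asarray(da) * angle_step))
        for dt in itertools.product(offsets, repeat=3):    # translation offsets
            t = np.asarray(t0) + np.asarray(dt) * trans_step
            s = overlap_score(point_sets, channels, image, K, R, t)
            if s > best_score:
                best_score, best_R, best_t = s, R, t
    return best_R, best_t, best_score

A coarse-to-fine schedule (repeating the search with shrinking step sizes) is a natural extension and appears consistent with the two-stage search the embodiments describe.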
In addition, in a preferred implementation, the point cloud data set determining unit may be further configured to:
processing the point cloud data, determining point cloud data respectively positioned on different characteristic straight lines, and determining a plurality of second point cloud data sets corresponding to the different characteristic straight lines, wherein the characteristic straight lines comprise intersecting lines among a plurality of orthogonal surfaces and/or boundary lines of each plane body;
the apparatus may further include:
the image processing unit is used for processing the image data acquired by the camera equipment to obtain image data with characteristic straight line information;
the projection unit is further configured to: after the first calibration result is determined, respectively projecting the point cloud data in the plurality of second point cloud data sets into the image with the characteristic straight line information;
the calibration result determination unit is further configured to: search near the first calibration result according to preset angle and translation step sizes, and, when the degree of overlap between the point cloud data in the second point cloud data sets and their projections onto the characteristic straight lines in the image meets a preset condition, determine a second calibration result from the corresponding rotation Euler angles and translation vector.
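For this second, refining search stage, a plausible scoring function is sketched below: the sampled line points are projected with a candidate pose, and the characteristic-straight-line image (for example, the gradient or inverse-depth-transform image discussed later) is averaged at the projected pixels, so poses that align the projected lines with strong image lines score higher. The projection model and names are assumptions carried over from the earlier sketches.

import numpy as np

def line_overlap_score(line_point_sets, feature_image, K, R, t):
    """Mean characteristic-line-image response at the projected line points."""
    h, w = feature_image.shape
    responses = []
    for pts in line_point_sets:
        cam = pts @ R.T + t
        cam = cam[cam[:, 2] > 0]
        if len(cam) == 0:
            continue
        uv = cam @ K.T
        uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
        ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
        responses.append(feature_image[uv[ok, 1], uv[ok, 0]])
    vals = np.concatenate(responses) if responses else np.empty(0)
    return float(vals.mean()) if vals.size else 0.0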
In another mode, the point cloud data set determining unit may be specifically configured to:
processing the point cloud data, determining point cloud data respectively positioned on different characteristic straight lines, and determining a plurality of second point cloud data sets corresponding to the different characteristic straight lines, wherein the characteristic straight lines comprise intersecting lines among a plurality of orthogonal surfaces and/or boundary lines of each plane body;
the apparatus may further include:
the image processing unit is used for processing the image data acquired by the camera equipment to obtain image data with characteristic straight line information;
the projection unit may specifically be configured to:
and respectively projecting the point cloud data in the plurality of second point cloud data sets into the image with the characteristic straight line information.
In this mode, for processing the point cloud data, the point cloud data set determining unit may specifically include:
the fitting unit is used for fitting according to the point cloud data which is collected by the radar equipment and is related to the calibration reference object to obtain plane equations which respectively correspond to a plurality of different plane bodies;
the boundary line equation determining unit is used for calculating an intersection line equation between every two planes and a boundary line equation of the plane body according to the plane equations corresponding to the different plane bodies;
and the second sampling unit is used for sampling the point cloud data on each straight line at preset intervals, to obtain second point cloud data sets respectively corresponding to the different straight lines.
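A minimal sketch of the core computation behind the boundary line equation determining unit follows, assuming planes given as n.x + d = 0: the intersection direction is the cross product of the two normals, a point on the line solves a small linear system, and points can then be taken along the line at a preset interval for the second sampling unit. Names and the sampling range are assumptions of this sketch.

import numpy as np

def plane_intersection_line(n1, d1, n2, d2):
    """Planes n.x + d = 0; return (point_on_line, unit_direction)."""
    direction = np.cross(n1, n2)
    direction = direction / np.linalg.norm(direction)
    # Solve n1.x = -d1, n2.x = -d2, direction.x = 0 to pin down one point on the line.
    A = np.stack([n1, n2, direction])
    b = np.array([-d1, -d2, 0.0])
    point = np.linalg.solve(A, b)
    return point, direction

def sample_line(point, direction, t_min, t_max, step=0.05):
    """Sample points along the line at a preset interval."""
    ts = np.arange(t_min, t_max, step)
    return point + ts[:, None] * direction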
Wherein the image processing unit may specifically be configured to:
carrying out gray-scale processing on the color image data acquired by the camera device, calculating gradient values of the gray values within a neighborhood of a first preset size around each pixel, and taking the maximum gradient value over the neighborhood as the gradient value of the corresponding pixel, so as to obtain a gradient image from which the characteristic straight line information is determined.
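The gradient-image step can be sketched as follows, assuming a 3x3 neighborhood and absolute gray-level differences as the gradient measure; the embodiment itself only fixes the neighborhood as a first preset size.

import numpy as np

def max_neighborhood_gradient(gray, radius=1):
    """Per-pixel maximum absolute gray difference over a (2*radius+1)^2 neighborhood."""
    g = gray.astype(float)
    grad = np.zeros_like(g)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            shifted = np.roll(np.roll(g, dy, axis=0), dx, axis=1)
            # Keep the largest difference seen so far (edges wrap here; a real
            # implementation would pad the borders instead).
            grad = np.maximum(grad, np.abs(g - shifted))
    return grad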
Further, in a preferred implementation, the image processing unit may be further configured to:
calculating inverse depth transformation information for each pixel within a neighborhood of a second preset size in the gradient image, so as to obtain an inverse depth transformation image from which the characteristic straight line information is determined.
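What the embodiment calls an inverse depth transformation behaves, on one natural reading, like the inverse distance transform used in lidar-camera calibration pipelines: each pixel scores high near a strong edge, and the score decays with distance from it. The sketch below assumes that interpretation and SciPy's Euclidean distance transform; both are assumptions of this sketch rather than the patent's prescription.

import numpy as np
from scipy.ndimage import distance_transform_edt

def inverse_distance_image(grad_image, edge_thresh=0.2, decay=0.98):
    """Score each pixel by an exponentially decayed distance to the nearest edge."""
    edges = grad_image > edge_thresh
    # distance_transform_edt measures distance to the nearest zero, so invert the
    # mask to get, for every pixel, the distance to the nearest edge pixel.
    dist = distance_transform_edt(~edges)
    return decay ** dist                 # 1.0 on edges, decaying smoothly away from them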
Corresponding to the fourth embodiment, an environment sensing apparatus is further provided in the embodiments of the present application, and referring to fig. 8, the apparatus may specifically include:
a storage unit 810, configured to pre-store a calibration result of a conversion relationship between a camera device coordinate system and a radar device coordinate system, where the calibration result is obtained through a target calibration reference object, and the target calibration reference object includes a plurality of plane bodies having preset colors and forming preset angles;
and the sensing unit 820 is configured to fuse the data acquired by the camera device and the radar device according to the calibration result to obtain a sensing result of the target environment.
In addition, an embodiment of the present application further provides an electronic device, including:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
acquiring image data acquired by a camera device for a target calibration reference object and point cloud data acquired by a radar device for the target calibration reference object, wherein the target calibration reference object comprises a plurality of plane bodies with preset colors and preset angles;
processing the point cloud data to determine a plurality of point cloud data sets with different specific position characteristics;
projecting the point cloud data in the collection into an image acquired by the camera device;
and determining a calibration result according to the overlapping degree between the point cloud data in the set and the projection result in the image acquired by the camera equipment.
And another electronic device, comprising:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
pre-storing a conversion relation calibration result between a camera equipment coordinate system and a radar equipment coordinate system, wherein the calibration result is obtained through a target calibration reference object, and the target calibration reference object comprises a plurality of plane bodies which have preset colors and form preset angles;
and fusing the data collected by the camera equipment and the radar equipment according to the calibration result to obtain a perception result of the target environment.
Fig. 9 illustrates an architecture of an electronic device, which may specifically include a processor 910, a video display adapter 911, a disk drive 912, an input/output interface 913, a network interface 914, and a memory 920. The processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, and the network interface 914 may be communicatively connected to the memory 920 via a communication bus 930.
The processor 910 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solution provided in the present application.
The memory 920 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 920 may store an operating system 921 for controlling the operation of the electronic device 900 and a Basic Input Output System (BIOS) for controlling low-level operations of the electronic device 900. In addition, a web browser 923, a data storage management system 924, a calibration processing system 925, and the like may also be stored. The calibration processing system 925 may be an application program that implements the operations of the foregoing steps in this embodiment of the application. In short, when the technical solution provided in the present application is implemented by software or firmware, the relevant program code is stored in the memory 920 and invoked by the processor 910 for execution.
The input/output interface 913 is used to connect an input/output module to realize information input and output. The input/output module may be configured as a component within the device (not shown in the figure) or may be externally connected to the device to provide corresponding functions. Input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The network interface 914 is used for connecting a communication module (not shown in the figure) to implement communication interaction between the present device and other devices. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
The bus 930 includes a path to transfer information between the various components of the device, such as the processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, the network interface 914, and the memory 920.
It should be noted that although the above-described device only shows the processor 910, the video display adapter 911, the disk drive 912, the input/output interface 913, the network interface 914, the memory 920, the bus 930, and so on, in a specific implementation the device may also include other components necessary for normal operation. Furthermore, it will be understood by those skilled in the art that the device described above may also include only the components necessary to implement the solution of the present application, and not necessarily all of the components shown in the figures.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments of the present application.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system or system embodiments are substantially similar to the method embodiments and therefore are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described system and system embodiments are only illustrative, wherein the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The external reference calibration method, the external reference calibration apparatus, and the electronic device provided by the present application have been introduced in detail above. Specific examples are used herein to explain the principles and implementations of the application, and the description of the embodiments is only intended to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present application, make changes to the specific embodiments and the scope of application. In view of the above, the contents of this specification should not be construed as limiting the application.

Claims (29)

1. An external reference calibration method is characterized by comprising the following steps:
acquiring image data acquired by a camera device for a target calibration reference object and point cloud data acquired by a radar device for the target calibration reference object, wherein the target calibration reference object comprises a plurality of plane bodies with preset colors and preset angles;
processing the point cloud data to determine a plurality of point cloud data sets with different specific position characteristics;
projecting the point cloud data in the collection into an image acquired by the camera device;
and determining a calibration result according to the overlapping degree between the point cloud data in the set and the projection result in the image acquired by the camera equipment.
2. The method of claim 1,
the target calibration reference object comprises a plurality of mutually orthogonal plane bodies.
3. The method of claim 1,
in the target calibration reference object, the same plane body has the same color, and different plane bodies have different colors.
4. The method of claim 1,
in the target calibration reference object, the same plane body has the same color, and two plane bodies with an intersection relation have different colors.
5. The method of claim 1,
the camera device can acquire images of the plurality of plane bodies simultaneously, and the radar device can scan the plurality of plane bodies.
6. The method of claim 1,
the processing according to the point cloud data to determine a plurality of point cloud data sets with different specific position characteristics comprises:
processing the point cloud data to determine point cloud data respectively located on different plane bodies, and determining a plurality of first point cloud data sets corresponding to the different plane bodies;
the projecting point cloud data in the first point cloud data set into an image acquired by the camera device comprises:
and respectively projecting the point cloud data in the plurality of first point cloud data sets to a three-channel image of the color original image acquired by the camera equipment.
7. The method of claim 6,
the determining of the point cloud data respectively located on different plane bodies comprises the following steps:
fitting according to the point cloud data collected by the radar equipment and related to the calibration reference object to obtain plane equations respectively corresponding to a plurality of different plane bodies, and segmenting the point cloud data respectively belonging to the different plane bodies;
and respectively sampling the point cloud data in each plane body according to the size information of the planes of the plane equations and preset intervals to obtain first point cloud data sets respectively corresponding to different plane bodies.
8. The method of claim 7, further comprising:
and after the plane equation is fitted, calibrating the plane equation according to the angle prior information among the plurality of plane bodies.
9. The method of claim 6,
determining a calibration result according to a degree of overlap between point cloud data in the set and a projection result in an image acquired by the camera device, including:
searching is carried out near a preset initial value of a rotational Euler angle and an initial value of a translation vector according to a preset angle step length and a translation step length, and when the overlapping degree between the point cloud data in the first point cloud data set and the projection in the three-channel image meets a preset condition, a first calibration result is determined according to the corresponding rotational Euler angle and the translation vector.
10. The method of claim 9, further comprising:
processing the point cloud data, determining point cloud data respectively positioned on different characteristic straight lines, and determining a plurality of second point cloud data sets corresponding to the different characteristic straight lines, wherein the characteristic straight lines comprise intersecting lines among a plurality of orthogonal surfaces and/or boundary lines of each plane body;
processing the image data acquired by the camera equipment to obtain image data with characteristic straight line information;
after the first calibration result is determined, respectively projecting the point cloud data in the plurality of second point cloud data sets into the image with the characteristic straight line information;
and searching according to a preset angle step length and a translation step length near the first calibration result, and determining a second calibration result according to a corresponding rotational Euler angle and a translation vector when the overlapping degree between the point cloud data in the second point cloud data set and the projection on the characteristic straight line in the image meets a preset condition.
11. The method of claim 1,
the processing according to the point cloud data to determine a plurality of point cloud data sets with different specific position characteristics comprises:
processing the point cloud data, determining point cloud data respectively positioned on different characteristic straight lines, and determining a plurality of second point cloud data sets corresponding to the different characteristic straight lines, wherein the characteristic straight lines comprise intersecting lines among a plurality of orthogonal surfaces and/or boundary lines of each plane body;
the method further comprises the following steps:
processing the image data acquired by the camera equipment to obtain image data with characteristic straight line information;
the projecting the point cloud data in the collection into an image captured by the camera device comprises:
and respectively projecting the point cloud data in the plurality of second point cloud data sets into the image with the characteristic straight line information.
12. The method according to claim 10 or 11,
the determining point cloud data respectively located on different characteristic straight lines comprises:
fitting according to the point cloud data collected by the radar equipment and related to the calibration reference object to obtain plane equations respectively corresponding to a plurality of different plane bodies;
calculating an intersection line equation between every two planes and a boundary line equation of the plane bodies according to the plane equations corresponding to the different plane bodies;
and respectively sampling the point cloud data on each straight line at preset intervals to obtain second point cloud data sets respectively corresponding to the different straight lines.
13. The method according to claim 10 or 11,
the processing of the image data acquired by the camera device to obtain the image data with characteristic straight line information includes:
and carrying out gray processing on the color image data acquired by the camera equipment, calculating gradient values of gray values in a first preset number neighborhood of each pixel, and taking the maximum gradient value on one neighborhood as the gradient value on the corresponding pixel to obtain a gradient image so as to determine the characteristic straight line information according to the gradient values.
14. The method of claim 13, further comprising:
and calculating inverse depth transformation information of each pixel in a second preset number neighborhood in the gradient image to obtain an inverse depth transformation image so as to determine the characteristic straight line information according to the inverse depth transformation information.
15. A calibration reference object is characterized in that,
the calibration reference object comprises a plurality of plane bodies with preset colors and preset angles, and is used for calibrating the rotation and translation relation between the coordinate systems of the camera equipment and the radar equipment which are installed in the same environment sensing system.
16. Calibration reference according to claim 15,
the calibration reference object comprises a plurality of mutually orthogonal plane bodies.
17. Calibration reference according to claim 15,
in the calibration reference object, the same plane body has the same color, and different plane bodies have different colors.
18. Calibration reference according to claim 15,
in the calibration reference object, the same plane body has the same color, and two plane bodies having an intersection relationship have different colors.
19. Calibration reference according to claim 15,
the number of the plane bodies is three, the plane bodies are mutually orthogonal, and the color is one of three primary colors.
20. An environment awareness system, comprising:
the camera equipment is used for acquiring image data of a target environment;
the radar equipment is used for carrying out point cloud data acquisition on the target environment;
the data processing device is used for pre-storing a conversion relation calibration result between the camera device coordinate system and the radar device coordinate system, fusing data collected by the camera device and the radar device according to the calibration result and obtaining a perception result of the target environment; the calibration result is obtained through a target calibration reference object, and the target calibration reference object comprises a plurality of plane bodies which have preset colors and form preset angles.
21. The system of claim 20,
the target environment includes a target road environment.
22. The system of claim 21,
the target road environment further comprises a plurality of Road Side Units (RSUs) deployed according to a preset arrangement mode;
the camera equipment, the radar equipment and the data processing equipment are deployed on the RSU, and a wireless communication module is also deployed on the RSU;
the data processing device is further configured to broadcast the sensing result of the target road environment through a wireless communication module on the RSU.
23. The system of claim 21,
the camera device, radar device, and data processing device are deployed on a traffic participant object in the target road environment.
24. The system of claim 23,
the traffic participant object comprises an autopilot-like traffic participant object;
the data processing equipment is further used for making driving decision according to the perception result of the target road environment.
25. An environment awareness method, comprising:
pre-storing a conversion relation calibration result between a camera equipment coordinate system and a radar equipment coordinate system, wherein the calibration result is obtained through a target calibration reference object, and the target calibration reference object comprises a plurality of plane bodies which have preset colors and form preset angles;
and fusing the data collected by the camera equipment and the radar equipment according to the calibration result to obtain a perception result of the target environment.
26. An external reference calibration device, comprising:
the system comprises a data obtaining unit, a data processing unit and a data processing unit, wherein the data obtaining unit is used for obtaining image data acquired by camera equipment on a target calibration reference object and point cloud data acquired by radar equipment on the target calibration reference object, and the target calibration reference object comprises a plurality of plane bodies which have preset colors and form preset angles;
the point cloud data set determining unit is used for processing according to the point cloud data and determining a plurality of point cloud data sets with different specific position characteristics;
a projection unit for projecting the point cloud data in the set into an image acquired by the camera device;
and the calibration result determining unit is used for determining a calibration result according to the overlapping degree between the point cloud data in the set and the projection result in the image acquired by the camera equipment.
27. An environment sensing device, comprising:
the device comprises a storage unit, a calibration unit and a control unit, wherein the storage unit is used for storing a conversion relation calibration result between a camera equipment coordinate system and a radar equipment coordinate system in advance, the calibration result is obtained through a target calibration reference object, and the target calibration reference object comprises a plurality of plane bodies which have preset colors and form preset angles;
and the sensing unit is used for fusing the data acquired by the camera equipment and the radar equipment according to the calibration result to obtain a sensing result of the target environment.
28. An electronic device, comprising:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
acquiring image data acquired by a camera device for a target calibration reference object and point cloud data acquired by a radar device for the target calibration reference object, wherein the target calibration reference object comprises a plurality of plane bodies with preset colors and preset angles;
processing the point cloud data to determine a plurality of point cloud data sets with different specific position characteristics;
projecting the point cloud data in the collection into an image acquired by the camera device;
and determining a calibration result according to the overlapping degree between the point cloud data in the set and the projection result in the image acquired by the camera equipment.
29. An electronic device, comprising:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
pre-storing a conversion relation calibration result between a camera equipment coordinate system and a radar equipment coordinate system, wherein the calibration result is obtained through a target calibration reference object, and the target calibration reference object comprises a plurality of plane bodies which have preset colors and form preset angles;
and fusing the data collected by the camera equipment and the radar equipment according to the calibration result to obtain a perception result of the target environment.
CN201811636001.XA 2018-12-29 2018-12-29 External parameter calibration method and device and electronic equipment Active CN111383279B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811636001.XA CN111383279B (en) 2018-12-29 2018-12-29 External parameter calibration method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN111383279A true CN111383279A (en) 2020-07-07
CN111383279B CN111383279B (en) 2023-06-20

Family

ID=71220951

Country Status (1)

Country Link
CN (1) CN111383279B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049912A (en) * 2012-12-21 2013-04-17 浙江大学 Random trihedron-based radar-camera system external parameter calibration method
CN103983961A (en) * 2014-05-20 2014-08-13 南京理工大学 Three-dimensional calibration target for joint calibration of 3D laser radar and camera
CN104484887A (en) * 2015-01-19 2015-04-01 河北工业大学 External parameter calibration method used when camera and two-dimensional laser range finder are used in combined mode
US20180088228A1 (en) * 2016-09-23 2018-03-29 Baidu Online Network Technology (Beijing) Co., Ltd. Obstacle detection method and apparatus for vehicle-mounted radar system
CN108828606A (en) * 2018-03-22 2018-11-16 中国科学院西安光学精密机械研究所 One kind being based on laser radar and binocular Visible Light Camera union measuring method
CN109100707A (en) * 2018-08-21 2018-12-28 百度在线网络技术(北京)有限公司 Scaling method, device, equipment and the storage medium of radar sensor

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DIANA-MARGARITA CÓRDOVA-ESPARZA et al.: "A multiple camera calibration and point cloud fusion tool for Kinect V2"
贾子永; 任国全; 李冬伟; 程子阳: "Camera and lidar calibration method based on a trapezoidal checkerboard" (in Chinese)
赵松 et al.: "Joint calibration of a scanner and a digital camera based on a stereoscopic calibration target" (in Chinese)
闫利; 曹亮; 陈长军; 黄亮: "Research on registration methods for vehicle-borne panoramic images and laser point cloud data" (in Chinese)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111815717B (en) * 2020-07-15 2022-05-17 西北工业大学 Multi-sensor fusion external parameter combination semi-autonomous calibration method
CN111815717A (en) * 2020-07-15 2020-10-23 西北工业大学 Multi-sensor fusion external parameter combination semi-autonomous calibration method
CN112184828A (en) * 2020-08-21 2021-01-05 北京百度网讯科技有限公司 External parameter calibration method and device for laser radar and camera and automatic driving vehicle
CN112184828B (en) * 2020-08-21 2023-12-05 阿波罗智联(北京)科技有限公司 Laser radar and camera external parameter calibration method and device and automatic driving vehicle
CN113763478B (en) * 2020-09-09 2024-04-12 北京京东尚科信息技术有限公司 Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN113763478A (en) * 2020-09-09 2021-12-07 北京京东乾石科技有限公司 Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN112419420A (en) * 2020-09-17 2021-02-26 腾讯科技(深圳)有限公司 Camera calibration method and device, electronic equipment and storage medium
CN112614189A (en) * 2020-12-09 2021-04-06 中国北方车辆研究所 Combined calibration method based on camera and 3D laser radar
WO2022179549A1 (en) * 2021-02-26 2022-09-01 上海商汤智能科技有限公司 Calibration method and apparatus, computer device, and storage medium
CN113406604A (en) * 2021-06-30 2021-09-17 山东新一代信息产业技术研究院有限公司 Device and method for calibrating positions of laser radar and camera
CN113341401A (en) * 2021-07-12 2021-09-03 广州小鹏自动驾驶科技有限公司 Vehicle-mounted laser radar calibration method and device, vehicle and storage medium
CN116071431A (en) * 2021-11-03 2023-05-05 北京三快在线科技有限公司 Calibration method and device, storage medium and electronic equipment
CN116449347A (en) * 2023-06-14 2023-07-18 蘑菇车联信息科技有限公司 Calibration method and device of roadside laser radar and electronic equipment
CN116449347B (en) * 2023-06-14 2023-10-03 蘑菇车联信息科技有限公司 Calibration method and device of roadside laser radar and electronic equipment
CN117784121A (en) * 2024-02-23 2024-03-29 四川天府新区北理工创新装备研究院 Combined calibration method and system for road side sensor and electronic equipment

Also Published As

Publication number Publication date
CN111383279B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
CN111383279B (en) External parameter calibration method and device and electronic equipment
CN112894832B (en) Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium
CN111179358B (en) Calibration method, device, equipment and storage medium
CN110146869B (en) Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium
CN110148185B (en) Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
US9972067B2 (en) System and method for upsampling of sparse point cloud for 3D registration
CN112270713A (en) Calibration method and device, storage medium and electronic device
CN109961468B (en) Volume measurement method and device based on binocular vision and storage medium
CN109801333B (en) Volume measurement method, device and system and computing equipment
CN109918977B (en) Method, device and equipment for determining idle parking space
CN111815716A (en) Parameter calibration method and related device
CN110988849B (en) Calibration method and device of radar system, electronic equipment and storage medium
CN111080662A (en) Lane line extraction method and device and computer equipment
EP2154650A1 (en) 3D time-of-flight camera system and position/orientation calibration method therefor
WO2021098448A1 (en) Sensor calibration method and device, storage medium, calibration system, and program product
WO2021037086A1 (en) Positioning method and apparatus
WO2022217988A1 (en) Sensor configuration scheme determination method and apparatus, computer device, storage medium, and program
CN108362205B (en) Space distance measuring method based on fringe projection
US11880993B2 (en) Image processing device, driving assistance system, image processing method, and program
CN113160328A (en) External reference calibration method, system, robot and storage medium
CN113034612A (en) Calibration device and method and depth camera
EP3782363B1 (en) Method for dynamic stereoscopic calibration
CN111709995A (en) Position calibration method between laser radar and camera
CN111382591B (en) Binocular camera ranging correction method and vehicle-mounted equipment
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20230721
Address after: Room 437, Floor 4, Building 3, No. 969, Wenyi West Road, Wuchang Subdistrict, Yuhang District, Hangzhou City, Zhejiang Province
Patentee after: Wuzhou Online E-Commerce (Beijing) Co.,Ltd.
Address before: Box 847, four, Grand Cayman capital, Cayman Islands, UK
Patentee before: ALIBABA GROUP HOLDING Ltd.