CN113759348B - Radar calibration method, device, equipment and storage medium - Google Patents

Radar calibration method, device, equipment and storage medium

Info

Publication number
CN113759348B
CN113759348B (application CN202110075819.4A)
Authority
CN
China
Prior art keywords
cloud data
point cloud
plane
target
auxiliary
Prior art date
Legal status
Active
Application number
CN202110075819.4A
Other languages
Chinese (zh)
Other versions
CN113759348A (en)
Inventor
林金表
徐卓然
许新玉
Current Assignee
Jingdong Kunpeng Jiangsu Technology Co Ltd
Original Assignee
Jingdong Kunpeng Jiangsu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jingdong Kunpeng Jiangsu Technology Co Ltd filed Critical Jingdong Kunpeng Jiangsu Technology Co Ltd
Priority to CN202110075819.4A
Publication of CN113759348A
Application granted
Publication of CN113759348B
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiment of the invention discloses a radar calibration method, a radar calibration device, radar calibration equipment and a storage medium. The method comprises the following steps: acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar, the main point cloud data and the auxiliary point cloud data respectively comprising plane point cloud data corresponding to three pairwise intersecting calibration planes; determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data; and, based on each plane equation, determining a target transformation matrix between the main radar and the auxiliary radar and determining calibration point cloud data based on the target transformation matrix. The embodiment of the invention solves the problem of the many limiting factors in the existing radar calibration process and widens the applicable scenes of radar calibration.

Description

Radar calibration method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of radars, in particular to a radar calibration method, a radar calibration device, radar calibration equipment and a storage medium.
Background
A laser radar (lidar) can help a robot accurately perceive the surrounding environment and is often mounted on different kinds of devices, such as drones or unmanned vehicles. However, because an unmanned vehicle is large, a single laser radar has a detection blind area no matter where it is installed on the vehicle, so an unmanned vehicle is often equipped with one main laser radar and several auxiliary radars so that their information complements each other.
Because each laser radar reports the detected point cloud coordinates in its own coordinate system, radar calibration is an indispensable step before the detection results of all the laser radars can be integrated. Radar calibration refers to unifying the point cloud coordinates reported by multiple laser radars into the same laser radar coordinate system, based on the conversion relationships between the different coordinate systems, so as to fuse the point cloud data. The common radar calibration method mainly relies on the point cloud data in the overlapping areas of the laser radars, under the condition that the point cloud shapes acquired by the laser radars are similar.
In the process of making the invention, the inventors found that the prior art has at least the following technical problem:
The existing radar calibration method has two limiting conditions: the point cloud shapes must be similar, and a sufficiently large overlapping area must exist. Therefore, for scenes that use several laser radars of different types, or scenes in which the laser radar overlapping area is small, the existing radar calibration method is not applicable, which further limits the applicable scenes of radar calibration.
Disclosure of Invention
The embodiment of the invention provides a radar calibration method, device, equipment and storage medium, which are used to solve the problem of the many limiting factors in existing laser radar calibration and to widen the applicable scenes of radar calibration.
In a first aspect, an embodiment of the present invention provides a radar calibration method, including:
acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three pairwise intersecting calibration planes;
Determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data;
Based on each plane equation, a target transformation matrix between the primary radar and the auxiliary radar is determined, and calibration point cloud data is determined based on the target transformation matrix.
In a second aspect, an embodiment of the present invention further provides a radar calibration device, including:
the point cloud data acquisition module is used for acquiring main point cloud data and auxiliary point cloud data respectively acquired by the main radar and the auxiliary radar; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three pairwise intersecting calibration planes;
the plane equation determining module is used for determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data;
and the point cloud data calibration module is used for determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation and determining calibration point cloud data based on the target conversion matrix.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processors;
A memory for storing one or more programs;
The one or more programs, when executed by the one or more processors, cause the one or more processors to implement any of the radar calibration methods described above.
In a fourth aspect, embodiments of the present invention also provide a storage medium containing computer executable instructions which, when executed by a computer processor, are used to perform any of the radar calibration methods referred to above.
The embodiments of the above invention have the following advantages or benefits:
According to the embodiment of the invention, the plane point cloud data corresponding to three pairwise intersecting calibration planes are acquired by the main radar and the auxiliary radar, the plane equation of each calibration plane is determined based on the plane point cloud data, and the target conversion matrix between the main radar and the auxiliary radar is determined based on the plane equations. This solves the problem of the many limiting factors in the radar calibration process in the prior art, places no limitation on the installation position relationship between the radars or on the radar types, and widens the applicable scenes of radar calibration.
Drawings
Fig. 1 is a flowchart of a radar calibration method according to an embodiment of the present invention.
Fig. 2 is a flowchart of a radar calibration method according to a second embodiment of the present invention.
Fig. 3 is a schematic diagram of coordinates of feature points of an intersecting line according to a second embodiment of the invention.
Fig. 4A is a flowchart of a method for determining a main feature set according to a second embodiment of the present invention.
Fig. 4B is a flowchart of a method for determining an assist feature set according to a second embodiment of the present invention.
Fig. 5 is a schematic diagram of a radar calibration device according to a third embodiment of the present invention.
Fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example 1
Fig. 1 is a flowchart of a radar calibration method according to an embodiment of the present invention, where the method may be applied to a radar calibration situation in a multi-radar navigation or positioning scenario, and the method may be performed by a radar calibration device, where the device may be implemented in software and/or hardware. The method specifically comprises the following steps:
S110, acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar.
In this embodiment, at least two radars may be set in a radar navigation or radar positioning scene, specifically, any one of the radars may be used as a main radar, and the other radars may be used as auxiliary radars.
In this embodiment, the main point cloud data and the auxiliary point cloud data respectively include plane point cloud data corresponding to three pairwise intersecting calibration planes. Specifically, the three calibration planes corresponding to the main radar and the three calibration planes corresponding to the auxiliary radar may be the same set of calibration planes or different sets of calibration planes. For example, the main radar acquires the plane point cloud data corresponding to calibration plane 1, calibration plane 2 and calibration plane 3 respectively to obtain the main point cloud data, while the auxiliary radar acquires the plane point cloud data corresponding to calibration plane 4, calibration plane 5 and calibration plane 6 respectively to obtain the auxiliary point cloud data; of course, the auxiliary radar may instead acquire the plane point cloud data corresponding to calibration plane 1, calibration plane 2 and calibration plane 4 respectively to obtain the auxiliary point cloud data, where calibration plane 1, calibration plane 2 and calibration plane 4 intersect pairwise.
In one embodiment, when the calibration planes corresponding to the main radar differ from the calibration planes corresponding to the auxiliary radar, the plane angles between the calibration planes corresponding to the main radar are the same as the plane angles between the calibration planes corresponding to the auxiliary radar. Specifically, the calibration planes corresponding to the main radar and the auxiliary radar may differ entirely, i.e. all three calibration planes are different, or differ partially, i.e. some calibration planes are different and some are the same. By way of example, if the plane angles between the calibration planes corresponding to the main radar are 90°, 90° and 30°, respectively, then the plane angles between the calibration planes corresponding to the auxiliary radar are also 90°, 90° and 30°, respectively.
In one embodiment, optionally, the three calibration planes are pairwise perpendicular. For example, the three calibration planes are the ground and two wall surfaces that are perpendicular to each other and to the ground; in that case the plane angles between the calibration planes are 90°, 90° and 90°, respectively.
In one embodiment, optionally, the point cloud data acquired by a radar is acquired according to the scanning mode of that radar. For example, when the main radar is a mechanical rotating laser radar, whose scanning mode is single-frame scanning, the single-frame point cloud data it acquires is taken as the main point cloud data. When the main radar is a rotating-mirror laser radar, whose scanning mode is non-repetitive scanning (meaning that the point cloud acquired in each frame differs even if the radar is stationary), multiple frames of point clouds acquired by the rotating-mirror laser radar are accumulated to obtain the main point cloud data.
In this example, the main point cloud data is denoted as P_0 and the auxiliary point cloud data is denoted as Q_0.
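For illustration only (this is not part of the patent text), the following Python sketch shows one way the raw point clouds P_0 and Q_0 could be assembled from radar frames, assuming each frame is an (N, 3) numpy array of x, y, z coordinates; the function names are placeholders.

```python
# Minimal sketch, not from the patent: assembling P_0 and Q_0 from radar frames.
import numpy as np

def single_frame_cloud(frame):
    """A mechanical rotating laser radar contributes one single-frame scan."""
    return np.asarray(frame, dtype=float)

def accumulated_cloud(frames):
    """A rotating-mirror laser radar with non-repetitive scanning contributes
    several frames accumulated while the radar is stationary."""
    return np.vstack([np.asarray(f, dtype=float) for f in frames])

# P0 = single_frame_cloud(main_radar_frame)        # main point cloud data P_0
# Q0 = accumulated_cloud(auxiliary_radar_frames)   # auxiliary point cloud data Q_0
```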
S120, determining plane equations corresponding to the plane point cloud data respectively based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data respectively.
In one embodiment, optionally, determining the plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data includes: performing coordinate conversion on the auxiliary point cloud data based on a preset conversion matrix to obtain pre-auxiliary point cloud data, and determining the plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data.
The preset transformation matrix may be determined based on a mounting pose of the main radar and a mounting pose of the auxiliary radar, wherein the mounting pose includes a mounting position coordinate and a central axis angle, for example.
In this embodiment, performing coordinate conversion on the auxiliary point cloud data based on the preset conversion matrix converts the auxiliary point cloud data into a coordinate system close to that of the main radar, so that the main point cloud data and the pre-auxiliary point cloud data lie in similar coordinate systems; this ensures that the subsequent parameters can be set consistently and reduces the computational difficulty of the algorithm.
Exemplarily, the pre-auxiliary point cloud data is denoted as Q_1.
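As a non-authoritative illustration, the preset coordinate conversion might be implemented as follows, assuming the preset conversion matrix T_1 is expressed as a 4 × 4 homogeneous transform derived from the mounting poses and the point cloud is an (N, 3) array; both assumptions go beyond the text above.

```python
# Sketch under stated assumptions: apply the preset conversion matrix T_1 to Q_0.
import numpy as np

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transform to an (N, 3) point cloud."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])   # (N, 4)
    return (homogeneous @ T.T)[:, :3]

# Q1 = apply_transform(T1, Q0)   # pre-auxiliary point cloud data Q_1
```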
In one embodiment, optionally, determining a plane equation corresponding to each of the planar point cloud data based on the planar point cloud data corresponding to each of the primary point cloud data and the pre-auxiliary point cloud data, includes: respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining a first target plane equation corresponding to the first calibration plane based on first plane point cloud data corresponding to the target point cloud data; determining a second target plane equation corresponding to the second calibration plane and a third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; wherein the reference planar point cloud data includes second planar point cloud data and third planar point cloud data.
The first plane point cloud data is specifically the plane point cloud data corresponding to the first calibration plane. In one embodiment, optionally, the method further comprises: selecting, according to a preset selection quantity, the point cloud data with the smallest vertical coordinate in the target point cloud data, and determining the average height of the selected point cloud data; and screening the target point cloud data based on the average height and a preset height difference threshold to obtain the first plane point cloud data corresponding to the target point cloud data.
In this embodiment, the first calibration plane is the ground. For example, the preset selection quantity may be 1000 or 2000, and the preset height difference threshold may be 0.05. Assume that the 1000 point cloud points with the smallest vertical coordinate in the target point cloud data are selected and their average height h is calculated; the point cloud data whose vertical coordinate does not exceed h + α is then taken as the first plane point cloud data, where α denotes the preset height difference threshold. In this example, the first plane point cloud data corresponding to the main point cloud data is denoted as P_G, and the first plane point cloud data corresponding to the pre-auxiliary point cloud data is denoted as Q_G.
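A minimal sketch of this first-plane screening, assuming the vertical coordinate is the z component and using the example values above (1000 selected points, α = 0.05); the function name and parameter defaults are illustrative.

```python
# Sketch of the ground (first calibration plane) screening described above.
import numpy as np

def extract_first_plane(points, n_lowest=1000, alpha=0.05):
    z = points[:, 2]                                  # vertical coordinate (assumed z)
    h = np.sort(z)[:min(n_lowest, len(z))].mean()     # average height of the lowest points
    return points[z <= h + alpha]                     # first plane point cloud data

# P_G = extract_first_plane(P0)
# Q_G = extract_first_plane(Q1)
```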
The algorithm for determining the first target plane equation based on the first plane point cloud data may be, for example, a least squares method or a random sample consensus (RANSAC) algorithm, among others.
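To make the fitting step concrete, the following sketch shows the least-squares option (RANSAC would serve equally well and is not shown), returning a plane in the form n·p + d = 0 with a unit normal n; this representation is an assumption carried through the later sketches.

```python
# Least-squares plane fit on the first plane point cloud data (sketch only).
import numpy as np

def fit_plane(points):
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, -normal @ centroid                 # plane: normal . p + d = 0

# G_P = fit_plane(P_G)   # first main plane equation
# G_Q = fit_plane(Q_G)   # first auxiliary plane equation
```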
In one embodiment, optionally, the method further comprises: determining reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data; the first target plane equation is a first main plane equation when the target point cloud data is main point cloud data, and is a first auxiliary plane equation when the target point cloud data is pre-auxiliary point cloud data.
Specifically, the target point cloud data includes the plane point cloud data corresponding to the three calibration planes; the point cloud data corresponding to the first target plane equation in the target point cloud data, i.e. the point cloud data corresponding to the first calibration plane, is deleted to obtain the reference plane point cloud data. In this example, the reference plane point cloud data corresponding to the main point cloud data is denoted as P_AB, and the reference plane point cloud data corresponding to the pre-auxiliary point cloud data is denoted as Q_AB.
In one embodiment, optionally, determining the second target plane equation corresponding to the second calibration plane and the third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data includes: determining a second target plane equation corresponding to the second calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; determining third plane point cloud data corresponding to the target point cloud data based on the reference plane point cloud data and the second target plane equation; based on the third plane point cloud data, a third target plane equation corresponding to the third calibration plane is determined.
Wherein the algorithm for determining the second target plane equation based on the reference plane point cloud data may specifically be a random sample consensus algorithm.
Specifically, the reference plane point cloud data includes plane point cloud data corresponding to two calibration planes, and point cloud data corresponding to a second target plane equation in the reference plane point cloud data is deleted, namely point cloud data corresponding to a second calibration plane in the reference plane point cloud data is deleted, so as to obtain third plane point cloud data.
The algorithm for determining the third target plane equation based on the third plane point cloud data may be, for example, a least squares or random sample consensus algorithm.
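The sequential determination of the second and third target plane equations can be sketched as follows, reusing fit_plane from the sketch above; the inlier-distance tolerance used to delete the points that belong to an already-fitted plane is an assumed parameter, and RANSAC could replace the least-squares fit.

```python
# Sketch: delete the points lying on a fitted plane, then fit the next plane.
import numpy as np

def remove_plane_inliers(points, plane, tol=0.05):
    normal, d = plane
    return points[np.abs(points @ normal + d) > tol]  # keep points off the plane

def fit_reference_planes(reference_points):
    """From P_AB / Q_AB, obtain the second and third target plane equations."""
    plane2 = fit_plane(reference_points)              # second target plane equation
    third_points = remove_plane_inliers(reference_points, plane2)
    plane3 = fit_plane(third_points)                  # third target plane equation
    return plane2, plane3

# A_P, B_P = fit_reference_planes(P_AB)
# A_Q, B_Q = fit_reference_planes(Q_AB)
```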
Exemplarily, the first main plane equation corresponding to the main point cloud data is denoted as G_P, the second main plane equation as A_P and the third main plane equation as B_P; the first auxiliary plane equation corresponding to the pre-auxiliary point cloud data is denoted as G_Q, the second auxiliary plane equation as A_Q and the third auxiliary plane equation as B_Q.
S130, determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining the calibration point cloud data based on the target conversion matrix.
In one embodiment, optionally, if coordinate conversion is performed on auxiliary point cloud data based on a preset conversion matrix to obtain pre-auxiliary point cloud data, and plane equations corresponding to the respective plane point cloud data are determined based on plane point cloud data corresponding to the main point cloud data and the pre-auxiliary point cloud data, determining a target conversion matrix between the main radar and the auxiliary radar based on the respective plane equations includes: and determining a reference conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining a target conversion matrix between the main radar and the auxiliary radar based on a preset conversion matrix and the reference conversion matrix.
For example, the preset transformation matrix is denoted as T_1 and the reference transformation matrix as T_2; the target transformation matrix T then satisfies the formula: T = T_1 · T_2.
In one embodiment, optionally, determining a reference transformation matrix between the primary radar and the secondary radar based on the respective plane equations includes: determining characteristic point coordinates corresponding to the main point cloud data and the pre-auxiliary point cloud data respectively based on three plane equations corresponding to the main point cloud data and the pre-auxiliary point cloud data respectively; the characteristic point coordinates comprise plane intersection point coordinates and intersection line characteristic point coordinates, wherein the plane intersection point coordinates are used for representing intersection point coordinates of three calibration planes, and the intersection line characteristic point coordinates are used for representing characteristic point coordinates on an intersection line between the three calibration planes; and performing pose estimation operation on the main feature set generated based on the feature point coordinates corresponding to the main point cloud data and the auxiliary feature set generated based on the feature point coordinates corresponding to the pre-auxiliary point cloud data to obtain a reference conversion matrix between the main radar and the auxiliary radar.
The characteristic point coordinates of the intersecting line comprise characteristic point coordinates AB on an intersecting line between a calibration plane A and a calibration plane B, characteristic point coordinates AC on an intersecting line between the calibration plane A and a calibration plane C and characteristic point coordinates BC on an intersecting line between the calibration plane B and the calibration plane C.
A specific implementation of determining the feature point coordinates based on the plane equation is explained in detail in the following examples.
In one embodiment, the algorithm for pose estimation optionally includes a singular value decomposition (Singular Value Decomposition, SVD) algorithm.
Specifically, assume that the main feature set is X = {x_1, x_2, ..., x_n} and the auxiliary feature set is Y = {y_1, y_2, ..., y_n}. The reference transformation matrix to be solved must make the distance between the feature point coordinates in the main feature set, after coordinate transformation, and the corresponding feature point coordinates in the auxiliary feature set as small as possible, where the distance satisfies the formula: (R*, t*) = argmin_{R, t} Σ_i ‖y_i − (R·x_i + t)‖².
Where R represents a rotation matrix and t represents a translation matrix.
Specifically, the mean value x_0 of the main feature set X and the mean value y_0 of the auxiliary feature set Y are calculated; the mean value x_0 is subtracted from each feature point coordinate in the main feature set X to obtain x_i', and the mean value y_0 is subtracted from each feature point coordinate in the auxiliary feature set Y to obtain y_i'. A matrix H is then obtained, where the matrix H satisfies the formula: H = Σ_{i=1..n} x_i'·(y_i')^T.
The matrix H is decomposed by the singular value decomposition algorithm as H = U·S·V^T, where U and V are unit orthogonal matrices. The optimal rotation matrix R* that gives the shortest distance satisfies the formula R* = V·U^T, and the optimal translation matrix t* satisfies the formula: t* = y_0 − R*·x_0.
Specifically, the reference transformation matrix T_2 satisfies the formula: T_2 = [ R*  t* ; 0  1 ], i.e. the homogeneous matrix assembled from the optimal rotation matrix R* and the optimal translation matrix t*.
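A minimal sketch of this SVD-based pose estimation, assuming the main feature set X and the auxiliary feature set Y are given as equally ordered lists of 3D points; the determinant check that guards against a reflection is an added safeguard not stated above.

```python
# Sketch of the pose estimation: recover R*, t* and assemble the reference matrix T_2.
import numpy as np

def estimate_reference_transform(X, Y):
    X = np.asarray(X, dtype=float)          # main feature set, shape (n, 3)
    Y = np.asarray(Y, dtype=float)          # auxiliary feature set, shape (n, 3)
    x0, y0 = X.mean(axis=0), Y.mean(axis=0)
    H = (X - x0).T @ (Y - y0)               # H = sum_i x_i' (y_i')^T
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T                          # optimal rotation R* = V U^T
    if np.linalg.det(R) < 0:                # added safeguard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = y0 - R @ x0                         # optimal translation t* = y0 - R* x0
    T2 = np.eye(4)
    T2[:3, :3], T2[:3, 3] = R, t
    return T2

# T2 = estimate_reference_transform(X, Y)
# T = T1 @ T2   # target transformation matrix, following the composition given above
```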
Specifically, determining the calibration point cloud data based on the target transformation matrix includes: performing coordinate transformation on the main point cloud data or the auxiliary point cloud data based on the target transformation matrix to obtain calibration main point cloud data or calibration auxiliary point cloud data. The calibration main point cloud data then lies in the same coordinate system as the auxiliary point cloud data, or the calibration auxiliary point cloud data lies in the same coordinate system as the main point cloud data.
According to the technical scheme of this embodiment, the plane point cloud data corresponding to three pairwise intersecting calibration planes are acquired by the main radar and the auxiliary radar, the plane equations of the calibration planes are determined based on the plane point cloud data, and the target conversion matrix between the main radar and the auxiliary radar is determined based on the plane equations. This solves the problem of the many limiting factors in the radar calibration process in the prior art, places no limitation on the installation position relationship between the radars or on the radar types, and widens the applicable scenes of radar calibration.
Example two
Fig. 2 is a flowchart of a radar calibration method according to a second embodiment of the present invention, and the technical solution of this embodiment is further refinement based on the foregoing embodiment. Optionally, determining, based on the target point cloud data and the first target plane equation corresponding to the target point cloud data, reference plane point cloud data corresponding to the target point cloud data includes: determining parameter data between target point cloud data and a main radar central point, and taking the point cloud data meeting a preset parameter range in the target point cloud data as reference target point cloud data; wherein the parameter data comprises distance data and/or offset angle data; the method includes determining reference plane point cloud data corresponding to target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
The specific implementation steps of the embodiment include:
S210, acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar.
S220, taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data respectively, and determining a first target plane equation corresponding to a first calibration plane based on first plane point cloud data corresponding to the target point cloud data.
S230, determining parameter data between the target point cloud data and a main radar center point, and taking the point cloud data meeting a preset parameter range in the target point cloud data as reference target point cloud data.
In this embodiment, the parameter data includes distance data and/or offset angle data. Specifically, the offset angle data is an X-axis deflection angle, a Y-axis deflection angle or a Z-axis deflection angle of the target point cloud data relative to the main radar center point.
For example, assuming that the coordinates of a point in the point cloud data are (x, y, z) and the coordinates of the main radar center point are (0, 0, 0), the distance data L satisfies the formula L = √(x² + y² + z²); the X-axis deflection angle θ_x, the Y-axis deflection angle θ_y and the Z-axis deflection angle θ_z are the deflection angles of the point relative to the main radar center point about the X, Y and Z axes, respectively (for example, the Z-axis deflection angle, i.e. the horizontal yaw angle, satisfies θ_z = arctan(y / x)).
For example, the preset distance range may be [0, 5 m] and the preset angle range may be [0, 90°]. The preset distance range and the preset angle range are related to the positional relationship between the main radar and the three calibration planes, and their specific values are not limited here.
Exemplarily, the reference main point cloud data is denoted as P_2 and the reference auxiliary point cloud data is denoted as Q_2.
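A hedged sketch of this screening, assuming the parameter data are the distance to the main radar center and the Z-axis deflection angle (horizontal yaw, computed here with arctan2), with the example ranges given above; the function name and exact angle convention are assumptions.

```python
# Sketch: keep only the points whose distance and yaw fall in the preset ranges.
import numpy as np

def select_reference_points(points, max_dist=5.0, yaw_range=(0.0, np.pi / 2)):
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    dist = np.sqrt(x**2 + y**2 + z**2)      # distance to the main radar center (0, 0, 0)
    yaw = np.arctan2(y, x)                  # Z-axis deflection angle (horizontal yaw)
    mask = (dist <= max_dist) & (yaw >= yaw_range[0]) & (yaw <= yaw_range[1])
    return points[mask]

# P2 = select_reference_points(P0)   # reference main point cloud data
# Q2 = select_reference_points(Q1)   # reference auxiliary point cloud data
```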
S240, determining reference plane point cloud data corresponding to the target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
In one embodiment, optionally, the offset angle data is the Z-axis deflection angle when the first calibration plane is the XOY plane, the Y-axis deflection angle when the first calibration plane is the XOZ plane, and the X-axis deflection angle when the first calibration plane is the YOZ plane. Specifically, when the first calibration plane is the ground, the offset angle data is the Z-axis deflection angle. The Z-axis deflection angle may also be referred to as the horizontal yaw angle.
In this embodiment, the reference target point cloud data theoretically includes only the plane point cloud data corresponding to the second calibration plane and the plane point cloud data corresponding to the third calibration plane; affected by the screening precision, however, it may also include a small amount of plane point cloud data corresponding to the first calibration plane. Specifically, the point cloud data corresponding to the first target plane equation in the reference target point cloud data, i.e. the small amount of point cloud data corresponding to the first calibration plane, is deleted to obtain the reference plane point cloud data.
S250, determining a second target plane equation corresponding to the second calibration plane and a third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data.
S260, determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining the calibration point cloud data based on the target conversion matrix.
On the basis of the above embodiment, optionally, determining the feature point coordinates respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data based on the three plane equations respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data includes: taking the main point cloud data and the pre-auxiliary point cloud data respectively as target point cloud data; determining three target straight line equations corresponding to the three calibration planes based on the three plane equations corresponding to the target point cloud data, and determining the plane intersection point coordinates corresponding to the target point cloud data based on the target straight line equations; and determining the intersection line feature point coordinates corresponding to each target straight line equation based on the plane intersection point coordinates and the preset distance standard corresponding to each target straight line equation. The preset distance standard includes: the distance between the intersection line feature point coordinates and the plane intersection point coordinates equals a preset standard distance, and, of the two candidate feature point coordinates that satisfy the preset standard distance, the one whose distance to the main radar center point is the larger (or the smaller) is taken as the intersection line feature point coordinates.
In this embodiment, the main point cloud data is taken as an example for explanation, and correspondingly, the specific embodiment corresponding to the pre-auxiliary point cloud data is similar to the specific embodiment corresponding to the main point cloud data.
Specifically, based on the first main plane equation G_P, the second main plane equation A_P and the third main plane equation B_P corresponding to the main point cloud data, the main straight line equation GA_P between the first calibration plane and the second calibration plane, the main straight line equation GB_P between the first calibration plane and the third calibration plane, and the main straight line equation AB_P between the second calibration plane and the third calibration plane are determined; further, the plane intersection point coordinates corresponding to the main point cloud data are determined based on the main straight line equation GA_P, the main straight line equation GB_P and the main straight line equation AB_P.
The intersection line feature points are feature points lying on the main straight line equations. Specifically, the preset standard distances in the preset distance standards corresponding to different main straight line equations may be the same or different, but the distance relationship in the preset distance standards is the same; that is, the preset distance standards corresponding to the different main straight line equations all take the feature point with the larger distance (or all take the feature point with the smaller distance) as the intersection line feature point coordinates.
In one embodiment, optionally, the preset distance standard corresponding to the main straight line equation GA_P is the same as that corresponding to the main straight line equation GB_P. Specifically, the preset standard distance in the preset distance standard corresponding to the main straight line equation GA_P is the same as the preset standard distance in the preset distance standard corresponding to the main straight line equation GB_P.
Fig. 3 is a schematic diagram of intersection line feature point coordinates according to the second embodiment of the invention. Taking the main straight line equation AB_P as an example, let the preset standard distance in its preset distance standard be β; two feature points g_P1 and g_P2 whose distance from the plane intersection point coordinates is β are selected in the two directions of the main straight line equation AB_P, the distances between the two feature point coordinates and the main radar center point are calculated respectively, and the feature point with the larger distance (or the feature point with the smaller distance) is taken as the intersection line feature point. For example, β may be 0.5. Similarly, taking the main straight line equation GA_P as an example, let the preset standard distance in its preset distance standard be γ; two feature points b_P1 and b_P2 whose distance from the plane intersection point coordinates is γ are selected in the two directions of the main straight line equation GA_P, the distances between the two feature point coordinates and the main radar center point are calculated respectively, and the feature point with the larger distance (or the feature point with the smaller distance) is taken as the intersection line feature point. For example, γ may be 1.0.
Specifically, the intersection line feature points corresponding to the main straight line equation AB_P, the main straight line equation GA_P and the main straight line equation GB_P are g_P1, b_P1 and a_P1, respectively, or the intersection line feature points corresponding to the main straight line equation AB_P, the main straight line equation GA_P and the main straight line equation GB_P are g_P2, b_P2 and a_P2, respectively.
Exemplarily, the plane intersection point coordinates corresponding to the main point cloud data are denoted as o_P, the intersection line feature point coordinates corresponding to the main straight line equation AB_P as g_P, those corresponding to the main straight line equation GA_P as b_P, and those corresponding to the main straight line equation GB_P as a_P; the plane intersection point coordinates corresponding to the pre-auxiliary point cloud data are denoted as o_Q, the intersection line feature point coordinates corresponding to the auxiliary straight line equation AB_Q as g_Q, those corresponding to the auxiliary straight line equation GA_Q as b_Q, and those corresponding to the auxiliary straight line equation GB_Q as a_Q.
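For illustration, the plane intersection point and the intersection line feature points can be computed from three fitted planes as follows, using the (n, d) plane form from the earlier sketches, treating the main radar center as the origin, and applying the "farther (or nearer) candidate" rule described above; the function names and the β, γ values are only the examples given.

```python
# Sketch: corner point of the three planes and feature points on the intersection lines.
import numpy as np

def plane_intersection_point(planes):
    N = np.array([p[0] for p in planes])        # stacked unit normals, shape (3, 3)
    d = np.array([p[1] for p in planes])
    return np.linalg.solve(N, -d)               # point lying on all three planes

def intersection_line_feature(plane_a, plane_b, corner, standard_dist, farther=True):
    direction = np.cross(plane_a[0], plane_b[0])
    direction /= np.linalg.norm(direction)      # direction of the intersection line
    cand1 = corner + standard_dist * direction
    cand2 = corner - standard_dist * direction
    pick_first = np.linalg.norm(cand1) > np.linalg.norm(cand2)
    if not farther:
        pick_first = not pick_first
    return cand1 if pick_first else cand2

# o_P = plane_intersection_point([G_P, A_P, B_P])
# g_P = intersection_line_feature(A_P, B_P, o_P, 0.5)   # on line AB_P, beta = 0.5
# b_P = intersection_line_feature(G_P, A_P, o_P, 1.0)   # on line GA_P, gamma = 1.0
# a_P = intersection_line_feature(G_P, B_P, o_P, 1.0)   # on line GB_P, gamma = 1.0
```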
On the basis of the above embodiment, optionally, the method further includes: respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining offset angle data corresponding to each intersection line characteristic point coordinate based on three intersection line characteristic point coordinates corresponding to the target point cloud data; ordering the feature point coordinates of each intersecting line based on the offset angle data to obtain a target feature set; the target feature set is a main feature set when the target point cloud data is main point cloud data, and is an auxiliary feature set when the target point cloud data is pre-auxiliary point cloud data.
Specifically, the offset angle data between each intersection line feature point coordinate and the main radar center point is determined. In one embodiment, optionally, the offset angle data is the Z-axis deflection angle when the first calibration plane is the XOY plane, the Y-axis deflection angle when the first calibration plane is the XOZ plane, and the X-axis deflection angle when the first calibration plane is the YOZ plane.
Specifically, the intersection line feature point coordinates are sorted based on the offset angle data; the sorting order may be from large to small or from small to large. The advantage of doing so is as follows. The order in which the second target plane equation and the third target plane equation are determined is random. For example, the reference plane point cloud data includes the plane point cloud data corresponding to calibration plane A and the plane point cloud data corresponding to calibration plane B; the second main plane equation determined from the reference plane point cloud data corresponding to the main point cloud data may be the plane equation of calibration plane A or of calibration plane B, and likewise the second auxiliary plane equation determined from the reference plane point cloud data corresponding to the pre-auxiliary point cloud data may be the plane equation of calibration plane A or of calibration plane B. When the main feature set and the auxiliary feature set are generated, b_P in the main feature set might therefore be paired with a_Q in the auxiliary feature set, which would cause a significant error in the reference conversion matrix obtained by the pose estimation operation. Sorting the intersection line feature point coordinates based on the offset angle data ensures that corresponding intersection line feature point coordinates occupy the same positions in the main feature set and the auxiliary feature set, which in turn ensures the accuracy of the reference conversion matrix subsequently obtained by the pose estimation operation.
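A minimal sketch of this ordering step, assuming the first calibration plane is the ground so that the Z-axis deflection angle (horizontal yaw) is the sorting key; ascending order is an arbitrary but consistent choice, and the plane intersection point is simply placed first.

```python
# Sketch: order the intersection line feature points identically in both feature sets.
import numpy as np

def ordered_feature_set(corner, line_features):
    yaw = lambda p: np.arctan2(p[1], p[0])      # Z-axis deflection angle of a point
    return [corner] + sorted(line_features, key=yaw)

# X = ordered_feature_set(o_P, [g_P, a_P, b_P])   # main feature set
# Y = ordered_feature_set(o_Q, [g_Q, a_Q, b_Q])   # auxiliary feature set
```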
Fig. 4A is a flowchart of a method for determining a main feature set according to a second embodiment of the present invention. Specifically, the main point cloud data P_0 acquired by the main radar is obtained; the first plane point cloud data P_G corresponding to the first calibration plane in the main point cloud data P_0 is determined based on the average height and the preset height difference threshold, and plane fitting is performed on the first plane point cloud data P_G to obtain the first main plane equation G_P. The main point cloud data P_0 is screened based on the preset parameter range to obtain the reference main point cloud data P_2; the point cloud data corresponding to the first main plane equation G_P in the reference main point cloud data P_2 is deleted to obtain the reference plane point cloud data P_AB, and plane fitting is performed on the reference plane point cloud data P_AB to obtain the second main plane equation A_P. The point cloud data corresponding to the second main plane equation A_P in the reference plane point cloud data P_AB is deleted to obtain the third plane point cloud data, and plane fitting is performed on the third plane point cloud data to obtain the third main plane equation B_P. Based on the first main plane equation G_P, the second main plane equation A_P and the third main plane equation B_P, the main straight line equation GA_P, the main straight line equation GB_P and the main straight line equation AB_P are determined; based on the main straight line equation GA_P, the main straight line equation GB_P and the main straight line equation AB_P, the intersection line feature point coordinates a_P, the intersection line feature point coordinates b_P, the intersection line feature point coordinates g_P and the plane intersection point coordinates o_P are determined, and the intersection line feature point coordinates are sorted based on the offset angle data corresponding to each intersection line feature point coordinate to obtain the main feature set, for example the main feature set X = [o_P, g_P, a_P, b_P].
Fig. 4B is a flowchart of a method for determining an auxiliary feature set according to a second embodiment of the present invention. Specifically, the auxiliary point cloud data Q_0 acquired by the auxiliary radar is obtained, and coordinate conversion is performed on the auxiliary point cloud data Q_0 based on the preset conversion matrix to obtain the pre-auxiliary point cloud data Q_1. The first plane point cloud data Q_G corresponding to the first calibration plane in the pre-auxiliary point cloud data Q_1 is determined based on the average height and the preset height difference threshold, and plane fitting is performed on the first plane point cloud data Q_G to obtain the first auxiliary plane equation G_Q. The pre-auxiliary point cloud data Q_1 is screened based on the preset parameter range to obtain the reference auxiliary point cloud data Q_2; the point cloud data corresponding to the first auxiliary plane equation G_Q in the reference auxiliary point cloud data Q_2 is deleted to obtain the reference plane point cloud data Q_AB, and plane fitting is performed on the reference plane point cloud data Q_AB to obtain the second auxiliary plane equation A_Q. The point cloud data corresponding to the second auxiliary plane equation A_Q in the reference plane point cloud data Q_AB is deleted to obtain the third plane point cloud data, and plane fitting is performed on the third plane point cloud data to obtain the third auxiliary plane equation B_Q. Based on the first auxiliary plane equation G_Q, the second auxiliary plane equation A_Q and the third auxiliary plane equation B_Q, the auxiliary straight line equation GA_Q, the auxiliary straight line equation GB_Q and the auxiliary straight line equation AB_Q are determined; based on the auxiliary straight line equation GA_Q, the auxiliary straight line equation GB_Q and the auxiliary straight line equation AB_Q, the intersection line feature point coordinates a_Q, the intersection line feature point coordinates b_Q, the intersection line feature point coordinates g_Q and the plane intersection point coordinates o_Q are determined, and the intersection line feature point coordinates are sorted based on the offset angle data corresponding to each intersection line feature point coordinate to obtain the auxiliary feature set, for example the auxiliary feature set Y = [o_Q, g_Q, a_Q, b_Q].
According to the technical scheme of this embodiment, the parameter data between the target point cloud data and the main radar center point is determined, and the point cloud data in the target point cloud data that falls within the preset parameter range is taken as the reference target point cloud data; the reference plane point cloud data is then determined based on the reference target point cloud data and the first target plane equation corresponding to the target point cloud data. In this way the plane point cloud data corresponding to the first calibration plane is screened out of the target point cloud data by two different screening methods in succession, which solves the problem of a poor screening effect for the reference plane point cloud data, makes the finally obtained reference plane point cloud data contain, as far as possible, only the second plane point cloud data and the third plane point cloud data, improves the fitting effect of the subsequent plane equations, and further improves the accuracy of the radar calibration result.
Example III
Fig. 5 is a schematic diagram of a radar calibration device according to a third embodiment of the present invention. The embodiment can be suitable for radar calibration under the condition of multi-radar navigation or positioning, and the device can be realized in a software and/or hardware mode. The radar calibration device comprises: the system comprises a point cloud data acquisition module 310, a plane equation determination module 320 and a point cloud data calibration module 330.
The point cloud data acquisition module 310 is configured to acquire main point cloud data and auxiliary point cloud data acquired by the main radar and the auxiliary radar respectively; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three pairwise intersecting calibration planes;
A plane equation determining module 320, configured to determine plane equations corresponding to the respective plane point cloud data based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data, respectively;
the point cloud data calibration module 330 is configured to determine a target transformation matrix between the primary radar and the auxiliary radar based on each plane equation, and determine calibration point cloud data based on the target transformation matrix.
According to the technical scheme of this embodiment, the plane point cloud data corresponding to three pairwise intersecting calibration planes are acquired by the main radar and the auxiliary radar, the plane equations of the calibration planes are determined based on the plane point cloud data, and the target conversion matrix between the main radar and the auxiliary radar is determined based on the plane equations. This solves the problem of the many limiting factors in the radar calibration process in the prior art, places no limitation on the installation position relationship between the radars or on the radar types, and widens the applicable scenes of radar calibration.
Based on the above technical solution, optionally, the plane equation determining module 320 includes:
the plane equation determining unit is used for carrying out coordinate conversion on the auxiliary point cloud data based on a preset conversion matrix to obtain pre-auxiliary point cloud data, and determining plane equations respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data;
Accordingly, the point cloud data calibration module 330 includes:
And the reference conversion matrix determining unit is used for determining a reference conversion matrix between the main radar and the auxiliary radar based on each plane equation and determining a target conversion matrix between the main radar and the auxiliary radar based on a preset conversion matrix and the reference conversion matrix.
On the basis of the above technical solution, optionally, the plane equation determining unit includes:
The first target plane equation determining subunit is used for respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining a first target plane equation corresponding to the first calibration plane based on first plane point cloud data corresponding to the target point cloud data;
a second target plane equation determining subunit, configured to determine a second target plane equation corresponding to the second calibration plane and a third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; wherein the reference planar point cloud data includes second planar point cloud data and third planar point cloud data.
On the basis of the above technical solution, optionally, the apparatus further includes:
The first plane point cloud data determining module is used for selecting point cloud data with the minimum ordinate in the target point cloud data based on the preset selection quantity, and determining the average height corresponding to the selected point cloud data; and screening the target point cloud data based on the average height and a preset height difference threshold value to obtain first plane point cloud data corresponding to the target point cloud data.
On the basis of the above technical solution, optionally, the apparatus further includes:
the reference plane point cloud data determining module is used for determining reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data; the first target plane equation is a first main plane equation when the target point cloud data is main point cloud data, and is a first auxiliary plane equation when the target point cloud data is pre-auxiliary point cloud data.
On the basis of the above technical solution, optionally, the reference plane point cloud data determining module is specifically configured to:
determining parameter data between target point cloud data and a main radar central point, and taking the point cloud data meeting a preset parameter range in the target point cloud data as reference target point cloud data; wherein the parameter data comprises distance data and/or offset angle data;
The method includes determining reference plane point cloud data corresponding to target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
On the basis of the above technical solution, optionally, the second objective plane equation determining subunit is specifically configured to:
determining a second target plane equation corresponding to the second calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; determining third plane point cloud data corresponding to the target point cloud data based on the reference plane point cloud data and the second target plane equation; based on the third plane point cloud data, a third target plane equation corresponding to the third calibration plane is determined.
On the basis of the above technical solution, optionally, the reference transformation matrix determining unit includes:
The characteristic point coordinate determining subunit is used for determining characteristic point coordinates corresponding to the main point cloud data and the pre-auxiliary point cloud data respectively based on three plane equations corresponding to the main point cloud data and the pre-auxiliary point cloud data respectively; the characteristic point coordinates comprise plane intersection point coordinates and intersection line characteristic point coordinates, wherein the plane intersection point coordinates are used for representing intersection point coordinates of three calibration planes, and the intersection line characteristic point coordinates are used for representing characteristic point coordinates on an intersection line between the three calibration planes;
And the reference transformation matrix determining subunit is used for performing pose estimation operation on the main feature set generated based on the feature point coordinates corresponding to the main point cloud data and the auxiliary feature set generated based on the feature point coordinates corresponding to the pre-auxiliary point cloud data to obtain a reference transformation matrix between the main radar and the auxiliary radar.
On the basis of the above technical solution, optionally, the feature point coordinate determining subunit is specifically configured to:
respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
Determining three target straight-line equations corresponding to three calibration planes based on three plane equations corresponding to the target point cloud data, and determining plane intersection point coordinates corresponding to the target point cloud data based on each target straight-line equation;
Based on the plane intersection point coordinates and the preset distance standard respectively corresponding to each target straight line equation, respectively determining the intersection line feature point coordinates corresponding to each target straight line equation; the preset distance standard includes: the distance between the intersection line feature point coordinates and the plane intersection point coordinates equals a preset standard distance, and, of the two candidate feature point coordinates that satisfy the preset standard distance, the one whose distance to the main radar center point is the larger (or the smaller) is taken as the intersection line feature point coordinates.
On the basis of the above technical solution, optionally, the apparatus further includes:
The intersection line characteristic point coordinate ordering module is used for taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data respectively; determining offset angle data corresponding to each intersection line characteristic point coordinate based on three intersection line characteristic point coordinates corresponding to the target point cloud data; ordering the feature point coordinates of each intersecting line based on the offset angle data to obtain a target feature set; the target feature set is a main feature set when the target point cloud data is main point cloud data, and is an auxiliary feature set when the target point cloud data is pre-auxiliary point cloud data.
The radar calibration device provided by the embodiment of the invention can execute the radar calibration method provided by any embodiment of the invention, and has the corresponding functions and beneficial effects of the executed method.
It should be noted that the units and modules included in the above embodiment of the radar calibration device are divided only according to functional logic; other divisions are possible as long as the corresponding functions can be implemented. In addition, the specific names of the functional units are only used to distinguish them from one another and do not limit the protection scope of the present invention.
Example IV
Fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. The electronic device provides services for implementing the radar calibration method of the above embodiments, and the radar calibration device of the above embodiment may be configured on it. Fig. 6 shows a block diagram of an exemplary electronic device 12 suitable for implementing embodiments of the present invention. The electronic device 12 shown in Fig. 6 is merely an example and should not impose any limitation on the functionality or scope of use of the embodiments of the present invention.
As shown in Fig. 6, the electronic device 12 takes the form of a general-purpose computing device. Components of the electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that connects the various system components (including the system memory 28 and the processing unit 16).
Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
The electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media accessible by the electronic device 12, including volatile and non-volatile media, and removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, the storage system 34 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 6, commonly referred to as a "hard disk drive"). Although not shown in Fig. 6, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be connected to the bus 18 through one or more data medium interfaces. The system memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, the memory 28. Such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment. The program modules 42 generally perform the functions and/or methods of the embodiments described herein.
The electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), one or more devices that enable a user to interact with the electronic device 12, and/or any devices (e.g., network card, modem, etc.) that enable the electronic device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks, such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet, through a network adapter 20. As shown in Fig. 6, the network adapter 20 communicates with other modules of the electronic device 12 over the bus 18. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 12, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and performs data processing by running the programs stored in the system memory 28, for example, implementing the radar calibration method provided by the embodiments of the present invention.
The above electronic device solves the problem of numerous limiting factors in existing radar calibration processes; it imposes no restrictions on the installation position relationship between the radars or on the radar types, thereby widening the applicable scenarios of radar calibration.
Example V
A fifth embodiment of the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to perform a radar calibration method, the method comprising:
Acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three intersected calibration planes;
Determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data;
Determining a target transformation matrix between the main radar and the auxiliary radar based on each plane equation, and determining calibration point cloud data based on the target transformation matrix.
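For illustration only, the following non-limiting sketch strings the above steps together. It assumes the helper functions fit_plane_ransac, feature_points, order_by_offset_angle, and rigid_transform from the earlier sketches, and it simplifies the selection of reference plane point cloud data to the points outside the first plane's inliers (the embodiments above additionally filter by distance and/or offset angle parameters).

```python
import numpy as np

def calibrate(main_cloud, aux_cloud, preset_matrix, radar_center):
    """Illustrative end-to-end flow: pre-align, fit three planes per cloud,
    build ordered feature sets, estimate the reference matrix, and compose
    it with the preset matrix."""
    aux_h = np.c_[aux_cloud, np.ones(len(aux_cloud))]
    pre_aux = (preset_matrix @ aux_h.T).T[:, :3]        # pre-auxiliary point cloud data

    feature_sets = []
    for cloud in (main_cloud, pre_aux):                 # each taken as target point cloud data
        first_plane, inl1 = fit_plane_ransac(cloud)
        rest = cloud[~inl1]                             # stand-in for reference plane points
        second_plane, inl2 = fit_plane_ransac(rest)
        third_plane, _ = fit_plane_ransac(rest[~inl2])
        corner, line_pts = feature_points(
            [first_plane, second_plane, third_plane], radar_center)
        ordered = order_by_offset_angle(line_pts, corner)
        feature_sets.append(np.vstack([corner, ordered]))

    # The reference matrix maps the pre-auxiliary features onto the main features;
    # composing it with the preset matrix gives the target transformation matrix.
    reference_matrix = rigid_transform(feature_sets[1], feature_sets[0])
    target_matrix = reference_matrix @ preset_matrix
    calibrated_aux = (target_matrix @ aux_h.T).T[:, :3]  # calibration point cloud data
    return target_matrix, calibrated_aux
```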
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present invention is not limited to the above method operations, and may also perform the related operations in the radar calibration method provided in any embodiment of the present invention.
It should be noted that the above are only preferred embodiments of the present invention and the technical principles applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions can be made by those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to those embodiments and may be embodied in many other equivalent forms without departing from its spirit or scope, which is defined by the following claims.

Claims (12)

1. A radar calibration method, comprising:
acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three intersected calibration planes;
Determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data;
Determining a target transformation matrix between the main radar and the auxiliary radar based on each plane equation, and determining calibration point cloud data based on the target transformation matrix;
Wherein the determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data comprises:
Performing coordinate transformation on the auxiliary point cloud data based on a preset transformation matrix to obtain pre-auxiliary point cloud data, and determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data;
Wherein the determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data comprises:
Respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
Determining a first target plane equation corresponding to a first calibration plane based on first plane point cloud data corresponding to the target point cloud data;
determining a second target plane equation corresponding to a second calibration plane and a third target plane equation corresponding to a third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; wherein the reference plane point cloud data includes second plane point cloud data and third plane point cloud data.
2. The method of claim 1, wherein determining a target transformation matrix between the main radar and the auxiliary radar based on each of the plane equations comprises:
And determining a reference transformation matrix between the main radar and the auxiliary radar based on each plane equation, and determining a target transformation matrix between the main radar and the auxiliary radar based on the preset transformation matrix and the reference transformation matrix.
3. The method according to claim 1, wherein the method further comprises:
Selecting point cloud data with the minimum ordinate in the target point cloud data based on a preset selection quantity, and determining the average height corresponding to the selected point cloud data;
And screening the target point cloud data based on the average height and a preset height difference threshold value to obtain first plane point cloud data corresponding to the target point cloud data.
4. The method according to claim 1, wherein the method further comprises:
determining reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data; the first target plane equation is a first main plane equation when the target point cloud data is main point cloud data, and is a first auxiliary plane equation when the target point cloud data is pre-auxiliary point cloud data.
5. The method of claim 4, wherein the determining the reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data comprises:
determining parameter data between the target point cloud data and a main radar center point, and taking the point cloud data meeting a preset parameter range in the target point cloud data as reference target point cloud data; wherein the parameter data comprises distance data and/or offset angle data;
And determining reference plane point cloud data corresponding to the target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
6. The method of claim 1, wherein the determining a second target plane equation corresponding to a second calibration plane and a third target plane equation corresponding to a third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data comprises:
determining a second target plane equation corresponding to a second calibration plane based on the reference plane point cloud data corresponding to the target point cloud data;
Determining third plane point cloud data corresponding to the target point cloud data based on the reference plane point cloud data and a second target plane equation;
and determining a third target plane equation corresponding to a third calibration plane based on the third plane point cloud data.
7. The method of claim 2, wherein determining a reference transformation matrix between the main radar and the auxiliary radar based on each of the plane equations comprises:
Determining feature point coordinates corresponding to the main point cloud data and the pre-auxiliary point cloud data respectively based on three plane equations corresponding to the main point cloud data and the pre-auxiliary point cloud data respectively; the feature point coordinates comprise plane intersection point coordinates and intersection line feature point coordinates, wherein the plane intersection point coordinates represent the intersection point of the three calibration planes, and the intersection line feature point coordinates represent feature points on the intersection lines between the three calibration planes;
And performing a pose estimation operation on a main feature set generated based on the feature point coordinates corresponding to the main point cloud data and an auxiliary feature set generated based on the feature point coordinates corresponding to the pre-auxiliary point cloud data to obtain a reference transformation matrix between the main radar and the auxiliary radar.
8. The method of claim 7, wherein the determining feature point coordinates corresponding to the main point cloud data and the pre-auxiliary point cloud data, respectively, based on three plane equations corresponding to the main point cloud data and the pre-auxiliary point cloud data, respectively, comprises:
Respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
Determining three target line equations corresponding to the three calibration planes based on three plane equations corresponding to the target point cloud data, and determining plane intersection point coordinates corresponding to the target point cloud data based on each target line equation;
Based on the plane intersection point coordinates and the preset distance standard corresponding to each target line equation, respectively determining intersection line feature point coordinates corresponding to each target line equation; the preset distance standard requires that the distance between an intersection line feature point and the plane intersection point equals the preset standard distance, and that the distance between the selected feature point and the main radar center point is larger (or smaller) than the distance between the other candidate point satisfying the preset standard distance and the main radar center point.
9. The method of claim 7, wherein the method further comprises:
Respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
determining offset angle data corresponding to each intersection line feature point coordinate based on three intersection line feature point coordinates corresponding to the target point cloud data;
ordering the intersection line feature point coordinates based on the offset angle data to obtain a target feature set; and when the target point cloud data is the main point cloud data, the target feature set is the main feature set, and when the target point cloud data is the pre-auxiliary point cloud data, the target feature set is the auxiliary feature set.
10. A radar calibration device, comprising:
the point cloud data acquisition module is used for acquiring main point cloud data and auxiliary point cloud data respectively acquired by the main radar and the auxiliary radar; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three intersected calibration planes;
the plane equation determining module is used for determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data;
the point cloud data calibration module is used for determining a target transformation matrix between the main radar and the auxiliary radar based on each plane equation and determining calibration point cloud data based on the target transformation matrix;
The plane equation determination module includes:
The plane equation determining unit is used for performing coordinate transformation on the auxiliary point cloud data based on a preset transformation matrix to obtain pre-auxiliary point cloud data, and determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data;
the plane equation determining unit includes:
The first target plane equation determining subunit is used for respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining a first target plane equation corresponding to a first calibration plane based on first plane point cloud data corresponding to the target point cloud data;
A second target plane equation determining subunit, configured to determine a second target plane equation corresponding to a second calibration plane and a third target plane equation corresponding to a third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; wherein the reference plane point cloud data includes second plane point cloud data and third plane point cloud data.
11. An electronic device, the electronic device comprising:
one or more processors;
A memory for storing one or more programs;
The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the radar calibration method of any of claims 1-9.
12. A storage medium containing computer executable instructions which, when executed by a computer processor, are for performing the radar calibration method according to any one of claims 1-9.
CN202110075819.4A 2021-01-20 2021-01-20 Radar calibration method, device, equipment and storage medium Active CN113759348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110075819.4A CN113759348B (en) 2021-01-20 2021-01-20 Radar calibration method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113759348A CN113759348A (en) 2021-12-07
CN113759348B true CN113759348B (en) 2024-05-17

Family

ID=78786387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110075819.4A Active CN113759348B (en) 2021-01-20 2021-01-20 Radar calibration method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113759348B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114694138B (en) * 2022-06-01 2022-09-20 远峰科技股份有限公司 Road surface detection method, device and equipment applied to intelligent driving
CN115236690B (en) * 2022-09-20 2023-02-10 图达通智能科技(武汉)有限公司 Data fusion method and device for laser radar system and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109613543A (en) * 2018-12-06 2019-04-12 深圳前海达闼云端智能科技有限公司 Method and device for correcting laser point cloud data, storage medium and electronic equipment
CN110031824A (en) * 2019-04-12 2019-07-19 杭州飞步科技有限公司 Laser radar combined calibrating method and device
CN110148180A (en) * 2019-04-22 2019-08-20 河海大学 A kind of laser radar and camera fusing device and scaling method
CN111815716A (en) * 2020-07-13 2020-10-23 北京爱笔科技有限公司 Parameter calibration method and related device
CN112233182A (en) * 2020-12-15 2021-01-15 北京云测网络科技有限公司 Method and device for marking point cloud data of multiple laser radars

Also Published As

Publication number Publication date
CN113759348A (en) 2021-12-07

Similar Documents

Publication Publication Date Title
US11480443B2 (en) Method for calibrating relative pose, device and medium
JP6830139B2 (en) 3D data generation method, 3D data generation device, computer equipment and computer readable storage medium
EP3621034B1 (en) Method and apparatus for calibrating relative parameters of collector, and storage medium
CN111127563A (en) Combined calibration method and device, electronic equipment and storage medium
CN109270545B (en) Positioning true value verification method, device, equipment and storage medium
CN110095752B (en) Positioning method, apparatus, device and medium
CN110927708B (en) Calibration method, device and equipment of intelligent road side unit
CN109146938B (en) Method, device and equipment for calibrating position of dynamic obstacle and storage medium
CN110764111B (en) Conversion method, device, system and medium of radar coordinates and geodetic coordinates
CN109543680B (en) Method, apparatus, device, and medium for determining location of point of interest
CN113759348B (en) Radar calibration method, device, equipment and storage medium
CN110849363B (en) Pose calibration method, system and medium for laser radar and combined inertial navigation
CN112684478A (en) Parameter calibration method and device based on double antennas, storage medium and electronic equipment
CN111272181B (en) Method, device, equipment and computer readable medium for constructing map
CN111121755B (en) Multi-sensor fusion positioning method, device, equipment and storage medium
CN114049401A (en) Binocular camera calibration method, device, equipment and medium
CN112509135B (en) Element labeling method, element labeling device, element labeling equipment, element labeling storage medium and element labeling computer program product
CN111598930B (en) Color point cloud generation method and device and terminal equipment
CN109389119B (en) Method, device, equipment and medium for determining interest point region
CN117191080A (en) Calibration method, device, equipment and storage medium for camera and IMU external parameters
CN110853098A (en) Robot positioning method, device, equipment and storage medium
CN110634159A (en) Target detection method and device
CN113763457A (en) Method and device for calibrating drop terrain, electronic equipment and storage medium
CN111398961B (en) Method and apparatus for detecting obstacles
CN111950420A (en) Obstacle avoidance method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant