CN113156411B - Vehicle-mounted laser radar calibration method - Google Patents


Info

Publication number
CN113156411B
CN113156411B (application CN202110487006.6A)
Authority
CN
China
Prior art keywords
laser radar
spherical
sphere
vehicle
coordinate system
Prior art date
Legal status
Active
Application number
CN202110487006.6A
Other languages
Chinese (zh)
Other versions
CN113156411A (en)
Inventor
周奎
丁松
张友兵
付勇智
杨亚会
张宇丰
刘瀚文
赵毅
Current Assignee
Hubei University of Automotive Technology
Original Assignee
Hubei University of Automotive Technology
Priority date
Filing date
Publication date
Application filed by Hubei University of Automotive Technology
Priority claimed from application CN202110487006.6A
Publication of CN113156411A
Application granted
Publication of CN113156411B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00, of systems according to group G01S 17/00
    • G01S 7/497: Means for monitoring or calibrating

Abstract

The invention provides a vehicle-mounted laser radar calibration method, which comprises the steps of arranging a camera and spherical obstacles in the calibration field of a vehicle-mounted laser radar, obtaining the coordinates of the spherical obstacles in the vehicle body coordinate system, and sampling them in sequence to obtain an ordered coordinate point set of the characteristic calibration matrix formed by the sphere centers of the spherical obstacles; performing down-sampling, ground removal, Euclidean clustering and sphere center position clustering on the point cloud of the spherical obstacles detected by the laser radar, and, taking the sphere center position clustering result as input, calculating an ordered coordinate point set of the spherical obstacles in the laser radar coordinate system; and, taking the two ordered coordinate point sets as input, calculating the rotation-translation matrix between the vehicle body coordinate system and the laser radar coordinate system, which is the calibration result of the vehicle-mounted laser radar. The method is reasonably designed and simple to operate, enables standardized, automated and efficient calibration of the laser radar, and effectively improves the calibration accuracy of the vehicle-mounted laser radar.

Description

Vehicle-mounted laser radar calibration method
Technical Field
The invention relates to the technical field of automatic driving and intelligent networking automobiles, in particular to the field of sensor calibration, and specifically relates to a vehicle-mounted laser radar calibration method.
Background
At present, laser radars are widely applied in intelligent connected and autonomous vehicles, and quick, accurate calibration of the laser radar ensures that all subsequent development and test work proceeds smoothly.
In general, a lidar module in an autonomous vehicle sensing system is composed of one or more lidar(s) to sense the environment around the vehicle body. However, in order to fuse data among multiple laser radars or between a laser radar and other sensors, data of the sensors such as the laser radars needs to be converted into the same coordinate system, generally into a vehicle body coordinate system, and at this time, the rotational and translational relationship between the coordinate system of each laser radar and the vehicle body coordinate system needs to be calibrated, which can be expressed by a rotational matrix R and a translational vector t.
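As a minimal illustration of this relationship, the sketch below applies a rotation matrix R and translation vector t to bring lidar points into the vehicle body frame. The numeric values of R, t and the point are invented for illustration, not taken from the patent.

```python
import numpy as np

def lidar_to_body(points, R, t):
    """Map Nx3 lidar-frame points into the vehicle body frame: p' = R p + t."""
    return points @ np.asarray(R).T + np.asarray(t)

# invented calibration result: a 90-degree yaw plus a mounting offset
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([1.5, 0.0, 1.2])
pts = np.array([[10.0, 0.0, 0.0]])
print(lidar_to_body(pts, R, t))  # body-frame point (1.5, 10.0, 1.2)
```

The same R and t, once calibrated, convert every subsequent lidar frame into the common body frame for sensor fusion.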
The accurate calibration of the laser radar is the basis for an autonomous vehicle to accurately sense the external environment: the higher the calibration precision of the laser radar, the more accurately the point cloud data reflects the environment. Common calibration methods at present include recording a segment of data with the laser radar and computing the parameters off-line with a point cloud registration algorithm, or parking the vehicle in a specific pose and then calibrating the laser radar against a calibration object. In practical application, the laser radar calibration steps are tedious and complex, operation requires professionals, no standardized and automated calibration method has been formed, the accuracy and efficiency of manual calibration are low, and the calibration requirements of mass-produced autonomous vehicles cannot be met.
Disclosure of Invention
The invention aims to provide a vehicle-mounted laser radar calibration method to solve the problems that the calibration steps of the conventional vehicle-mounted laser radar are complicated and complex, the laser radar cannot be calibrated efficiently in a standardized and automatic manner, the calibration accuracy and efficiency are low, and the high-efficiency and accurate laser radar calibration work requirements required by automatic driving vehicles in batch production cannot be met.
In order to solve the technical problems, the invention provides a vehicle-mounted laser radar calibration method, which comprises the steps of firstly arranging a camera for obtaining an image of a calibration field above the middle of the calibration field of a vehicle-mounted laser radar, respectively arranging spherical obstacles at two opposite corners of the calibration field of the vehicle-mounted laser radar, then constructing a vehicle body coordinate system of a vehicle to be calibrated with the vehicle-mounted laser radar, calculating coordinates of the center of sphere of the spherical obstacle in the vehicle body coordinate system, and then sampling a characteristic calibration matrix formed by the centers of sphere of the spherical obstacles in sequence to obtain an ordered coordinate point set;
sequentially performing down-sampling, ground removing, point cloud clustering and center of sphere position clustering processing on the point cloud of the spherical obstacle detected by the vehicle-mounted laser radar, taking the obtained center of sphere position clustering result as input, obtaining the center of sphere coordinates of the spherical obstacle in a laser radar coordinate system, and performing ordered sampling on the center of sphere coordinates to obtain an ordered coordinate point set of the spherical obstacle in the laser radar coordinate system;
and finally, taking the obtained ordered coordinate point set of the spherical obstacles in the laser radar coordinate system and the ordered coordinate point set of the spherical obstacles in the vehicle body coordinate system as input, calculating the rotation-translation matrix between the vehicle body coordinate system and the laser radar coordinate system, which is the calibration result of the vehicle-mounted laser radar.
Preferably, the point cloud of the spherical obstacles detected by the vehicle-mounted laser radar is subjected to down-sampling and ground removal: point cloud data scanned by the laser radar in real time is obtained and expressed as p = [x, y, z, τ], where x, y and z are the coordinates of a point in the laser radar coordinate system and τ is the echo intensity of that point; one frame of laser radar data is obtained, and the point cloud data scanned in that frame is down-sampled and ground-removed using the voxel down-sampling algorithm and the RANSAC algorithm in the PCL point cloud library.
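A minimal voxel down-sampling sketch in plain NumPy, as a stand-in for the PCL voxel filter named above (grid size and data are illustrative):

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one representative point (the centroid) per occupied voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    buckets = {}
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)
    return np.array([np.mean(b, axis=0) for b in buckets.values()])

# two nearby points fall into one 0.1 m voxel; the far point keeps its own
cloud = np.array([[0.01, 0.02, 0.0],
                  [0.03, 0.01, 0.0],
                  [1.00, 1.00, 1.0]])
down = voxel_downsample(cloud, voxel_size=0.1)
print(down.shape)  # (2, 3)
```

PCL's `VoxelGrid` filter does the same bucketing far more efficiently on large clouds; this sketch only shows the idea.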
Preferably, the specific process of point cloud clustering on the point cloud of the spherical obstacles detected by the vehicle-mounted laser radar is as follows: taking the down-sampled, ground-removed laser radar point cloud data as input, point cloud clustering is performed with the Euclidean clustering algorithm to obtain the clustering result of one frame of laser radar point cloud data, i.e. the laser point cloud cluster of each spherical obstacle, where a point cloud cluster can be represented as P = [p1, p2, …, pn], n being the number of points contained in one point cloud cluster;
sorting the points in each obtained spherical obstacle's clustered point cloud cluster with the Timsort algorithm, from high to low echo intensity τ, intercepting the first 30% of points of the sorted queue, and dividing the intercepted point set into groups of four points;
for example, the selected and sorted point set is Q = [p1, p2, …, pk], where pi (1 ≤ i ≤ k) denotes a point and k is the number of points in the set, i.e.:
u1 = [p1, p2, p3, p4]
ui = [p(4i-3), p(4i-2), p(4i-1), p(4i)], 1 ≤ i ≤ j
where j = ⌊k/4⌋ is the number of complete groups (any remaining k mod 4 points, too few to determine a sphere, are discarded), u1 is the first group of data, uj the last group and ui the i-th group; the divided set U is represented as:
U = [u1, u2, …, uj];
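The intensity sort and grouping-by-four just described can be sketched as follows (the cluster data is invented; Python's built-in sort is itself Timsort):

```python
import numpy as np

# one cluster's points: columns x, y, z, tau (echo intensity); data invented
P = np.array([[float(i), 0.0, 0.0, float(i % 10)] for i in range(40)])

# sort by echo intensity, descending (Python's sort algorithm is Timsort)
Q = sorted(P.tolist(), key=lambda p: p[3], reverse=True)
Q = Q[: int(len(Q) * 0.3)]                                   # keep the strongest 30 %
U = [Q[i:i + 4] for i in range(0, len(Q) - len(Q) % 4, 4)]   # groups of four
print(len(Q), len(U))  # 12 3
```

Keeping only the strongest returns discards low-confidence edge hits before the sphere-fitting step.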
taking each element of the set U as input, the sphere center coordinate o corresponding to each group of data is calculated according to Cramer's rule; for example, u1, which contains the coordinates of four surface points of a spherical obstacle, corresponds to the sphere center coordinate o1 = [x1, y1, z1];
assuming that the coordinates of the sphere center are (x, y, z) and the four surface points of the spherical obstacle are (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) and (x4, y4, z4), the following is obtained from the property that the distance from any point of the surface of the spherical obstacle to the sphere center equals the radius:
(x - x1)^2 + (y - y1)^2 + (z - z1)^2 = r^2
(x - x2)^2 + (y - y2)^2 + (z - z2)^2 = r^2
(x - x3)^2 + (y - y3)^2 + (z - z3)^2 = r^2
(x - x4)^2 + (y - y4)^2 + (z - z4)^2 = r^2        (4)
the above formula (4) is modified to obtain the following formula:
2(x1 - x2)x + 2(y1 - y2)y + 2(z1 - z2)z = (x1^2 - x2^2) + (y1^2 - y2^2) + (z1^2 - z2^2)
2(x2 - x3)x + 2(y2 - y3)y + 2(z2 - z3)z = (x2^2 - x3^2) + (y2^2 - y3^2) + (z2^2 - z3^2)
2(x3 - x4)x + 2(y3 - y4)y + 2(z3 - z4)z = (x3^2 - x4^2) + (y3^2 - y4^2) + (z3^2 - z4^2)        (5)
to simplify the presentation, let
P = (x1^2 - x2^2) + (y1^2 - y2^2) + (z1^2 - z2^2)
Q = (x2^2 - x3^2) + (y2^2 - y3^2) + (z2^2 - z3^2)
R = (x3^2 - x4^2) + (y3^2 - y4^2) + (z3^2 - z4^2)
The constants on the right side of the equal sign in formula (5) are denoted P, Q and R, forming the constant-term column [P, Q, R]^T; the coefficients of x, y and z on the left side form the coefficient matrix D:

        | 2(x1 - x2)  2(y1 - y2)  2(z1 - z2) |
    D = | 2(x2 - x3)  2(y2 - y3)  2(z2 - z3) |
        | 2(x3 - x4)  2(y3 - y4)  2(z3 - z4) |
Further, let D1, D2 and D3 be the matrices obtained by replacing the first, second and third columns of D, respectively, with [P, Q, R]^T.
According to Cramer's rule, the sphere center coordinates are:
x = det(D1) / det(D),  y = det(D2) / det(D),  z = det(D3) / det(D)
the set of centroids calculated by the same data set U may be denoted as O ═ O1,o2,…oj]And carrying out normalization processing on the sphere centers calculated by the same clustering result, wherein j is the number of the sphere centers, the obtained result is the sphere center coordinate of the sphere corresponding to the set U, and the calculation mode is to calculate the weighted average of all the sphere centers, namely:
Figure BDA0003050802350000052
until the sphere center coordinates of all the spherical obstacles perceived by the laser radar are calculated, each expressed as q = [x', y', z']; the sphere center coordinate set obtained by perception is H:
H = [q1, q2, …, qi]  (3 ≤ i ≤ 6).
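The whole sphere-center solve, from four surface points through pairwise subtraction to Cramer's rule, can be sketched as below; the test points are invented and lie on a unit sphere centered at (1, 2, 3):

```python
import numpy as np

def sphere_center(p):
    """Center of the sphere through four surface points via Cramer's rule:
    subtracting the sphere equations pairwise gives 3 linear equations
    2(p_i - p_{i+1}) . c = |p_i|^2 - |p_{i+1}|^2 in the center c."""
    p = np.asarray(p, float)
    D = 2.0 * (p[:3] - p[1:])                                      # coefficient matrix D
    rhs = np.sum(p[:3] ** 2, axis=1) - np.sum(p[1:] ** 2, axis=1)  # [P, Q, R]
    detD = np.linalg.det(D)
    c = np.empty(3)
    for i in range(3):          # D1, D2, D3: column i replaced by [P, Q, R]^T
        Di = D.copy()
        Di[:, i] = rhs
        c[i] = np.linalg.det(Di) / detD
    return c

# four invented points on a unit sphere centered at (1, 2, 3)
pts = np.array([[2.0, 2, 3], [0, 2, 3], [1, 3, 3], [1, 2, 4]])
print(sphere_center(pts))  # center (1, 2, 3)
```

In practice `np.linalg.solve` would be the numerically preferred route; the explicit determinants mirror the Cramer's-rule derivation above.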
Preferably, the sphere center position clustering of the data of the spherical obstacles detected by the vehicle-mounted laser radar clusters the sphere center coordinate set H by Euclidean distance, and the specific process is as follows: randomly select one spherical obstacle and obtain its center coordinate; sequentially read the center coordinates of the remaining spherical obstacles and calculate their Euclidean distances to the first, randomly selected center; the two centers with the smallest Euclidean distances to it, together with that first center, form the three vertices of one triangle; the three center coordinates remaining after those are extracted form the three vertices of the other triangle.
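A minimal sketch of this center-position clustering, assuming six detected centers and the nearest-neighbour grouping described (coordinates invented):

```python
import numpy as np

def split_into_triangles(H):
    """Group six sphere centers into two triangles: the first center plus its
    two nearest neighbours form one triangle; the remaining three the other."""
    H = np.asarray(H, float)
    d = np.linalg.norm(H[1:] - H[0], axis=1)    # distances to the first center
    nearest = 1 + np.argsort(d)[:2]
    first = {0, *nearest.tolist()}
    second = [i for i in range(len(H)) if i not in first]
    return sorted(first), second

# two triangles roughly 20 m apart (illustrative coordinates)
H = np.array([[9, 1, 0], [9, 3, 0], [11, 2, 0],
              [-9, -1, 0], [-9, -3, 0], [-11, -2, 0]])
print(split_into_triangles(H))  # ([0, 1, 2], [3, 4, 5])
```

Because the two sphere groups sit at opposite corners of the field, intra-triangle distances are far smaller than inter-triangle ones, so this simple rule is reliable.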
Preferably, the rotational translation matrix is calculated by using a singular value decomposition algorithm with an ordered point coordinate set of the spherical obstacle in the vehicle body coordinate system and an ordered point coordinate set of the spherical obstacle in the laser radar coordinate system as inputs.
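The singular-value-decomposition step can be sketched as the standard Kabsch procedure for ordered point correspondences; this is a generic implementation, not the patent's exact code, and the test pose is invented:

```python
import numpy as np

def rigid_transform_svd(A, B):
    """R, t minimizing sum ||R a_i + t - b_i||^2 (Kabsch / SVD method)
    for ordered, corresponding Nx3 point sets A and B."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)           # cross-covariance of the centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

# recover an invented pose (90-degree yaw, small offset) from 4 correspondences
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 0.5])
A = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
B = A @ R_true.T + t_true
R_est, t_est = rigid_transform_svd(A, B)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

The ordered sampling of the two point sets is what provides the one-to-one correspondences this closed-form solution requires.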
By adopting the technical scheme, the invention has the following beneficial effects:
the vehicle-mounted laser radar calibration method is reasonable in concept and simple in operation process, can automatically and standardize the laser radar to perform high-efficiency calibration, effectively improves the calibration accuracy of the vehicle-mounted laser radar, and effectively solves the problems that the conventional vehicle-mounted laser radar is complicated and complicated in calibration step, low in manual calibration accuracy and efficiency, and needs to be operated by a professional.
The invention proposes using spherical obstacles as the auxiliary tool for vehicle-mounted laser radar calibration and designs their placement and arrangement, namely three spheres forming one calibration feature domain. This provides a sound technical basis for obtaining the ordered coordinate point set in the vehicle body coordinate system that serves as input to the singular value decomposition algorithm, and the scheme is highly feasible and realizable. After the ordered point set of the characteristic calibration object's coordinates in the vehicle body coordinate system is obtained and stored, the vehicle subsequently only needs to be parked in the calibration field as specified, and the calibration system calibrates the laser radar automatically, quickly and accurately, greatly improving calibration efficiency.
The laser radar can only scan the surface of an object, but it outputs the spatial position of points on that surface very accurately; however, the point clouds it outputs differ each time it scans the same object. A sphere has the property that every surface point is equidistant from its center, so the spatial position of the sphere center can be computed very accurately from the laser radar data: the surface has countless points, but the center is unique, and even though the laser radar returns different surface points on each scan, this property allows the unique center coordinate to be accurately calculated. Selecting the sphere as the characteristic calibration object therefore guarantees both the accuracy and the stability of the calculation result.
After the spatial coordinates of the sphere centers of the spherical obstacles are accurately obtained, an accurate ordered coordinate point set is derived by the coordinate point sampling method proposed by the invention and finally serves as input to the calculation of the calibration result; the accuracy and stability of the sphere centers and ordered coordinate point sets obtained in the earlier stages thus strongly guarantee the calibration effect.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the description in the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic flow chart of a vehicle-mounted laser radar calibration method of the present invention;
FIG. 2 is a simplified diagram of a point cloud processing flow in the vehicle-mounted laser radar calibration method of the present invention;
FIG. 3 is a schematic diagram of a clustering process of the vehicle-mounted laser radar calibration method of the present invention;
FIG. 4 is a schematic diagram of a spherical obstacle placement in the vehicle-mounted lidar calibration method of the present invention;
FIG. 5 is a schematic view of a camera placement for the vehicle-mounted lidar calibration method of the present invention;
FIG. 6 is a top view of a vehicle-mounted lidar calibration field of the vehicle-mounted lidar calibration method of the present invention;
FIG. 7 is a schematic diagram of gray value conversion of a spherical obstacle in the vehicle-mounted laser radar calibration method according to the present invention;
FIG. 8 is a flowchart of a gray value clustering algorithm in the vehicle-mounted laser radar calibration method of the present invention;
FIG. 9 is a schematic diagram of vehicle body gray scale value conversion in the vehicle-mounted lidar calibration method of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The present invention will be further explained with reference to specific embodiments.
As shown in fig. 1, the vehicle-mounted laser radar calibration method provided in this embodiment specifically includes the following steps:
(1) A group of spherical obstacles is arranged in advance at each of two opposite corners of the vehicle-mounted laser radar calibration field (in this embodiment, the upper-left and lower-right corners, as shown in fig. 4). Each group consists of three matte-surfaced spheres arranged in a triangle; the specific arrangement is shown in fig. 4. The radius of each sphere is 0.6 m, the distance between the centers of adjacent spheres in each group is 2.2 m, and the spheres are placed 8-10 m from the origin of the vehicle body coordinate system. Meanwhile, as shown in fig. 5, a high-definition camera is placed above the middle of the calibration field with its imaging plane parallel to the field plane, at a height of h meters (any feasible value such that the camera has a complete, clear view of the calibration field); the camera's intrinsic and extrinsic parameters are then calibrated and stored, as shown in fig. 4. The vehicle is then parked in the middle area of the calibration field shown in fig. 6, with the heading angle of the vehicle head between 0 and 180 degrees.
(2) The method comprises the steps of constructing a vehicle body coordinate system of a vehicle of the vehicle-mounted laser radar to be calibrated, acquiring the outline of the vehicle to be calibrated by utilizing a proposed gray value clustering algorithm based on a camera and internal and external parameters of the camera, constructing the vehicle body coordinate system by taking the outline circumscribed rectangle geometric center as the origin of the coordinate system, acquiring coordinates of the sphere centers of all spherical obstacles in the vehicle body coordinate system, and sampling the characteristic calibration matrix (namely a characteristic calibration triangle) formed by the coordinates in sequence to acquire an ordered coordinate point set of the spherical obstacles in the vehicle body coordinate system.
The specific process for acquiring the ordered point set of the spherical barrier in the vehicle body coordinate system in the step (2) is as follows:
(2.1) as shown in fig. 7, acquiring an image (the image only contains the ground and a spherical obstacle) when a vehicle to be calibrated does not enter a calibration field, performing gaussian filtering and graying processing on the acquired image to acquire a gray map of the image, classifying the gray values of the image to acquire a gray value D _ g (which is a value or a range value) of the ground and a gray value B _ g (which is a value or a range value) of the spherical obstacle, processing the gray map according to the D _ g and the B _ g to enable the gray value of a pixel point of which the gray value is in the range of the D _ g to be 0 and the gray value of the pixel point of which the gray value is in the range of the B _ g to be 1, and acquiring a map, wherein the pixel point covered by the spherical obstacle is 1 and the pixel point value of a place covered by the ground is 0;
(2.2) clustering the spherical obstacles in the image by using a gray value clustering algorithm; as shown in fig. 5, the image processed in step (2.1) is retrieved by rows and columns, and the coordinates of the pixel point with gray value 1 retrieved for the first time in each row (left-to-right retrieval) or each column (top-to-bottom retrieval) in the pixel coordinate system are saved; clustering is started from the first 1 to the end of a certain row or a certain column after 0 is touched; dividing the image into a plurality of clustering results, then carrying out secondary combination on the clustering results, randomly extracting the coordinates of pixel points in each clustering result in an image coordinate system, arranging the coordinates from small to large according to the size of a horizontal coordinate, and combining two adjacent clustering results together;
(2.3) each clustering result is the coordinate of the pixel point of the edge of a spherical obstacle in the image coordinate system, fitting a circular curve to the coordinates to obtain the center coordinates of the obtained circle, forming a center coordinate set in a two-dimensional plane, and expressing the center coordinate set as Oc:
Oc = [oc1, oc2, …, oc6]
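The circle fitting in step (2.3) can be sketched with an algebraic (Kasa) least-squares fit, a common choice, though the patent does not name its fitting method; the edge pixels here are synthetic:

```python
import numpy as np

def fit_circle(xy):
    """Least-squares (Kasa) circle fit: solve x^2 + y^2 = 2ax + 2by + c,
    then center = (a, b) and radius = sqrt(c + a^2 + b^2)."""
    xy = np.asarray(xy, float)
    A = np.column_stack([2 * xy[:, 0], 2 * xy[:, 1], np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return (a, b), np.sqrt(c + a ** 2 + b ** 2)

# synthetic edge pixels on a circle of radius 5 centered at (10, 20)
th = np.linspace(0, 2 * np.pi, 20, endpoint=False)
edge = np.column_stack([10 + 5 * np.cos(th), 20 + 5 * np.sin(th)])
center, r = fit_circle(edge)
print(np.round(center, 3), round(r, 3))  # center ~ (10, 20), radius ~ 5.0
```

The fitted center in pixel coordinates is what gets collected into the set Oc before conversion to world coordinates.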
The center coordinate set Oc is converted into the world coordinate system through the conversion relation between the pixel coordinate system and the world coordinate system; only the center coordinates in the two-dimensional world coordinate system are extracted (the Z axis is not considered for now), and they are expressed in the world coordinate system as Ow:
Ow = [ow1, ow2, …, ow6]
Wherein, the conversion relation between the pixel coordinate system and the world coordinate system is expressed as:
Zc · [u, v, 1]^T = K · [R T] · [Xw, Yw, Zw, 1]^T        (1)
where u and v in formula (1) are the coordinates of a point in the pixel coordinate system, Xw, Yw and Zw are its coordinates in the world coordinate system, Zc is the depth scale factor, K is the camera's intrinsic matrix, and R and T are the camera's extrinsic rotation and translation;
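Since the spheres and vehicle lie on known-height planes, formula (1) can be inverted for a pixel by fixing the world height Zw; a sketch with an identity pose and toy intrinsics (the patent's actual K, R, T come from camera calibration):

```python
import numpy as np

def pixel_to_world(u, v, K, R, T, Zw=0.0):
    """Back-project a pixel to the world plane Z = Zw, inverting
    s*[u,v,1]^T = K @ (R @ Xw + T); the plane constraint fixes the scale s."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    R_inv = np.linalg.inv(np.asarray(R, float))
    a = R_inv @ ray                       # Xw = s*a - b
    b = R_inv @ np.asarray(T, float)
    s = (Zw + b[2]) / a[2]
    return s * a - b

# toy setup: camera at the origin looking along +Z with simple intrinsics
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0,   0.0,  1.0]])
R = np.eye(3)
T = np.zeros(3)
print(pixel_to_world(60, 50, K, R, T, Zw=10.0))  # world point (1, 0, 10)
```

Without a plane (or depth) constraint a single pixel only determines a ray, which is why the text fixes the sphere-top and vehicle-roof heights.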
(2.4) Acquire an image after the vehicle to be calibrated enters the calibration field. As in step (2.1), obtain its gray-scale map and process it according to D_g and B_g so that pixels with gray values near D_g or B_g become 0 and all other gray values become 1, as shown in fig. 9. Traverse the image matrix by rows simultaneously from the top and bottom sides, and by columns simultaneously from the left and right sides; each direction yields one clustering result, clustered in the same way as step (2.2), as shown in fig. 8. The pixels representing the four sides of the vehicle outline are each fitted with a straight line to form the vehicle outline rectangle; the pixel coordinate of the rectangle's center point is obtained and converted into the world coordinate system according to formula (1);
(2.5) Take the exact center point of the circumscribed rectangle of the outline of the vehicle to be calibrated as the origin C of the vehicle body coordinate system, located on a plane k meters above the ground; its coordinate in the pixel coordinate system is (u_cen, v_cen). Through the geometric center, draw perpendiculars to two sides of the rectangle and obtain the coordinates, in the pixel coordinate system, of their intersections with the rectangle's sides: D(u_trs, v_trs) and F(u_lon, v_lon), where point D is the intersection of the perpendicular drawn toward the vehicle head and point F is the intersection of the perpendicular drawn toward the right of the vehicle body; the direction from C to F is the X-axis direction of the vehicle body coordinate system, and the direction from C to D is the Y-axis direction;
(2.6) Convert the coordinates of point C and point D in the pixel coordinate system into the world coordinate system through the camera's intrinsic and extrinsic parameters, calculating their world coordinates (xC, yC) and (xD, yD) according to formula (1); then calculate the heading angle θ of the vehicle body, i.e. the angle of the direction from C to D in the world frame:
θ = arctan((yD - yC) / (xD - xC))
At this time, the coordinates of each sphere center in the vehicle body coordinate system are calculated as:
[x', y']^T = R(θ)^(-1) · [x - xC, y - yC]^T,  z' = r - k
where R(θ) is the two-dimensional rotation matrix for the heading angle θ.
where (x, y) is the coordinate of the sphere center in the two-dimensional world coordinate system and (x', y', r - k) is its coordinate in the vehicle body coordinate system, the vertical coordinate being the difference between the sphere radius r and k. The coordinates of the centers of the three spheres up1, up2 and up3 in front of the vehicle body shown in fig. 4 are calculated in the vehicle body coordinate system, and the set of center coordinates is represented as Hup, namely:
Hup = [(x11, y11, z11), (x22, y22, z22), (x33, y33, z33)]
where (x11, y11, z11), (x22, y22, z22) and (x33, y33, z33) are the coordinates of the centers of the three spheres up1, up2 and up3 in the vehicle body coordinate system;
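Steps (2.5)-(2.6) reduce to a 2-D rotation of world coordinates about the body origin by the heading angle; a sketch under an assumed sign convention (the patent's exact convention may differ):

```python
import numpy as np

def world_to_body_2d(p, origin, theta):
    """Express a 2-D world point in the vehicle body frame: subtract the body
    origin, then apply the inverse rotation for heading angle theta.
    (Sign convention assumed here, not taken from the patent.)"""
    c, s = np.cos(theta), np.sin(theta)
    R_inv = np.array([[c, s], [-s, c]])   # transpose of the 2-D rotation matrix
    return R_inv @ (np.asarray(p, float) - np.asarray(origin, float))

# body frame yawed 90 degrees in the world; a point 3 m along world-Y
out = world_to_body_2d([0.0, 3.0], [0.0, 0.0], np.pi / 2)
print(np.round(out, 6))  # (3, 0) in the body frame
```

The third body coordinate is then set to r - k as described, since all sphere tops sit at a known height.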
(2.7) In the same manner as step (2.6), the coordinates of the centers of the three spheres down1, down2 and down3 at the rear of the vehicle body shown in fig. 4 are calculated in the vehicle body coordinate system and expressed as Hdown.
(2.8) Take the three spheres at the front left of the vehicle body (spheres up1, up2 and up3 in fig. 4) as one group of spherical obstacles and the three at the rear right (spheres down1, down2 and down3 in fig. 4) as the other group; the centers of the three spheres in each group form an equilateral triangle. Starting from the center farthest from the origin of the vehicle body coordinate system, sample the three sides of each triangle in the clockwise direction, taking a point every t centimeters, and obtain the coordinates of each sampling point in the vehicle body coordinate system, each with vertical coordinate r - k. The sampled ordered point sets of the two groups of spherical obstacles in the vehicle body coordinate system are stored separately, giving two ordered coordinate point sets denoted point_head and point_tail; taking point_head as an example:
point_head = [q1, q2, …, q3n]
where qi denotes the i-th sampling point.
where (x11, y11, z11), (x22, y22, z22) and (x33, y33, z33) are the coordinates of the three sphere centers in the vehicle body coordinate system, and the number of sampling points n and the sampling points of the three sides are computed as:
n = ⌊d / t⌋
qi = pa + (i · t / d) · (pb - pa),  i = 0, 1, …, n - 1
where d is the side length, pa and pb are the two sphere centers bounding a side, n is the number of sampling points of each side, i is the serial number of the sampling point, and t is the sampling interval.
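The edge sampling of step (2.8) can be sketched as linear interpolation along each triangle side; a 3-4-5 right triangle is used below so the point count is easy to verify (the patent's triangles are equilateral with 2.2 m sides):

```python
import numpy as np

def sample_edges(vertices, t):
    """Ordered samples along the three sides of a triangle, one point every
    t units, walking the sides in a fixed vertex order; each side keeps its
    starting vertex and stops short of the next, so vertices are not doubled."""
    V = np.asarray(vertices, float)
    out = []
    for a, b in [(0, 1), (1, 2), (2, 0)]:
        d = np.linalg.norm(V[b] - V[a])
        n = int(d // t)                          # sampling points on this side
        for i in range(n):
            out.append(V[a] + (i * t / d) * (V[b] - V[a]))
    return np.array(out)

# illustrative 3-4-5 triangle, sampled every 1 unit: 3 + 4 + 5 = 12 points
tri = [[0.0, 0.0, 0.0], [3.0, 0.0, 0.0], [3.0, 4.0, 0.0]]
pts = sample_edges(tri, t=1.0)
print(len(pts))  # 12
```

Because both point sets are sampled in the same fixed order starting from a well-defined vertex, the body-frame and lidar-frame samples correspond one to one.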
(3) The method comprises the steps of carrying out down-sampling, ground removing, point cloud clustering and center of sphere position clustering processing on a point cloud of a spherical obstacle detected by a vehicle-mounted laser radar, taking a center of sphere position clustering result as input, obtaining a center of sphere coordinate of the spherical obstacle in a laser radar coordinate system, and carrying out ordered sampling on the center of sphere coordinate to obtain an ordered coordinate point set of the spherical obstacle in the laser radar coordinate system.
The process of down-sampling and ground removal on the point cloud of the spherical obstacles detected by the vehicle-mounted laser radar in step (3) is as follows: point cloud data scanned by the laser radar in real time is acquired and expressed as p = [x, y, z, τ], where x, y and z are the coordinates of a point in the laser radar coordinate system and τ is the echo intensity of that point, i.e. the reflection intensity of the feature object to the laser. One frame of laser radar point cloud data is collected and down-sampled and ground-removed using the voxel down-sampling and RANSAC algorithms in the PCL point cloud library. Ground removal means removing the points the laser radar returns from the ground in that frame, since points on the ground would otherwise affect the final calculation; i.e. the RANSAC algorithm is used to segment the laser radar point cloud into ground and non-ground points.
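A minimal RANSAC ground-segmentation sketch in plain NumPy (the patent uses the PCL implementation; the thresholds and the synthetic cloud here are invented):

```python
import numpy as np

def ransac_ground(points, dist_thresh=0.05, iters=200, seed=0):
    """Minimal RANSAC plane fit: repeatedly fit a plane through three random
    points and keep the plane with the most inliers; returns the inlier mask."""
    rng = np.random.default_rng(seed)
    P = np.asarray(points, float)
    best = np.zeros(len(P), bool)
    for _ in range(iters):
        a, b, c = P[rng.choice(len(P), 3, replace=False)]
        n = np.cross(b - a, c - a)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                       # degenerate (collinear) sample
            continue
        inliers = np.abs((P - a) @ (n / norm)) < dist_thresh
        if inliers.sum() > best.sum():
            best = inliers
    return best

# synthetic cloud: 200 near-planar ground points plus a cluster 1 m above them
rng = np.random.default_rng(1)
ground = np.column_stack([rng.uniform(-5, 5, (200, 2)), rng.normal(0, 0.01, 200)])
blob = rng.normal([2.0, 2.0, 1.0], 0.1, (30, 3))
cloud = np.vstack([ground, blob])
mask = ransac_ground(cloud)
```

Dropping `cloud[mask]` leaves the non-ground points (the spherical obstacles) for the subsequent Euclidean clustering; PCL's `SACSegmentation` performs the same role in the described pipeline.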
The specific process of performing point cloud clustering on the point cloud data of the spherical obstacles detected by the vehicle-mounted laser radar in step (3) is as follows: the laser radar point cloud data subjected to down-sampling and ground removal are used as input, and point cloud clustering is performed with the Euclidean clustering algorithm to obtain the clustering result of one frame of laser radar point cloud data, namely a laser point cloud cluster for each spherical obstacle, where one point cloud cluster can be represented as P = [p_1, p_2, …, p_n], with n the number of points contained in the cluster. As shown in fig. 2, for the points in the laser point cloud cluster of each spherical obstacle, the Timsort algorithm is used to sort the points of the cluster from high to low by the echo intensity τ of the point cloud, the first 30% of the sorted queue is intercepted, and the intercepted point set is divided into groups of four points. For example, let the selected and sorted point set be Q = [p_1, p_2, …, p_k], where p_i (1 ≤ i ≤ k) denotes a point in the set and k is the number of points, i.e.:
u_1 = [p_1, p_2, p_3, p_4]
u_i = [p_{4i-3}, p_{4i-2}, p_{4i-1}, p_{4i}], (1 ≤ i ≤ ⌊k/4⌋)
u_j = [p_{4j-3}, p_{4j-2}, p_{4j-1}, p_{4j}], (j = ⌊k/4⌋)
where u_1 is the first group of data, u_j the last group, and u_i the i-th group; the divided set U is represented as:
U = [u_1, u_2, …, u_j];
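The intensity sort and grouping above translate directly into a short Python sketch (Python's built-in sort is Timsort, the algorithm the text names). The function name is illustrative; the 30% cut and group size of four are from the text.

```python
def group_points(cloud):
    """cloud: list of (x, y, z, tau) tuples for one point-cloud cluster.

    Sort by echo intensity tau descending (Python's sorted() is Timsort),
    keep the top 30% of the queue, and split the kept points into
    complete groups of four for the sphere-center calculation.
    """
    q = sorted(cloud, key=lambda p: p[3], reverse=True)
    keep = max(4, int(len(q) * 0.3))        # need at least one quadruple
    q = q[:keep]
    # drop the trailing remainder so every group has exactly four points
    return [q[i:i + 4] for i in range(0, len(q) - len(q) % 4, 4)]
```

High-intensity returns are the most reliable sphere-surface hits, which is why only the top of the sorted queue feeds the sphere fit.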
The set U is used as input, and the sphere center coordinate o corresponding to each group of data is calculated according to Cramer's rule, e.g. o_1 = [x_1, y_1, z_1] for the group u_1. Assume that the sphere center is (x, y, z) and that the four points on the surface of the spherical obstacle are (x_1, y_1, z_1), (x_2, y_2, z_2), (x_3, y_3, z_3) and (x_4, y_4, z_4). From the property that the distance from any point on the surface of the spherical obstacle to the sphere center equals the radius r, the following is obtained:
(x_1 − x)² + (y_1 − y)² + (z_1 − z)² = r²
(x_2 − x)² + (y_2 − y)² + (z_2 − z)² = r²
(x_3 − x)² + (y_3 − y)² + (z_3 − z)² = r²
(x_4 − x)² + (y_4 − y)² + (z_4 − z)² = r²    (4)
Subtracting the remaining equations of (4) from the first one eliminates r² and the quadratic terms, giving the linear system:
2(x_1 − x_2)x + 2(y_1 − y_2)y + 2(z_1 − z_2)z = (x_1² − x_2²) + (y_1² − y_2²) + (z_1² − z_2²)
2(x_1 − x_3)x + 2(y_1 − y_3)y + 2(z_1 − z_3)z = (x_1² − x_3²) + (y_1² − y_3²) + (z_1² − z_3²)
2(x_1 − x_4)x + 2(y_1 − y_4)y + 2(z_1 − z_4)z = (x_1² − x_4²) + (y_1² − y_4²) + (z_1² − z_4²)    (5)
To simplify the presentation, let
a_1 = 2(x_1 − x_2), b_1 = 2(y_1 − y_2), c_1 = 2(z_1 − z_2), P = (x_1² − x_2²) + (y_1² − y_2²) + (z_1² − z_2²)
a_2 = 2(x_1 − x_3), b_2 = 2(y_1 − y_3), c_2 = 2(z_1 − z_3), Q = (x_1² − x_3²) + (y_1² − y_3²) + (z_1² − z_3²)
a_3 = 2(x_1 − x_4), b_3 = 2(y_1 − y_4), c_3 = 2(z_1 − z_4), R = (x_1² − x_4²) + (y_1² − y_4²) + (z_1² − z_4²)
The constants on the right of the equal signs are denoted P, Q and R and form the constant-term vector [P, Q, R]^T; the coefficients of x, y and z on the left form the coefficient determinant D:
D = | a_1 b_1 c_1 |
    | a_2 b_2 c_2 |
    | a_3 b_3 c_3 |
Further let D_1, D_2 and D_3 be the determinants obtained by replacing the first, second and third column of D, respectively, with [P, Q, R]^T:
D_1 = | P b_1 c_1 |    D_2 = | a_1 P c_1 |    D_3 = | a_1 b_1 P |
      | Q b_2 c_2 |          | a_2 Q c_2 |          | a_2 b_2 Q |
      | R b_3 c_3 |          | a_3 R c_3 |          | a_3 b_3 R |
According to Cramer's rule, the sphere center coordinates are respectively:
x = D_1 / D,  y = D_2 / D,  z = D_3 / D
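The Cramer-rule sphere fit above can be checked with a short NumPy sketch. The function name is illustrative; the construction of D and [P, Q, R]^T follows equations (5) and the definitions of a_i, b_i, c_i directly.

```python
import numpy as np

def sphere_center(p1, p2, p3, p4):
    """Sphere center from four surface points via Cramer's rule.

    Subtracting pairs of the four sphere equations yields the linear
    system D @ [x, y, z]^T = [P, Q, R]^T used in the text.
    """
    pts = np.array([p1, p2, p3, p4], dtype=float)
    # Coefficient rows (a_i, b_i, c_i) = 2 * (p1 - p_{i+1})
    D = 2.0 * (pts[0] - pts[1:])                     # 3x3 matrix
    # Constant terms P, Q, R = |p1|^2 - |p_{i+1}|^2
    PQR = (pts[0] ** 2).sum() - (pts[1:] ** 2).sum(axis=1)
    detD = np.linalg.det(D)
    center = np.empty(3)
    for k in range(3):
        Dk = D.copy()
        Dk[:, k] = PQR                               # replace k-th column
        center[k] = np.linalg.det(Dk) / detD         # Cramer's rule
    return center
```

For example, four points at distance 2 from (1, 2, 3), such as (3, 2, 3), (1, 4, 3), (1, 2, 5) and (1, 2, 1), recover the center (1, 2, 3) exactly; the fit fails (D singular) only when the four points are coplanar.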
The set of sphere centers calculated from the same set U may be denoted O = [o_1, o_2, …, o_j], where j is the number of sphere centers; the sphere centers calculated from the same clustering result are then fused, the calculation being the weighted average of all the sphere centers:
o = (Σ_{i=1}^{j} w_i · o_i) / (Σ_{i=1}^{j} w_i)
This is repeated until the sphere center coordinates of every spherical obstacle sensed by the laser radar have been calculated, each denoted q = [x, y, z]; the ordered coordinate point set of the sphere centers of the spherical obstacles in the laser radar coordinate system is then
H = [q_1, q_2, …, q_i] (3 ≤ i ≤ 6).
If the number of clusters in the laser radar point cloud data is 6, i.e. the field of view of the laser radar covers the whole periphery of the vehicle body, the sphere centers of the spherical obstacles computed in the laser radar coordinate system must be clustered again in order to obtain, separately, the coordinates of the two groups of spherical obstacles at the upper-left and lower-right corners of the vehicle body coordinate system. The clustering method is as follows: as shown in fig. 3, any one sphere center is selected, its Euclidean distances to the other five sphere centers are calculated and the results sorted; the two centers with the smallest distances, together with the selected center, form one sphere-center coordinate set, and the remaining three centers form the other set. Starting from the sphere center closest to the laser radar, values are sampled in the same manner as in step (2.8), and the coordinates obtained by sampling are stored in lidar_head and lidar_tail respectively.
If the number of clusters in the laser radar point cloud data is 3, i.e. the installation position of the laser radar only allows it to sense the spherical obstacles at the upper-left corner or at the lower-right corner, no re-clustering is needed and the coordinate sampling is performed directly: values are sampled in the same manner as in step (2.8), and the coordinates obtained by sampling are stored in lidar_head or lidar_tail accordingly.
The sphere-center position clustering of step (3) on the point cloud data of the spherical obstacles detected by the vehicle-mounted laser radar clusters the sphere-center coordinate set H according to the Euclidean distance; the specific process is as follows: as shown in fig. 3, the sphere center W of one spherical obstacle is selected at random and its coordinates are obtained; the coordinates of the remaining sphere centers are read in turn and their Euclidean distances to W are calculated; the two sphere centers with the smallest Euclidean distance to W form, together with W, the three vertices of one triangle, and the remaining three sphere centers form the three vertices of the other triangle.
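The nearest-neighbour split of the six sphere centers into the two corner triangles can be sketched as follows; a minimal NumPy illustration with an illustrative function name, assuming the two corner groups are well separated as the calibration field layout implies.

```python
import numpy as np

def split_into_triangles(centers):
    """Split six sphere centers into the two corner groups.

    Pick one center, find its two nearest neighbours (the other two
    spheres of the same triangle); the remaining three centers form
    the second triangle.
    """
    centers = np.asarray(centers, dtype=float)
    d = np.linalg.norm(centers[1:] - centers[0], axis=1)
    nearest = 1 + np.argsort(d)[:2]       # indices of the two closest
    group_a = [0, *nearest.tolist()]
    group_b = [i for i in range(6) if i not in group_a]
    return centers[group_a], centers[group_b]
```

Two interleaved triangles, one near the origin and one around (10, 10), are separated correctly because every within-triangle distance is far smaller than any between-triangle distance.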
The calculation process of the rotation-translation matrix between the vehicle body coordinate system and the laser radar coordinate system in step (3) is as follows: point_head or point_tail and lidar_head or lidar_tail, i.e. the obtained ordered coordinate point set of the spherical obstacles in the laser radar coordinate system and the obtained ordered coordinate point set of the spherical obstacles in the vehicle body coordinate system, are taken as input, and the rotation-translation matrix between the two coordinate systems is calculated with the Singular Value Decomposition (SVD) algorithm in the PCL point cloud library; the calculated rotation-translation matrix is the calibration result of the vehicle-mounted laser radar.
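The SVD step above is the standard Kabsch procedure for rigid alignment of two ordered, corresponding point sets. The following is a self-contained NumPy sketch of that procedure (the patent uses the PCL implementation); the function name is illustrative.

```python
import numpy as np

def rigid_transform_svd(src, dst):
    """Rotation R and translation t such that dst ≈ R @ src + t, via SVD.

    src, dst: (N, 3) ordered, corresponding point sets, e.g. sphere
    centers in the lidar frame and in the vehicle-body frame.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t
```

With noiseless correspondences (as in the sphere-center sets here), the recovered (R, t) reproduces the true rigid transform exactly, which is why the ordering of both point sets must match.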
The invention has a reasonable concept and a simple operating procedure; it calibrates the laser radar automatically, efficiently and in a standardized manner, effectively improves the calibration accuracy of the vehicle-mounted laser radar, and effectively solves the problems of existing vehicle-mounted laser radar calibration, namely cumbersome calibration steps, the low accuracy and efficiency of manual calibration, and the need for professional operators.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail with reference to the foregoing examples, those skilled in the art will appreciate that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (3)

1. A calibration method of a vehicle-mounted laser radar, characterized in that a camera for obtaining an image of the calibration field is arranged above the middle of the calibration field of the vehicle-mounted laser radar, spherical obstacles are respectively arranged at two opposite corners of the calibration field, a vehicle body coordinate system of the vehicle to be calibrated is constructed, the sphere-center coordinates of the spherical obstacles in the vehicle body coordinate system are calculated, and the characteristic calibration matrix formed by the sphere centers of the spherical obstacles is then sampled in order to obtain the ordered coordinate point set of the characteristic calibration matrix;
sequentially performing down-sampling, ground removing, point cloud clustering and center of sphere position clustering on the point cloud data of the spherical obstacle detected by the vehicle-mounted laser radar, taking the obtained center of sphere position clustering result as input, acquiring the center of sphere coordinates of the spherical obstacle in a laser radar coordinate system, and performing ordered sampling on the center of sphere coordinates to acquire an ordered coordinate point set of the spherical obstacle in the laser radar coordinate system;
finally, the obtained ordered coordinate point set of the spherical obstacles in the laser radar coordinate system and the ordered coordinate point set of the spherical obstacles in the vehicle body coordinate system are taken as input, and the rotation-translation matrix between the vehicle body coordinate system and the laser radar coordinate system, namely the calibration result of the vehicle-mounted laser radar, is calculated;
firstly, point cloud data scanned by the laser radar are acquired in real time, and an acquired point is expressed as p = [x, y, z, τ], where x, y and z express the coordinates of the point in the laser radar coordinate system and τ expresses the echo intensity of the point; one frame of laser radar data is collected, and the point cloud data scanned in real time by that frame are down-sampled and ground-removed using the voxel down-sampling algorithm and the RANSAC algorithm in the PCL point cloud library;
the specific process of performing point cloud clustering on the point cloud data of the spherical obstacles detected by the vehicle-mounted laser radar is as follows: the laser radar point cloud data subjected to down-sampling and ground removal are used as input, and point cloud clustering is performed with the Euclidean clustering algorithm to obtain the clustering result of one frame of laser radar point cloud data, namely a laser point cloud cluster for each spherical obstacle, where one point cloud cluster can be represented as P = [p_1, p_2, …, p_n], with n the number of points contained in the cluster;
the points in the obtained clustered point cloud cluster of each spherical obstacle are sorted with the Timsort sorting algorithm from high to low by the echo intensity τ of the point cloud, the first 30% of the sorted queue is intercepted, and the intercepted point set is divided into groups of four points;
the selected and sorted point set is Q = [p_1, p_2, …, p_k], where p_i (1 ≤ i ≤ k) denotes a point in the set and k is the number of points, i.e.:
u_1 = [p_1, p_2, p_3, p_4]
u_i = [p_{4i-3}, p_{4i-2}, p_{4i-1}, p_{4i}], (1 ≤ i ≤ ⌊k/4⌋)
u_j = [p_{4j-3}, p_{4j-2}, p_{4j-1}, p_{4j}], (j = ⌊k/4⌋)
where u_1 is the first group of data, u_j the last group, and u_i the i-th group; the divided set U is represented as:
U = [u_1, u_2, …, u_j];
each element of the set U is taken as input, and the sphere center coordinate o corresponding to each group of data is calculated according to Cramer's rule; u_1 corresponds to the sphere center coordinate o_1 = [x_1, y_1, z_1], u_1 containing the coordinates of four points on the surface of the spherical obstacle;
assuming that the sphere center is (x, y, z) and that the four points on the surface of the spherical obstacle are (x_1, y_1, z_1), (x_2, y_2, z_2), (x_3, y_3, z_3) and (x_4, y_4, z_4), the following is obtained from the property that the distance from any point on the surface of the spherical obstacle to the sphere center equals the radius r:
(x_1 − x)² + (y_1 − y)² + (z_1 − z)² = r²
(x_2 − x)² + (y_2 − y)² + (z_2 − z)² = r²
(x_3 − x)² + (y_3 − y)² + (z_3 − z)² = r²
(x_4 − x)² + (y_4 − y)² + (z_4 − z)² = r²    (4)
subtracting the remaining equations of formula (4) from the first one yields the following linear system:
2(x_1 − x_2)x + 2(y_1 − y_2)y + 2(z_1 − z_2)z = (x_1² − x_2²) + (y_1² − y_2²) + (z_1² − z_2²)
2(x_1 − x_3)x + 2(y_1 − y_3)y + 2(z_1 − z_3)z = (x_1² − x_3²) + (y_1² − y_3²) + (z_1² − z_3²)
2(x_1 − x_4)x + 2(y_1 − y_4)y + 2(z_1 − z_4)z = (x_1² − x_4²) + (y_1² − y_4²) + (z_1² − z_4²)    (5)
to simplify the presentation, let
Figure FDA0003588735380000032
Figure FDA0003588735380000033
The constants on the right of the equal signs of formula (5) are denoted P, Q and R and form the constant-term vector [P, Q, R]^T; the coefficients of x, y and z on the left form the coefficient determinant D:
D = | a_1 b_1 c_1 |
    | a_2 b_2 c_2 |
    | a_3 b_3 c_3 |
Further let D_1, D_2 and D_3 be the determinants obtained by replacing the first, second and third column of D, respectively, with [P, Q, R]^T:
D_1 = | P b_1 c_1 |    D_2 = | a_1 P c_1 |    D_3 = | a_1 b_1 P |
      | Q b_2 c_2 |          | a_2 Q c_2 |          | a_2 b_2 Q |
      | R b_3 c_3 |          | a_3 R c_3 |          | a_3 b_3 R |
According to Cramer's rule, the sphere center coordinates are respectively:
x = D_1 / D,  y = D_2 / D,  z = D_3 / D
the set of sphere centers calculated from the same data set U may be denoted O = [o_1, o_2, …, o_j], where j is the number of sphere centers; the sphere centers calculated from the same clustering result are fused, the result being the sphere center coordinate of the sphere corresponding to the set U, the calculation being the weighted average of all the sphere centers:
o = (Σ_{i=1}^{j} w_i · o_i) / (Σ_{i=1}^{j} w_i)
this is repeated until the sphere center coordinates of all the spherical obstacles perceived by the laser radar have been calculated, each expressed as q = [x', y', z']; the sphere-center coordinate set obtained by the perception calculation is then
H = [q_1, q_2, …, q_i] (3 ≤ i ≤ 6).
2. The vehicle-mounted laser radar calibration method according to claim 1, wherein the sphere-center position clustering on the point cloud data of the spherical obstacles detected by the vehicle-mounted laser radar clusters the sphere-center coordinate set H according to the Euclidean distance, the specific process being as follows: the sphere center of one spherical obstacle is selected at random and its coordinates are obtained; the sphere-center coordinates of the remaining spherical obstacles are read in turn and their Euclidean distances to the first, randomly selected sphere center are calculated; the sphere-center coordinates of the two spherical obstacles with the smallest Euclidean distance to it form, together with it, the three vertices of one triangle, and the three sphere-center coordinates remaining after those two are extracted form the three vertices of the other triangle.
3. The vehicle-mounted lidar calibration method of claim 1, wherein: the rotation translation matrix is calculated by taking an ordered point coordinate set of the spherical obstacle in a vehicle body coordinate system and an ordered point coordinate set of the spherical obstacle in a laser radar coordinate system as input and using a singular value decomposition algorithm.
CN202110487006.6A 2021-05-03 2021-05-03 Vehicle-mounted laser radar calibration method Active CN113156411B (en)

Publications (2)

Publication Number Publication Date
CN113156411A CN113156411A (en) 2021-07-23
CN113156411B true CN113156411B (en) 2022-05-20




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant