CN114935747B - Laser radar calibration method, device, equipment and storage medium

Laser radar calibration method, device, equipment and storage medium

Info

Publication number
CN114935747B
Authority
CN
China
Prior art keywords
point cloud
coordinate system
transformation matrix
vehicle body
radar
Prior art date
Legal status
Active
Application number
CN202210477041.4A
Other languages
Chinese (zh)
Other versions
CN114935747A (en)
Inventor
赵学思
夏冰冰
石拓
Current Assignee
Suzhou Yijing Technology Co ltd
Original Assignee
Suzhou Yijing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Yijing Technology Co ltd filed Critical Suzhou Yijing Technology Co ltd
Priority to CN202210477041.4A
Publication of CN114935747A
Application granted
Publication of CN114935747B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a laser radar calibration method, device, equipment and storage medium. The laser radar calibration method comprises the following steps: obtaining a first transformation matrix from a scanning device coordinate system to a radar coordinate system; acquiring a first point cloud acquired by the scanning device and a second point cloud of a vehicle body; performing point cloud registration on the first point cloud and the second point cloud to obtain a second transformation matrix from the scanning device coordinate system to a vehicle body coordinate system; and determining a third transformation matrix from the radar coordinate system to the vehicle body coordinate system based on the first transformation matrix and the second transformation matrix. In this method, the external parameters of the laser radar are calibrated from the first transformation matrix (scanning device coordinate system to radar coordinate system) and the second transformation matrix (scanning device coordinate system to vehicle body coordinate system), so that the radar coordinate system is calibrated to the vehicle body coordinate system.

Description

Laser radar calibration method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of laser radar technologies, and in particular, to a method, an apparatus, a device, and a storage medium for calibrating a laser radar.
Background
Lidar is an object detection technology. It uses a laser as the signal light source: a laser beam is emitted toward a target object and the reflected signal is collected, from which information such as the target's azimuth and speed is obtained. Lidar offers high measurement accuracy and strong immunity to interference, and is widely used in remote sensing, surveying, autonomous driving, robotics, and other fields.
Currently, autonomous vehicles are equipped with various types of sensors, including lidar. How to calibrate the radar coordinate system of a lidar to the vehicle body coordinate system is a problem that urgently needs to be solved.
Disclosure of Invention
The application provides a laser radar calibration method, device, equipment and storage medium, so as to achieve calibration of a laser radar coordinate system to a vehicle body coordinate system.
In a first aspect, the present application provides a method for calibrating a laser radar, including: obtaining a first transformation matrix from a scanning device coordinate system to a radar coordinate system; acquiring a first point cloud acquired by a scanning device and a second point cloud of the vehicle body, wherein the first point cloud is obtained by the scanning device through scanning at least one part of the vehicle body, and the second point cloud is obtained by performing discrete sampling on a model of the vehicle body or performing point cloud simulation on the model of the vehicle body; performing point cloud registration on the first point cloud and the second point cloud to obtain a second transformation matrix from a scanning equipment coordinate system to a vehicle body coordinate system; and determining a third transformation matrix from the radar coordinate system to the vehicle body coordinate system according to the first transformation matrix and the second transformation matrix, wherein the third transformation matrix is used for calibrating the laser radar.
In some possible implementations, obtaining a first transformation matrix of the scanning device coordinate system to the radar coordinate system includes: acquiring a third point cloud acquired by a scanning device and a fourth point cloud acquired by a laser radar, wherein an overlapping area exists between a field of view of the scanning device and the field of view of the laser radar, and at least one part of a characteristic object is arranged in the overlapping area; and carrying out point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
In some possible embodiments, performing point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix from a scanning device coordinate system to a radar coordinate system, including: estimating the initial pose of the laser radar relative to the scanning equipment according to the third point cloud and the fourth point cloud; and according to the initial pose, performing point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
In some possible embodiments, determining a third transformation matrix of the radar coordinate system to the vehicle body coordinate system from the first transformation matrix and the second transformation matrix comprises: the third transformation matrix is obtained by multiplying the inverse of the first transformation matrix by the second transformation matrix.
In some possible embodiments, after determining the third transformation matrix of the radar coordinate system to the vehicle body coordinate system, the method further includes: and determining the displacement and the rotation angle of the radar coordinate system relative to the vehicle body coordinate system according to the third transformation matrix.
In a second aspect, the present application provides a calibration device for a laser radar, where the calibration device may be a chip or a system on a chip in the laser radar, or may be a functional module in the laser radar for implementing the method according to the first aspect and any possible implementation manner thereof. The calibration device may implement the functions performed by the lidar in the first aspect and any possible implementation manners thereof, where the functions may be implemented by hardware executing corresponding software. Such hardware or software includes one or more modules corresponding to the functions described above. The calibration device comprises: an acquisition module, configured to acquire a first transformation matrix from a scanning device coordinate system to a radar coordinate system; the acquisition module is further configured to acquire a first point cloud acquired by the scanning device and a second point cloud of the vehicle body, where the first point cloud is obtained by the scanning device by scanning at least a part of the vehicle body, and the second point cloud is obtained by performing discrete sampling on a model of the vehicle body or by performing point cloud simulation on the model of the vehicle body; a point cloud registration module, configured to perform point cloud registration on the first point cloud and the second point cloud to obtain a second transformation matrix from the scanning device coordinate system to a vehicle body coordinate system; and a determining module, configured to determine a third transformation matrix from the radar coordinate system to the vehicle body coordinate system according to the first transformation matrix and the second transformation matrix, where the third transformation matrix is used for calibrating the laser radar to the vehicle body coordinate system.
In some possible embodiments, the obtaining module is configured to obtain a third point cloud acquired by the scanning device and a fourth point cloud acquired by the laser radar, where a field of view of the scanning device and a field of view of the laser radar have an overlapping area, and at least a part of the feature object is disposed in the overlapping area; the point cloud registration module is used for: and carrying out point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
In some possible embodiments, the point cloud registration module is configured to estimate an initial pose of the laser radar with respect to the scanning device according to the third point cloud and the fourth point cloud; and according to the initial pose, performing point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
In some possible implementations, the determining module is configured to obtain the third transformation matrix by multiplying an inverse of the first transformation matrix by the second transformation matrix.
In some possible embodiments, the determining module is configured to determine, after determining the third transformation matrix, a displacement and a rotation angle of the radar coordinate system with respect to the vehicle body coordinate system according to the third transformation matrix.
In a third aspect, the present application provides a calibration device for a lidar, including: a memory storing computer executable instructions; a processor, coupled to the memory, for executing computer-executable instructions to implement the method according to the first aspect and any possible implementation thereof.
In a fourth aspect, the present application provides a calibration system for a lidar, including: N radars and a calibration device as described in the third aspect and any possible implementation thereof.
In a fifth aspect, the present application provides a computer storage medium having stored thereon computer executable instructions which, when executed by a processor, are capable of carrying out the method according to the first aspect and any one of its possible embodiments.
Compared with the prior art, the technical scheme provided by the application has the beneficial effects that:
in the method, the calibration device calibrates the external parameters of the laser radar according to the first transformation matrix from the scanning device coordinate system to the radar coordinate system and the second transformation matrix from the scanning device coordinate system to the vehicle body coordinate system, so as to determine the third transformation matrix from the radar coordinate system to the vehicle body coordinate system; the radar coordinate system is thereby calibrated to the vehicle body coordinate system. Furthermore, the calibration from the radar coordinate system to the vehicle body coordinate system can be achieved through point cloud registration and transformation matrix calculation alone, which is easy to operate. In addition, the second transformation matrix is calculated from the actually measured vehicle body point cloud and the vehicle body point cloud corresponding to the vehicle body model, and therefore has high accuracy, so the accuracy of calibration using the third transformation matrix is also high.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the scope of the application.
Drawings
Fig. 1 is a schematic structural diagram of a laser radar in the related art;
FIG. 2 is a schematic structural diagram of a laser radar calibration system according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a first implementation of a method for calibrating a laser radar according to an embodiment of the present application;
fig. 4 is a schematic diagram of a first scan scenario in an embodiment of the present application;
FIG. 5 is a schematic diagram of a second scan scenario in an embodiment of the present application;
FIG. 6 is a schematic flow chart of a second implementation of a method for calibrating a lidar according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a calibration device of a laser radar according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a calibration device of a lidar according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical solutions described in the present application, the following description is made by specific examples.
LiDAR (light detection and ranging) is an object detection technology. A lidar emits laser beams through its laser; the beams are diffusely reflected when they encounter a target object, the reflected beams are received by a detector, and characteristic quantities of the target object such as distance, azimuth, height, speed, attitude, and shape are determined from the emitted and reflected beams. Lidar has a very wide range of applications. Beyond its use in the military field, it is now widely used in everyday applications, including but not limited to: intelligent driving vehicles, intelligently piloted aircraft, three-dimensional (3D) printing, virtual reality, augmented reality, service robots, and the like. Taking intelligent driving technology as an example, a lidar is mounted in an intelligent driving vehicle; by rapidly and repeatedly emitting laser beams, the lidar can scan the surrounding environment to acquire point cloud data reflecting the morphology, position, movement, and so on of one or more target objects in that environment.
Intelligent driving technology may refer to unmanned driving, automatic driving, assisted driving, and the like. Fig. 1 is a schematic structural diagram of a lidar in the related art. Referring to fig. 1, the lidar 10 may include: a light emitting device 101, a light receiving device 102, and a processor 103, where the light emitting device 101 and the light receiving device 102 are connected with the processor 103.
The connections between these devices may be electrical connections or optical fiber connections. More specifically, the light emitting device 101 and the light receiving device 102 may each include a plurality of optical components, and the connections between these optical components may also be free-space optical transmission links.
The processor 103 is used to control the light emitting device 101 and the light receiving device 102 so that both can operate normally. Illustratively, the processor 103 may provide driving voltages for the light emitting device 101 and the light receiving device 102, respectively, and may also provide control signals for them.
By way of example, the processor 103 may be a general-purpose processor such as a central processing unit (central processing unit, CPU), a network processor (network processor, NP), etc.; the processor 103 may also be a digital signal processor (digital signal processing, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field-programmable gate array (field-programmable gate array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like.
The light emitting device 101 further comprises a light source (not shown in fig. 1). It will be appreciated that the light source may refer to a laser and the number of lasers may be one or more. Alternatively, the laser may be embodied as a pulsed laser diode (pulsed laser diode, PLD), semiconductor laser, fiber laser, or the like. The light source is used for emitting a laser beam. In particular, the processor 103 may send an emission control signal to the light source, triggering the light source to emit a laser beam.
It will be appreciated that the laser beam may also be referred to as a laser pulse, laser, emitted beam, etc.
The following briefly describes the process by which the lidar detects the target object 104, in connection with the lidar structure shown in fig. 1.
Referring to fig. 1, the laser beam propagates in the emission direction; when it encounters the target object 104, it is reflected at the surface of the target object 104, and the reflected beam is received by the light receiving device 102 of the lidar. Here, the laser beam reflected back by the target object 104 may be referred to as an echo beam (the laser beam and the echo beam are indicated by solid lines in fig. 1).
After the light receiving device 102 receives the echo beam, it performs photoelectric conversion on the echo beam, that is, converts the echo beam into an electrical signal. The light receiving device 102 outputs the electrical signal corresponding to the echo beam to the processor 103, and the processor 103 may obtain, from this electrical signal, point cloud data reflecting the morphology, position, movement, and so on of the target object 104.
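For illustration only, the following minimal sketch shows how a single echo could be turned into a Cartesian point in the lidar's own coordinate system: the round-trip time of flight gives the range, and the beam's azimuth and elevation give the direction. The function name, arguments, and spherical convention are assumptions, not details taken from the embodiment.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def echo_to_point(t_emit_s, t_echo_s, azimuth_rad, elevation_rad):
    """Convert one echo into a Cartesian point in the lidar coordinate system.

    Half the round-trip time of flight gives the one-way range; azimuth and
    elevation give the beam direction. Names and conventions are illustrative
    assumptions.
    """
    r = SPEED_OF_LIGHT * (t_echo_s - t_emit_s) / 2.0
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

# Example: an echo returning 400 ns after emission at azimuth 10 deg, elevation 2 deg
print(echo_to_point(0.0, 400e-9, math.radians(10), math.radians(2)))
```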
In practice, autonomous vehicles are equipped with various types of sensors, such as lidar, millimeter-wave radar, cameras, and inertial measurement units (inertial measurement unit, IMU). During actual installation, the coordinate systems of these sensors (such as the vehicle-mounted lidar) do not coincide. To achieve accurate measurement, an appropriate method is therefore needed to calibrate the external parameters of the vehicle-mounted lidar so that the radar coordinate system of the lidar and the vehicle body coordinate system are unified; generally, the radar coordinate system is unified to the vehicle body coordinate system. How to calibrate the radar coordinate system of the vehicle-mounted lidar to the vehicle body coordinate system is a problem to be solved.
To solve the above problem, an embodiment of the present application provides a laser radar calibration system, and the calibration method described below may be applied to this system.
Fig. 2 is a schematic structural diagram of a calibration system of a lidar according to an embodiment of the present application, and referring to fig. 2, the calibration system 20 may include: n radars (N is a positive integer, when N is greater than 1, the N radars may be denoted as radar 1, radar 2, radar 3, …, radar N), the calibration device 21, and the scanning device 22. The N radars are vehicle-mounted radars which need to be calibrated; the scanning device 22 is used for assisting the calibration device in calibrating the N radars; the calibration device 21 is used for calibrating the N radars.
The N radars are vehicle-mounted radars, i.e., radars mounted on the vehicle body, for sensing the external environment of the vehicle body. One or more vehicle-mounted radars may be provided on one vehicle body.
The scanning device 22 may be one or more. When there are a plurality of scanning devices 22, the scanning devices 22 have been extrinsically calibrated with respect to one another, so the transformation matrices between their coordinate systems are known.
The scanning device 22 may be a high-precision three-dimensional laser scanner (such as a total station), or may be any device capable of generating high-precision three-dimensional point cloud information (such as various forms of lidar), which is not specifically limited in this embodiment of the present application.
In the embodiment of the present application, the vehicle-mounted radar refers to a vehicle-mounted lidar unless otherwise specified.
In practical application, the calibration device may be an independent device, may be integrated with the scanning device, and may also be integrated with the vehicle-mounted control device, which is not specifically limited in this embodiment of the present application.
The following describes a calibration method of the laser radar provided in the embodiment of the present application in combination with the calibration system.
In the embodiment of the present application, as shown in fig. 2, the calibration method of the laser radar is described by taking an example that the calibration system includes a vehicle-mounted radar (such as radar 1), a scanning device 22, and a calibration device 21.
Fig. 3 is a schematic flow chart of a first implementation of a method for calibrating a lidar according to an embodiment of the present application, and referring to fig. 3, the method may include:
s301: the calibration device obtains a first transformation matrix of the scanning device coordinate system to the radar coordinate system.
It should be understood that the scanning device may obtain the first transformation matrix from the scanning device coordinate system to the radar coordinate system through external parameter calibration in advance, and then the calibration device performs S301 to obtain the first transformation matrix. Alternatively, the calibration device may obtain the first transformation matrix through the point cloud a (i.e. the third point cloud) acquired by the scanning device and the point cloud B (i.e. the fourth point cloud) acquired by the laser radar. Of course, the calibration device may also obtain the first transformation matrix in other manners, which is not specifically limited in the embodiments of the present application.
In some possible embodiments, S301 may include: the calibration equipment obtains a point cloud A acquired by the scanning equipment and a point cloud B acquired by the laser radar; and the calibration equipment performs point cloud registration on the point cloud A and the point cloud B to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system. That is, the scanning device and the laser radar scan their own fields of view, respectively, and acquire a point cloud a and a point cloud B. And then, the calibration equipment performs point cloud registration on the point cloud A and the point cloud B to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
The field of view of the scanning device may include at least a part of the vehicle body and at least a part of the feature object, or may include only at least a part of the feature object; the field of view of the lidar includes at least a part of the feature object. There is an overlapping area between the field of view of the scanning device and that of the lidar, and at least a part of the feature object may be disposed in the overlapping area. Illustratively, part or all of the feature object may be disposed within the overlapping area, and part or all of the vehicle body may also be disposed within it.
Alternatively, the scanning device and the lidar may be placed in a scan scene before the point cloud is acquired. In this scanning scenario, there is an overlapping area of the scanning device and the field of view of the lidar.
In one embodiment, when setting up a scan scene, feature objects having typical point, line, and surface features may be selected, including but not limited to corners, edges, walls, etc.; the shape and relative orientation of the selected feature objects should remain constant over time. By moving the vehicle body, the feature object is made to appear within the field of view of the lidar. The scanning device is then adjusted so that part or all of the vehicle body and the feature object are within the field of view of the scanning device. At this point, there is an overlapping area between the field of view of the scanning device and that of the lidar.
For example, fig. 4 is a schematic diagram of a first scan scenario in an embodiment of the present application, referring to fig. 4, in a scan scenario 40, the scan scenario includes: the vehicle body 41, the scanning device 22, the feature object 42, and the radar 1 located on the vehicle body 41. The field of view of the scanning device 22 includes a portion of the vehicle body 41 and a feature object 42; the characteristic object 42 is included in the field of view of the radar 1 located on the vehicle body 41, and there is an overlapping area of the field of view of the scanning device 22 and the field of view of the radar 1.
Further, for the first scanning scene, the scanning device may perform high-precision mapping on the part of the vehicle body and the feature object in its own field of view to obtain a full scene point cloud M (i.e. point cloud A); the coordinate system of the scanning device is denoted as O1 (hereinafter simply referred to as the scanning device coordinate system). The laser radar scans the feature object in its own field of view to obtain a radar point cloud P (i.e. point cloud B); the coordinate system of the laser radar is denoted as O2 (hereinafter simply referred to as the radar coordinate system).
It should be noted that, when setting up a scan scene, selecting an object with obvious features is beneficial to improving the accuracy of the subsequent point cloud registration algorithm, especially the normal distribution transform (normal distributions transform, NDT) algorithm.
For example, fig. 5 is a schematic diagram of a second scan scenario in an embodiment of the present application, referring to fig. 5, in a scan scenario 50, the scan scenario includes: the vehicle body 41, the scanning device 22, the feature object 42, and the radar 1 located on the vehicle body 41. Only the characteristic object 42 is present in the field of view of the scanning device 22; only the characteristic object 42 appears in the field of view of the radar 1 located on the vehicle body 41, and there is an overlapping area of the field of view of the scanning device 22 and the field of view of the radar 1.
For the second scanning scene, the scanning device may perform high-precision mapping on the feature objects in the field of view of the scanning device to obtain a point cloud a (i.e., a point cloud under the coordinate system of the scanning device); the laser radar scans the characteristic object in the self view field to obtain a point cloud B (point cloud under a radar coordinate system).
In some possible embodiments, after the scanning device and the laser radar acquire the point cloud a and the point cloud B, the calibration device performs point cloud registration on the point cloud a and the point cloud B to obtain a first transformation matrix from the scanning device coordinate system to the radar coordinate system.
It should be appreciated that the calibration device may perform point cloud registration on the point cloud A and the point cloud B by using a point cloud registration algorithm to obtain the first transformation matrix. For example, the point cloud registration algorithm may include the NDT algorithm, the iterative closest point (iterative closest point, ICP) algorithm, and the like.
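As a minimal sketch of this registration step, the following example uses the ICP implementation in the Open3D library to align point cloud B with point cloud A. The library choice, the correspondence threshold, and the point-to-point variant are assumptions; the embodiment only requires some point cloud registration algorithm such as NDT or ICP.

```python
import numpy as np
import open3d as o3d

def register_clouds(cloud_a_xyz: np.ndarray, cloud_b_xyz: np.ndarray,
                    init_transform: np.ndarray = np.eye(4),
                    max_corr_dist: float = 0.2) -> np.ndarray:
    """Return a 4x4 transform aligning point cloud B to point cloud A via ICP.

    cloud_a_xyz and cloud_b_xyz are (N, 3) arrays; max_corr_dist (metres) and
    the identity initial guess are assumed values, not prescribed by the text.
    """
    pcd_a = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cloud_a_xyz))
    pcd_b = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cloud_b_xyz))
    result = o3d.pipelines.registration.registration_icp(
        pcd_b, pcd_a, max_corr_dist, init_transform,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return np.asarray(result.transformation)
```

Whether the returned matrix corresponds to the first transformation matrix directly or to its inverse depends on the direction convention adopted for "transformation matrix from the scanning device coordinate system to the radar coordinate system".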
In some possible embodiments, for the first scan scene, the step of obtaining the first transformation matrix in S301 may include: the calibration device estimates the initial pose of the laser radar relative to the scanning device according to the point cloud A and the point cloud B; and carrying out point cloud registration on the point cloud A and the point cloud B according to the initial pose to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
Illustratively, after obtaining the point cloud A and the point cloud B, the calibration device compares them to obtain an initial pose estimate of the radar coordinate system relative to the scanning device coordinate system, where the initial pose estimate includes initial estimates of six degrees of freedom of translation and rotation (e.g., x, y, z, pitch, yaw, and roll). Here, (x, y, z) represents the position coordinates of the origin of the radar coordinate system in the scanning device coordinate system: x represents the displacement of the origin of the radar coordinate system relative to the origin of the scanning device coordinate system in the x-axis direction, y the displacement in the y-axis direction, and z the displacement in the z-axis direction. (pitch, yaw, roll) represents the Euler angles of the radar coordinate system relative to the scanning device coordinate system: pitch is the pitch angle, yaw the yaw angle, and roll the roll angle of the radar coordinate system relative to the scanning device coordinate system. The calibration device uses this initial pose estimate as the initial external parameter of the point cloud registration algorithm. Further, the calibration device performs fine point cloud registration on the point cloud A and the point cloud B by using a point cloud registration algorithm, such as the ICP algorithm or the NDT algorithm, to obtain the pose of the radar coordinate system O2 relative to the scanning device coordinate system O1, that is, the first transformation matrix from the scanning device coordinate system to the radar coordinate system (which may be denoted as T12).
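A minimal sketch of turning that six-degree-of-freedom initial estimate into the 4x4 homogeneous matrix that a registration algorithm expects as its initial external parameter is given below; the "xyz" Euler order and the use of SciPy are assumptions, since the text only names the six quantities.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def initial_pose_to_matrix(x, y, z, pitch, yaw, roll):
    """Build a 4x4 homogeneous transform from (x, y, z, pitch, yaw, roll).

    Angles are in radians; rotation about the x-axis is taken as roll, about
    the y-axis as pitch, and about the z-axis as yaw (an assumed convention).
    """
    T = np.eye(4)
    T[:3, :3] = R.from_euler("xyz", [roll, pitch, yaw]).as_matrix()
    T[:3, 3] = [x, y, z]
    return T

# Example: radar origin 1.5 m from the scanner origin with a 5 degree yaw offset
T_init = initial_pose_to_matrix(0.0, 1.5, 0.0, 0.0, np.radians(5.0), 0.0)
```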
Of course, the calibration device may also obtain the first transformation matrix in other manners, which is not specifically limited in the embodiment of the present application.
S302, the calibration device obtains a point cloud C (namely a first point cloud) acquired by the scanning device and a point cloud D (namely a second point cloud) of the vehicle body.
The scanning device scans the vehicle body to obtain the point cloud C; the point cloud D is obtained by performing discrete sampling on a model of the vehicle body or by performing point cloud simulation on the model of the vehicle body.
It should be noted that, before S302 is performed, the calibration apparatus may change the scanning range of the scanning apparatus, or change the relative position between the scanning apparatus and the vehicle body, or move the vehicle body so that the vehicle body appears within the field of view of the scanning apparatus. At this time, at least a part of the vehicle body may be included in the field of view of the scanning device. In the following, description will be given taking an example in which only the vehicle body is entirely located within the field of view of the scanning apparatus.
It should be appreciated that after S301, a change to the scan scene described above is required, at which point the scanning device may adjust its scan angle such that the vehicle body is entirely within the field of view of the scanning device. Then, the scanning device scans the vehicle body to obtain a point cloud C (namely, the vehicle body point cloud under the coordinate system of the scanning device). And the calibration equipment performs discrete sampling on a vehicle body model (such as a vehicle body 3D model) or performs point cloud simulation on the vehicle body model to obtain a point cloud D (namely, a vehicle body point cloud under a vehicle body coordinate system). Alternatively, after S301, the above-described scan scene needs to be changed, at which time the vehicle body is moved into the field of view of the scanning apparatus so that the vehicle body is entirely present within the field of view of the scanning apparatus. Then, the scanning device scans the vehicle body to obtain a point cloud C.
Alternatively, the body model may be a 3D model of the body obtained from the vehicle manufacturer.
In some possible embodiments, the point cloud D may be obtained by performing discrete sampling, point cloud simulation, or a similar process on a model of the vehicle body. In this case, the point cloud D can be understood as a vehicle body point cloud in the vehicle body coordinate system. In the embodiment of the present application, the vehicle body coordinate system may be denoted as O3 (hereinafter simply referred to as the vehicle body coordinate system).
Illustratively, the origin of the vehicle body coordinate system is located at the center of the rear axle of the vehicle, the x-axis points to the right along the axle, the y-axis points forward along the vehicle body, and the z-axis follows the right-hand rule and points toward the sky.
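As an illustrative sketch of how the point cloud D could be generated from a vehicle body 3D model by discrete sampling, the snippet below uses Open3D's uniform surface sampling; the file name, format, and sample count are placeholders, and it is assumed that the mesh is already expressed in the vehicle body coordinate system O3.

```python
import open3d as o3d

# Load the vehicle body 3D model (placeholder file name) and sample its
# surface into a point cloud; since the mesh is assumed to be expressed in
# the vehicle body coordinate system O3, the samples form point cloud D.
mesh = o3d.io.read_triangle_mesh("vehicle_body_model.stl")
point_cloud_d = mesh.sample_points_uniformly(number_of_points=200_000)
o3d.io.write_point_cloud("body_point_cloud_D.pcd", point_cloud_d)
```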
And S303, the calibration equipment performs point cloud registration on the point cloud C and the point cloud D to obtain a second transformation matrix from the scanning equipment coordinate system to the vehicle body coordinate system.
It should be understood that, by using a point cloud registration algorithm, the calibration device performs point cloud registration on the point cloud C and the point cloud D to obtain the second transformation matrix. For example, the point cloud registration algorithm may include: NDT algorithm, ICP algorithm, etc.
In some possible embodiments, the calibration device registers, through a point cloud registration algorithm (e.g., NDT, ICP, etc.), the point cloud C in the scanning device coordinate system O1 with the point cloud D in the vehicle body coordinate system O3 to obtain the transformation matrix from the scanning device coordinate system to the vehicle body coordinate system, i.e., the second transformation matrix T13.
S304, the calibration equipment determines a third transformation matrix from the radar coordinate system to the vehicle body coordinate system according to the first transformation matrix and the second transformation matrix.
The third transformation matrix is used for calibrating the laser radar.
It will be appreciated that the third transformation matrix is obtained by calculating the first transformation matrix and the second transformation matrix.
In some possible embodiments, the step S304 includes: the calibration device obtains a third transformation matrix by multiplying the inverse of the first transformation matrix by the second transformation matrix.
Exemplarily, the calibration device calculates the third transformation matrix T23 from the first transformation matrix T12 and the second transformation matrix T13 according to formula (1):
T23 = T12^(-1) · T13    (1)
where T12^(-1) is the inverse matrix of T12.
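Formula (1) amounts to a single matrix inversion and multiplication; a minimal NumPy sketch is shown below (the function name is illustrative).

```python
import numpy as np

def compose_third_transform(T12: np.ndarray, T13: np.ndarray) -> np.ndarray:
    """Formula (1): T23 = T12^(-1) · T13, with all matrices 4x4 homogeneous."""
    return np.linalg.inv(T12) @ T13
```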
In some possible embodiments, after S304, the method may further include: the calibration device determines the displacement (Δx, Δy, Δz) and the rotation angle (Δpitch, Δroll, Δyaw) of the radar coordinate system relative to the vehicle body coordinate system based on the third transformation matrix.
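A minimal sketch of extracting the displacement and rotation angles from the third transformation matrix is given below; the "xyz" Euler convention is an assumption, since the embodiment only states that a displacement and a rotation angle are obtained.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def decompose_transform(T23: np.ndarray):
    """Split a 4x4 homogeneous transform into translation and Euler angles.

    Returns ((dx, dy, dz), (d_pitch, d_roll, d_yaw)) with angles in radians;
    the "xyz" Euler order is an assumed convention.
    """
    dx, dy, dz = T23[:3, 3]
    d_roll, d_pitch, d_yaw = R.from_matrix(T23[:3, :3]).as_euler("xyz")
    return (dx, dy, dz), (d_pitch, d_roll, d_yaw)
```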
Thus, the radar coordinate system is calibrated to the vehicle body coordinate system.
The method for calibrating the laser radar will be described with specific examples.
Assume that the scan scene is the first scan scene shown in fig. 4 described above.
Fig. 6 is a schematic flow chart of a second implementation of a method for calibrating a lidar according to an embodiment of the present application, and referring to fig. 6, the method may include:
s601, the calibration equipment obtains a full scene point cloud M collected by the scanning equipment and a radar point cloud P collected by the laser radar.
S602, the calibration device estimates the initial pose of the scanning device relative to the laser radar according to the full scene point cloud M and the radar point cloud P.
S603, the calibration device performs point cloud registration on the full scene point cloud M and the radar point cloud P according to the initial pose to obtain the first transformation matrix T12 from the scanning device coordinate system O1 to the radar coordinate system O2.
S604, the calibration device obtains the vehicle body point cloud in the scanning device coordinate system O1 and the vehicle body point cloud in the vehicle body coordinate system O3.
S605, the calibration device performs point cloud registration on the vehicle body point cloud in the scanning device coordinate system O1 and the vehicle body point cloud in the vehicle body coordinate system O3 to obtain the second transformation matrix T13 from the scanning device coordinate system O1 to the vehicle body coordinate system O3.
S606, the calibration device determines the third transformation matrix T23 from the radar coordinate system O2 to the vehicle body coordinate system O3 according to the first transformation matrix T12 and the second transformation matrix T13.
In the embodiment of the application, the first transformation matrix from the scanning device coordinate system to the radar coordinate system is obtained; the second transformation matrix from the scanning device coordinate system to the vehicle body coordinate system is obtained from the first point cloud and the second point cloud; and the third transformation matrix from the radar coordinate system to the vehicle body coordinate system is determined from the first transformation matrix and the second transformation matrix, so that the radar coordinate system is calibrated to the vehicle body coordinate system with high accuracy and in an operation that is easy to carry out. In other words, the external parameters of the laser radar are calibrated according to the first transformation matrix from the scanning device coordinate system to the radar coordinate system and the second transformation matrix from the scanning device coordinate system to the vehicle body coordinate system, so that the radar coordinate system is calibrated to the vehicle body coordinate system. Furthermore, the calibration from the radar coordinate system to the vehicle body coordinate system can be achieved through point cloud registration and transformation matrix calculation alone, which is easy to operate. In addition, the second transformation matrix is calculated from the actually measured vehicle body point cloud and the vehicle body point cloud corresponding to the vehicle body model, and therefore has high accuracy, so the accuracy of calibration using the third transformation matrix is also high.
Based on the same inventive concept, the embodiments of the present application further provide a calibration device of a laser radar, where the device may be a chip or a system on a chip in a calibration device, and may also be a functional module in the calibration device for use in the method described in one or more embodiments above. The calibration device may implement the functions performed by the calibration device according to one or more embodiments described above, where these functions may be implemented by hardware executing corresponding software. Such hardware or software includes one or more modules corresponding to the functions described above. Fig. 7 is a schematic structural diagram of a calibration device of a lidar according to an embodiment of the present application, and referring to fig. 7, the calibration device 700 may include: an obtaining module 701, configured to obtain a first transformation matrix from the scanning device coordinate system to the radar coordinate system; the obtaining module 701 is further configured to obtain a first point cloud acquired by the scanning device and a second point cloud of the vehicle body, where the first point cloud is obtained by the scanning device by scanning at least a portion of the vehicle body, and the second point cloud is obtained by performing discrete sampling on a model of the vehicle body or performing point cloud simulation on the model of the vehicle body; the point cloud registration module 702 is configured to perform point cloud registration on the first point cloud and the second point cloud to obtain a second transformation matrix from the scanning device coordinate system to the vehicle body coordinate system; a determining module 703, configured to determine a third transformation matrix from the radar coordinate system to the vehicle body coordinate system according to the first transformation matrix and the second transformation matrix, where the third transformation matrix is used to calibrate the laser radar to the vehicle body coordinate system.
In some possible embodiments, the obtaining module 701 is configured to obtain a third point cloud acquired by a scanning device and a fourth point cloud acquired by a lidar, where a field of view of the scanning device and a field of view of the lidar have an overlapping area, and at least a portion of a feature object is disposed in the overlapping area; the point cloud registration module 702 is configured to perform point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix from the scanning device coordinate system to the radar coordinate system.
In some possible implementations, the point cloud registration module 702 is configured to estimate an initial pose of the lidar relative to the scanning device according to the third point cloud and the fourth point cloud; and according to the initial pose, performing point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
In some possible implementations, the determining module 703 is configured to obtain the third transformation matrix by multiplying the inverse of the first transformation matrix by the second transformation matrix.
In some possible embodiments, the determining module 703 is configured to determine, after determining the third transformation matrix, a displacement and a rotation angle of the radar coordinate system relative to the vehicle body coordinate system according to the third transformation matrix.
It should be noted that, the specific implementation process of the obtaining module 701, the point cloud registration module 702, and the determining module 703 may refer to the detailed descriptions of the embodiments of fig. 1 to 6, and are not repeated herein for brevity of description.
The obtaining module 701, the point cloud registration module 702, and the determining module 703 mentioned in the embodiments of the present application may be one or more processors.
Based on the same inventive concept, embodiments of the present application provide a calibration device for a laser radar, which may be the calibration device described in one or more embodiments above. Fig. 8 is a schematic structural diagram of a laser radar calibration device according to an embodiment of the present application, and referring to fig. 8, a calibration device 800 may be implemented by using general-purpose computer hardware, including a processor 801 and a memory 802.
In the alternative, processor 801 and memory 802 may communicate via bus 803.
In some possible implementations, the at least one processor 801 may constitute any physical device having circuitry to perform logical operations on one or more inputs. For example, the at least one processor may include one or more integrated circuits (integrated circuit, IC), including application specific integrated circuits (application specific integrated circuit, ASIC), microchips, microcontrollers, microprocessors, all or part of a central processing unit (central processing unit, CPU), a graphics processing unit (graphics processing unit, GPU), a digital signal processor (digital signal processing, DSP), a field programmable gate array (field programmable gate array, FPGA), or other circuits suitable for executing instructions or performing logic operations.
The instructions executed by the at least one processor may, for example, be preloaded into a memory integrated with or embedded in the controller, or may be stored in a separate memory. The memory may include random access memory (random access memory, RAM), read-only memory (ROM), a hard disk, an optical disk, a magnetic medium, flash memory, other permanent, fixed, or volatile memory, or any other mechanism capable of storing instructions.
In some embodiments, the at least one processor may comprise more than one processor. Each processor may have a similar structure, or the processors may have different configurations, electrically connected to or disconnected from each other. For example, the processors may be separate circuits or integrated in a single circuit. When more than one processor is used, the processors may be configured to operate independently or cooperatively. The processors may be coupled electrically, magnetically, optically, acoustically, mechanically, or by other means that allow them to interact.
According to one embodiment of the present application, there is also provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, carry out the steps of the calibration method described above.
Memory 802 may include computer storage media in the form of volatile and/or nonvolatile memory, such as read-only memory and/or random access memory. Memory 802 may store an operating system, application programs, other program modules, executable code, program data, user data, and the like.
Further, the above-described memory 802 stores therein computer-executable instructions for implementing the functions of the acquisition module 701, the point cloud registration module 702, and the determination module 703 in fig. 7. The functions/implementation procedures of the obtaining module 701, the point cloud registration module 702 and the determining module 703 in fig. 7 may be implemented by the processor 801 in fig. 8 calling computer-executable instructions stored in the memory 802, and the specific implementation procedure and function refer to the above-mentioned related embodiments.
Based on the same inventive concept, an embodiment of the present application provides a calibration device of a laser radar, including: a memory storing computer executable instructions; and the processor is connected with the memory and is used for executing the computer-executable instructions and realizing the laser radar calibration method according to one or more embodiments.
Based on the same inventive concept, the embodiments of the present application provide a computer storage medium, where computer executable instructions are stored, and when the computer executable instructions are executed by a processor, the method for calibrating a lidar according to one or more embodiments of the present application can be implemented.
It will be understood by those skilled in the art that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation of the embodiments of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that the technical solutions described in the foregoing embodiments may be modified or some of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A method for calibrating a laser radar, the laser radar being arranged on a vehicle body, characterized by comprising the following steps:
obtaining a first transformation matrix from a scanning device coordinate system to a radar coordinate system;
the method comprises the steps of obtaining a first point cloud acquired by scanning equipment and a second point cloud of the vehicle body, wherein the first point cloud is obtained by scanning at least one part of the vehicle body by the scanning equipment, and the second point cloud is obtained by performing discrete sampling on a model of the vehicle body or performing point cloud simulation on the model of the vehicle body;
performing point cloud registration on the first point cloud and the second point cloud to obtain a second transformation matrix from the scanning equipment coordinate system to a vehicle body coordinate system;
determining a third transformation matrix from the radar coordinate system to the vehicle body coordinate system according to the first transformation matrix and the second transformation matrix, wherein the third transformation matrix is used for calibrating the laser radar to the vehicle body coordinate system;
wherein the obtaining a first transformation matrix of the scanning device coordinate system to the radar coordinate system includes: obtaining a third point cloud acquired by the scanning equipment and a fourth point cloud acquired by the laser radar, wherein an overlapping area exists between the field of view of the scanning equipment and the field of view of the laser radar, and at least one part of a characteristic object is arranged in the overlapping area; and carrying out point cloud registration on the third point cloud and the fourth point cloud to obtain the first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
2. The method of claim 1, wherein the performing point cloud registration on the third point cloud and the fourth point cloud to obtain a first transformation matrix of the scanning device coordinate system to the radar coordinate system comprises:
estimating the initial pose of the laser radar relative to the scanning device according to the third point cloud and the fourth point cloud;
and carrying out point cloud registration on the third point cloud and the fourth point cloud according to the initial pose to obtain the first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
3. The method of claim 1, wherein the determining a third transformation matrix of the radar coordinate system to the vehicle body coordinate system from the first transformation matrix and the second transformation matrix comprises:
and multiplying the inverse matrix of the first transformation matrix by the second transformation matrix to obtain the third transformation matrix.
4. The method according to claim 3, wherein after said determining a third transformation matrix from said radar coordinate system to said vehicle body coordinate system, the method further comprises:
and determining the displacement and the rotation angle of the radar coordinate system relative to the vehicle body coordinate system according to the third transformation matrix.
5. A calibration device for a laser radar, the laser radar being disposed on a vehicle body, the device comprising:
the acquisition module is used for acquiring a first transformation matrix from a scanning equipment coordinate system to a radar coordinate system;
the acquisition module is further configured to acquire a first point cloud acquired by the scanning device and a second point cloud of the vehicle body, where the first point cloud is obtained by the scanning device by scanning at least a part of the vehicle body, and the second point cloud is obtained by performing discrete sampling on a model of the vehicle body or performing point cloud simulation on the model of the vehicle body;
the point cloud registration module is used for carrying out point cloud registration on the first point cloud and the second point cloud to obtain a second transformation matrix from the scanning equipment coordinate system to the vehicle body coordinate system;
the determining module is used for determining a third transformation matrix from the radar coordinate system to the vehicle body coordinate system according to the first transformation matrix and the second transformation matrix, and the third transformation matrix is used for calibrating the laser radar to the vehicle body coordinate system;
the acquisition module is further used for acquiring a third point cloud acquired by the scanning equipment and a fourth point cloud acquired by the laser radar, wherein an overlapping area exists between the field of view of the scanning equipment and the field of view of the laser radar, and at least one part of a characteristic object is arranged in the overlapping area; the point cloud registration module is further configured to perform point cloud registration on the third point cloud and the fourth point cloud to obtain the first transformation matrix from the scanning device coordinate system to the radar coordinate system.
6. The apparatus of claim 5, wherein the point cloud registration module is configured to estimate an initial pose of the lidar relative to the scanning device based on the third point cloud and the fourth point cloud; and carrying out point cloud registration on the third point cloud and the fourth point cloud according to the initial pose to obtain the first transformation matrix from the scanning equipment coordinate system to the radar coordinate system.
7. The apparatus of claim 5, wherein the means for determining is configured to obtain the third transformation matrix by multiplying an inverse of the first transformation matrix by the second transformation matrix.
8. The apparatus of claim 7, wherein the means for determining is configured to determine a displacement and an angle of rotation of the radar coordinate system relative to the vehicle body coordinate system based on the third transformation matrix after determining the third transformation matrix.
9. A laser radar calibration device, comprising:
a memory storing computer executable instructions;
a processor, coupled to the memory, for implementing the method of any of claims 1 to 4 by executing the computer-executable instructions.
10. A computer storage medium having stored thereon computer executable instructions which, when executed by a processor, are capable of carrying out the method of any one of claims 1 to 4.
CN202210477041.4A 2022-05-02 2022-05-02 Laser radar calibration method, device, equipment and storage medium Active CN114935747B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210477041.4A CN114935747B (en) 2022-05-02 2022-05-02 Laser radar calibration method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210477041.4A CN114935747B (en) 2022-05-02 2022-05-02 Laser radar calibration method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114935747A CN114935747A (en) 2022-08-23
CN114935747B true CN114935747B (en) 2023-05-12

Family

ID=82865480

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210477041.4A Active CN114935747B (en) 2022-05-02 2022-05-02 Laser radar calibration method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114935747B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171708B (en) * 2018-01-24 2021-04-30 北京威远图易数字科技有限公司 Vehicle damage assessment method and system
KR102146451B1 (en) * 2018-08-17 2020-08-20 에스케이텔레콤 주식회사 Apparatus and method for acquiring conversion information of coordinate system
CN111208492B (en) * 2018-11-21 2022-04-19 长沙智能驾驶研究院有限公司 Vehicle-mounted laser radar external parameter calibration method and device, computer equipment and storage medium
CN112068108A (en) * 2020-08-11 2020-12-11 南京航空航天大学 Laser radar external parameter calibration method based on total station
CN112034431B (en) * 2020-09-25 2023-09-12 新石器慧通(北京)科技有限公司 External parameter calibration method and device for radar and RTK
CN112363130B (en) * 2020-11-30 2023-11-14 东风汽车有限公司 Vehicle-mounted sensor calibration method, storage medium and system
CN112462350B (en) * 2020-12-10 2023-04-04 苏州一径科技有限公司 Radar calibration method and device, electronic equipment and storage medium
CN113188569A (en) * 2021-04-07 2021-07-30 东软睿驰汽车技术(沈阳)有限公司 Vehicle and laser radar coordinate system calibration method, device and storage medium

Also Published As

Publication number Publication date
CN114935747A (en) 2022-08-23

Similar Documents

Publication Publication Date Title
EP3540464B1 (en) Ranging method based on laser radar system, device and readable storage medium
CN109975773B (en) Millimeter wave radar calibration method, device, equipment and storage medium
CN114152935B (en) Method, device and equipment for evaluating radar external parameter calibration precision
CN108445456A (en) Calibration of the light up to-radar relative pose
CN112513679B (en) Target identification method and device
JP2019510967A (en) Radar installation judgment using unstructured data
CN108489382B (en) AGV dynamic pose measuring method based on space multi-point constraint
WO2021016854A1 (en) Calibration method and device, movable platform, and storage medium
CN112099025B (en) Method, device, equipment and storage medium for positioning vehicle under bridge crane
CN112684432B (en) Laser radar calibration method, device, equipment and storage medium
CN111435163A (en) Ground point cloud data filtering method and device, detection system and storage medium
CN115144825A (en) External parameter calibration method and device for vehicle-mounted radar
WO2023083198A1 (en) Echo signal processing method and apparatus, device, and storage medium
CN111492258A (en) Method and device for determining the installation angle between the road on which a vehicle is driving and the measurement or detection direction of a radar sensor
CN114509744B (en) Method, device and equipment for evaluating range finding detection rate of laser radar
CN115436912B (en) Point cloud processing method and device and laser radar
CN114612598A (en) Point cloud processing method and device and laser radar
CN114935747B (en) Laser radar calibration method, device, equipment and storage medium
CN115485582A (en) Method and device for detecting halos in lidar measurements
CN114755666B (en) Point cloud expansion evaluation method, device and equipment
CN112781893A (en) Spatial synchronization method and device for vehicle-mounted sensor performance test data and storage medium
US20230108583A1 (en) Distance measurement device
CN113740876B (en) Three-dimensional laser radar light path adjusting method and device and electronic equipment
CN116529630A (en) Detection method, detection device, movable platform and storage medium
CN113256734A (en) Vehicle-mounted sensing sensor calibration method and system and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant