WO2022160879A1 - Method and apparatus for determining conversion parameters - Google Patents

Method and apparatus for determining conversion parameters

Info

Publication number
WO2022160879A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
vehicle
calibration body
equation
total station
Prior art date
Application number
PCT/CN2021/131608
Other languages
English (en)
French (fr)
Inventor
胡烜
石现领
黄志臻
龚稼学
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022160879A1

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00

Definitions

  • the present application relates to the technical field of automatic driving, and in particular, to a method and device for determining conversion parameters.
  • Laser radar (LiDAR) is an important sensor in the field of autonomous driving. It perceives the surrounding environment, detects objects in that environment, and presents the detected objects on a display interface as three-dimensional coordinates, for example as point clouds. The three-dimensional coordinates or point cloud of a detected object are usually expressed in the lidar (LiDAR) coordinate system established by the lidar.
  • Referring to FIG. 1, in an autonomous driving solution, the position and size of an object in the vehicle coordinate system need to be obtained. The position and size of the object are outlined by the point cloud coordinates in the LiDAR coordinate system, and the three-dimensional point cloud coordinates in the LiDAR coordinate system are then converted into the vehicle coordinate system, which facilitates planning the vehicle's driving route based on the position and size of the measured object during path planning.
  • In this process, the point cloud coordinates of the calibrated "object" in the LiDAR coordinate system need to be converted into point cloud coordinates in the vehicle coordinate system; that is, obtaining the conversion parameters between the LiDAR coordinate system and the vehicle coordinate system is a key step in the calibration process. This process is also known as the external parameter calibration process, or "external parameter calibration" for short.
  • At present, external parameter calibration methods for the LiDAR coordinate system are generally based on feature point information collected by on-board sensors: the coordinates of the feature points describing an object's contour are expressed in both the LiDAR coordinate system and the vehicle coordinate system, and the transformation parameters from the LiDAR coordinate system to the vehicle coordinate system are then obtained from the two sets of feature point coordinates under the least-squares criterion. In this method, because the accuracy with which the laser scans the object and the feature points are calibrated in the LiDAR coordinate system is limited, and jitter occurs during ranging, the accuracy of the feature point coordinates in the LiDAR coordinate system only reaches the order of 5 cm. In this case, when feature points of 5 cm accuracy are used for external parameter calibration of the coordinate system conversion, the calculated angles contain errors.
  • For example, FIG. 2 shows the point cloud coordinates of a rectangular plane of a measured object in the LiDAR coordinate system. It can be seen that, due to the limited resolution of the LiDAR coordinate system and the jitter of the point cloud coordinates, the extracted coordinates of the four corner points of the rectangular plane are inaccurate, so the calculated angle only reaches an accuracy on the order of one degree and can hardly meet precision requirements of a smaller order such as 0.1°. Therefore, the current external parameter calibration method cannot meet the high-precision requirements of autonomous driving solutions.
  • the present application provides a method and device for determining conversion parameters, which are used to improve the accuracy of external parameter calibration of a lidar coordinate system, thereby meeting the high-precision requirements of automatic driving. Specifically, the application discloses the following technical solutions:
  • In a first aspect, the present application provides a method for determining conversion parameters, which can be applied to the field of autonomous driving or intelligent driving. The method includes: acquiring a point cloud coordinate set of a plane calibration body represented in a lidar coordinate system, the point cloud coordinate set including the point cloud coordinates formed by feature points of the plane calibration body; segmenting the point cloud coordinate set and performing plane fitting to obtain first equation coefficients, the first equation coefficients being the plane equation coefficients of the plane calibration body in the lidar coordinate system; obtaining second equation coefficients, the second equation coefficients being the equation coefficients of the plane calibration body in the vehicle coordinate system; and obtaining conversion parameters between the lidar coordinate system and the vehicle coordinate system according to the first equation coefficients and the second equation coefficients, where the conversion parameters are used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
  • In this aspect, the obtained point cloud coordinate set is processed by point cloud segmentation and plane fitting. Based on statistical averaging, the equation coefficients of the calibration body in the LiDAR coordinate system are obtained, and these equation coefficients are used to calibrate the position of the calibration body in the vehicle coordinate system. After the segmented point cloud is processed by plane fitting, the accuracy of the external parameter calibration of the LiDAR coordinate system is improved by an order of magnitude, and the influence of ranging jitter is reduced at the same time.
  • In addition, by calibrating at least one feature point of the plane calibration body in both the LiDAR coordinate system and the vehicle coordinate system, the conversion parameters from the LiDAR coordinate system to the vehicle coordinate system, that is, the external parameters of the LiDAR, can be obtained, and these external parameters are used to convert the lidar point cloud coordinates into point cloud coordinates in the vehicle coordinate system. This method avoids using a "four-wheel alignment system" as an auxiliary means of obtaining the coordinates of the feature points in the vehicle coordinate system, reduces the cost for 4S shops or auto repair shops of calibrating objects outside the vehicle, and also improves the accuracy of calibration in the vehicle coordinate system.
  • In a possible implementation, obtaining the second equation coefficients includes: obtaining third equation coefficients and the conversion parameters between the total station coordinate system and the vehicle coordinate system, where the third equation coefficients are the equation coefficients of the plane calibration body in the total station coordinate system; and converting the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system and the vehicle coordinate system.
  • In another possible implementation, obtaining the third equation coefficients includes: obtaining the equation coefficients of the plane calibration body in the calibration body coordinate system; and converting the equation coefficients of the plane calibration body in the calibration body coordinate system into equation coefficients in the total station coordinate system according to the conversion parameters between the total station coordinate system and the calibration body coordinate system, to obtain the third equation coefficients.
  • In yet another possible implementation, acquiring the conversion parameters between the total station coordinate system and the vehicle coordinate system includes: acquiring n feature points on the vehicle and the coordinates of each feature point in both the vehicle coordinate system and the total station coordinate system, where n ≥ 1 and n is a positive integer; and obtaining the conversion parameters between the total station coordinate system and the vehicle coordinate system from the coordinates of each of the n feature points in the vehicle coordinate system and in the total station coordinate system.
  • The n feature points include: a1, a1u, a2, a2u, a3, a3u, a4, a4u, where a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
  • In a second aspect, the present application provides a conversion parameter determination apparatus, including: an acquisition unit configured to acquire a point cloud coordinate set of a plane calibration body represented in a lidar coordinate system, the point cloud coordinate set including point cloud coordinates formed by feature points of the plane calibration body; and a processing unit configured to segment the point cloud coordinate set and perform plane fitting to obtain first equation coefficients, the first equation coefficients being the equation coefficients of the plane calibration body in the lidar coordinate system. The acquisition unit is further configured to obtain second equation coefficients, the second equation coefficients being the equation coefficients of the plane calibration body in the vehicle coordinate system; the processing unit is further configured to obtain conversion parameters between the lidar coordinate system and the vehicle coordinate system according to the first equation coefficients and the second equation coefficients, where the conversion parameters are used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
  • In a possible implementation, the processing unit is further configured to obtain third equation coefficients and the conversion parameters between the total station coordinate system and the vehicle coordinate system, where the third equation coefficients are the equation coefficients of the plane calibration body in the total station coordinate system; the acquisition unit is further configured to convert the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system and the vehicle coordinate system.
  • In another possible implementation, the acquisition unit is further configured to obtain the equation coefficients of the plane calibration body in the calibration body coordinate system and, according to the conversion parameters between the total station coordinate system and the calibration body coordinate system, convert the equation coefficients of the plane calibration body in the calibration body coordinate system into equation coefficients in the total station coordinate system, to obtain the third equation coefficients.
  • In yet another possible implementation, the acquisition unit is further configured to acquire n feature points on the vehicle and the coordinates of each feature point in the vehicle coordinate system and in the total station coordinate system, where n ≥ 1 and n is a positive integer, and to obtain the conversion parameters between the total station coordinate system and the vehicle coordinate system from the coordinates of each of the n feature points in the vehicle coordinate system and in the total station coordinate system.
  • The n feature points include but are not limited to: a1, a1u, a2, a2u, a3, a3u, a4, a4u, where a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
  • In a third aspect, the present application further provides an apparatus for determining conversion parameters, the apparatus comprising at least one processor and an interface circuit, wherein the interface circuit is configured to provide instructions and/or data for the at least one processor;
  • the at least one processor is configured to execute the instructions to implement the aforementioned first aspect and the methods in various implementations of the first aspect.
  • the terminal further includes a memory, and the memory is used for storing the instruction and/or data.
  • the at least one processor and the interface circuit may be integrated into one processing chip or chip circuit.
  • the conversion parameter determination device is a terminal, and the terminal includes but is not limited to a vehicle, a lidar, a robot, or a PC.
  • In a fourth aspect, the present application also provides a computer-readable storage medium in which instructions are stored, so that when the instructions are run on a computer or a processor, they can be used to execute the methods in the foregoing first aspect and in the various implementations of the first aspect.
  • the present application also provides a computer program product, the computer program product includes computer instructions, when the instructions are executed by a computer or a processor, the aforementioned first aspect and the methods in various implementation manners of the first aspect can be implemented.
  • In a fifth aspect, the present application further provides a terminal, where the terminal includes the apparatus in the foregoing second aspect and its various implementations, or includes the apparatus in the foregoing third aspect, and is configured to implement the methods in the foregoing first aspect and in its various implementations.
  • the terminal includes but is not limited to a vehicle, a robot, a lidar or a PC.
  • It should be noted that the beneficial effects corresponding to the technical solutions of the various implementations of the second aspect through the fifth aspect are the same as the beneficial effects of the foregoing first aspect and its various implementations; for details, refer to the descriptions of the beneficial effects in the first aspect and its various implementations, which are not repeated here.
  • FIG. 1 is a schematic diagram of a LiDAR coordinate system and a vehicle coordinate system provided by the application;
  • FIG. 2 is a schematic diagram of point cloud coordinates of a rectangular plane represented in a LiDAR coordinate system provided by the application;
  • FIG. 3 is a schematic structural diagram of a positioning system according to an embodiment of the present application.
  • FIG. 4 is a flowchart of a method for determining conversion parameters provided by an embodiment of the present application;
  • FIG. 5 is a flowchart of a method for obtaining second equation coefficients provided by an embodiment of the present application;
  • FIG. 6 is a schematic diagram of feature points on a vehicle provided by an embodiment of the present application.
  • FIG. 7A is a schematic diagram of an angular quantity estimation error of an external parameter provided by an embodiment of the present application.
  • FIG. 7B is a schematic diagram of an estimation error of a position quantity of an external parameter according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an external parameter calibration result in a laboratory environment provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an apparatus for determining conversion parameters provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • the technical solution of the present application can be applied to application scenarios in the field of automatic driving, and specifically relates to a method for offline calibration of external parameters from a lidar coordinate system to a vehicle coordinate system.
  • the method is applicable to a positioning system or a parameter calibration system comprising: a first device, a vehicle, a laser or laser scanner, a calibration body and a total station.
  • other devices or instruments may also be included, which is not limited in this embodiment.
  • The first device may be a terminal, such as a smartphone, a smart-screen TV, a notebook computer, a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a foldable terminal, a vehicle, a robot, a drone, a lidar, a wearable device with wireless communication capability (such as a smart watch or wristband), a user device or user equipment (UE), or an augmented reality (AR) or virtual reality (VR) device, etc.
  • the embodiments of the present application do not limit the specific device form of the terminal.
  • The above-mentioned terminals include, but are not limited to, terminals running Apple (iOS), Android or Microsoft systems, and the like.
  • Referring to FIG. 3, it is a schematic structural diagram of a positioning system provided in this embodiment.
  • the system includes: a vehicle, a plane calibration body 1 to a plane calibration body 5, a reference calibration body, a total station 1 and a total station 2, and a first device.
  • The vehicle includes an in-vehicle processor and various sensors, such as a gyroscope sensor, an acceleration sensor, a rotating shaft sensor, and the like.
  • the terminal is not shown.
  • the terminal may be set in the vehicle, such as a PC installed on the vehicle.
  • the vehicle is also equipped with a laser or laser scanner, which is used to emit laser light outward, scan the external environment, and establish a lidar coordinate system according to the scanned external environment, referred to as "LiDAR coordinate system".
  • the laser or laser scanner can be mounted somewhere on the roof.
  • the calibration and positioning process of the external environment and internal components of the vehicle in this embodiment can be represented by the following coordinate systems, for example, including: vehicle coordinate system, LiDAR coordinate system, total station coordinate system and calibration body coordinate system.
  • Specifically, with reference to FIG. 3, the symbols and functions of each coordinate system are introduced as follows:
  • A: denotes the vehicle coordinate system, which can be established by the in-vehicle processor and is used to represent the positional relationship between the internal components of the vehicle and external objects. Optionally, the coordinate origin of the vehicle coordinate system A is the center of the rear axle of the vehicle, and the three coordinate axes are denoted X A , Y A and Z A , where the X A axis points to the right along the vehicle's rear axle and the Z A axis points vertically upward from the vehicle chassis.
  • B: denotes the lidar (LiDAR) coordinate system, which can be established by the laser or laser scanner and is used to measure the vehicle's external environment. The laser or laser scanner emits a light pulse that hits an object and is reflected back, and is eventually received by a receiver; the receiver accurately measures the travel time of the light pulse from emission to reception, and the measured distance is then obtained from the speed of light and the travel time. Finally, combining the height of the laser and the laser scanning angle, the coordinates of each object in the measurement area can be accurately calculated; the measurement accuracy can reach the centimeter level, and the measurement range can reach 300 m on both sides of the road. In the LiDAR coordinate system B, the coordinate origin can be the center of the laser or laser scanner, and the three coordinate axes are denoted X B , Y B and Z B , respectively.
  • C: denotes the total station coordinate system, which is established by the total station; its three coordinate axes can be denoted X C , Y C and Z C , respectively.
  • A total station, also known as an electronic total station, is a high-precision measuring instrument integrating optics, mechanics and electronics; it is a surveying and mapping instrument system that integrates measurement functions such as horizontal angle, vertical angle, distance (slope distance and horizontal distance) and height difference. Total stations are widely used in precision engineering measurement and deformation monitoring, such as large above-ground buildings and underground tunnel construction. It is called a total station because, once the instrument is set up, all measurement work at that station can be completed.
  • In this embodiment, the total station located at the door on the right side of the vehicle is called "total station 1"; the coordinate system established by total station 1 is the total station coordinate system C1, and the corresponding three coordinate axes are X C1 , Y C1 and Z C1 . Similarly, the coordinate system established by total station 2 is the total station coordinate system C2, and the corresponding three coordinate axes are X C2 , Y C2 and Z C2 .
  • D: denotes the calibration body coordinate system, which is used to calibrate at least one calibration body, such as a two-dimensional plane calibration body or a three-dimensional reference calibration body. Optionally, the coordinate origin of the calibration body coordinate system D can be set at the center of a certain plane of the reference calibration body, and the three coordinate axes are denoted X D , Y D and Z D , respectively.
  • the two-dimensional plane calibration body can also be called “calibration plane", which is used as the target for external parameter calibration.
  • the two-dimensional plane calibration body includes, but is not limited to, a metal flat plate, a diffuse reflection plate, and the like.
  • the three-dimensional reference calibration body is used to establish the calibration body coordinate system D.
  • the reference calibration body is a cuboid.
  • At least one feature point of plane calibration body 1 is represented as "FeaD1_D" in the calibration body coordinate system D, at least one feature point of plane calibration body 2 is represented as "FeaD2_D" in the calibration body coordinate system D, at least one feature point of plane calibration body 3 is represented as "FeaD3_D" in the calibration body coordinate system D, at least one feature point of plane calibration body 4 is represented as "FeaD4_D" in the calibration body coordinate system D, and at least one feature point of plane calibration body 5 is represented as "FeaD5_D" in the calibration body coordinate system D.
  • In addition, the reference calibration body includes a plurality of two-dimensional planes, and at least one feature point on one of these two-dimensional planes is represented as "FeaD_D" in the calibration body coordinate system D.
  • the vehicle coordinate system A and the lidar coordinate system B are both right-handed coordinate systems.
  • this embodiment provides a method for determining conversion parameters, see FIG. 4 , the method includes:
  • the laser or laser scanner establishes a LiDAR coordinate system B, and at least one feature point of the calibration body to be scanned by the laser or laser scanner is represented by the LiDAR coordinate system B.
  • The calibration bodies scanned in this embodiment include plane calibration bodies 1 to 5; the position of each plane calibration body is fixed, each plane calibration body contains at least one feature point, and these feature points can be represented by a three-dimensional point cloud. For example, at least one feature point of plane calibration body 1 is represented in the LiDAR coordinate system B as point cloud coordinate set 1, at least one feature point of plane calibration body 2 is represented in the LiDAR coordinate system B as point cloud coordinate set 2, and so on.
  • the laser or laser scanner sends the calibrated point cloud coordinate sets of all the calibration objects to the first device, where the first device may be a conversion parameter determination device for executing the conversion parameter determination method provided in this embodiment , specifically, the method includes the following steps:
  • Step 101: Acquire a point cloud coordinate set of a plane calibration body represented in the lidar coordinate system. The point cloud coordinate set includes point cloud coordinates formed by the feature points of the plane calibration body, such as one or more of the above-mentioned point cloud coordinate sets 1 to 5.
  • Step 102: Segment the point cloud coordinate set and perform plane fitting to obtain first equation coefficients, where the first equation coefficients are the equation coefficients of the plane calibration body in the lidar coordinate system.
  • the first device divides the point cloud coordinate set, also referred to as "point cloud segmentation".
  • Point cloud segmentation is to divide the point cloud according to the characteristics of space, geometry and texture, so that the point clouds in the same division have similar characteristics.
  • A possible point cloud segmentation method is to extract feature boundaries based on point cloud slices, use point cloud density as a threshold to separate the point clouds of different continuous feature curves, and then perform plane fitting on the segmented data of the different point sets.
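  • For illustration only, one simple region-growing segmentation of this kind can be sketched in Python as follows; the neighborhood radius and the minimum cluster size are assumed values and are not parameters disclosed in this embodiment:

        import numpy as np
        from scipy.spatial import cKDTree

        def euclidean_segments(points: np.ndarray, radius: float = 0.1, min_points: int = 30):
            """Split an (N, 3) point cloud into clusters by region growing:
            points closer than `radius` belong to the same segment; small
            segments are treated as noise and discarded."""
            tree = cKDTree(points)
            unvisited = set(range(len(points)))
            segments = []
            while unvisited:
                seed = unvisited.pop()
                queue, cluster = [seed], [seed]
                while queue:
                    idx = queue.pop()
                    for nb in tree.query_ball_point(points[idx], r=radius):
                        if nb in unvisited:
                            unvisited.remove(nb)
                            queue.append(nb)
                            cluster.append(nb)
                if len(cluster) >= min_points:
                    segments.append(points[cluster])
            return segments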
  • During plane fitting, a robust plane fitting method for point cloud data can be used. This method is based on the eigenvalue method: by using certain criteria to remove outliers from the point cloud data, a robust estimate of the plane parameters can be obtained.
  • the least squares method, the eigenvalue method and the robust eigenvalue method can be used to perform plane fitting processing on the point cloud data, so as to obtain reliable plane parameter equation coefficients, that is, to obtain the first equation coefficients.
  • This embodiment does not limit the specific processes of point cloud segmentation and plane fitting processing.
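  • For illustration only, a minimal Python sketch of the eigenvalue-based plane fitting and a simple robust variant is given below; the function names, the outlier threshold k and the iteration count are assumptions rather than part of the disclosed scheme. When applied to a segment expressed in the LiDAR coordinate system, the returned vector [a, b, c, d] plays the role of the first equation coefficients referred to above.

        import numpy as np

        def fit_plane_eigen(points: np.ndarray):
            """Fit a plane a*x + b*y + c*z + d = 0 to an (N, 3) array of points
            using the eigenvalue method: the plane normal is the eigenvector of
            the covariance matrix with the smallest eigenvalue."""
            centroid = points.mean(axis=0)
            cov = np.cov((points - centroid).T)
            eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
            normal = eigvecs[:, 0]                      # direction of least variance
            d = -normal @ centroid
            return np.append(normal, d)                 # [a, b, c, d]

        def fit_plane_robust(points: np.ndarray, n_iter: int = 3, k: float = 2.0):
            """Robust variant: iteratively discard points whose distance to the
            fitted plane exceeds k times the standard deviation of the residuals."""
            inliers = points
            coeffs = fit_plane_eigen(inliers)
            for _ in range(n_iter):
                a, b, c, d = coeffs
                dist = np.abs(inliers @ np.array([a, b, c]) + d)
                keep = dist < k * dist.std()
                if keep.sum() < 3 or keep.all():
                    break
                inliers = inliers[keep]
                coeffs = fit_plane_eigen(inliers)
            return coeffs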
  • Step 103: Obtain second equation coefficients, where the second equation coefficients are the equation coefficients of the plane calibration body in the vehicle coordinate system A. Referring to FIG. 5, a method for obtaining the second equation coefficients includes: obtaining third equation coefficients, where the third equation coefficients are the equation coefficients of the plane calibration body in the total station coordinate system C.
  • the transformation parameter between the total station coordinate system C and the vehicle coordinate system A is also referred to as a "transformation matrix”.
  • An implementation of obtaining the third equation coefficients includes: the first device obtains the equation coefficients of the plane calibration body represented in the calibration body coordinate system D; specifically, these are the equation coefficients, in the calibration body coordinate system D, of the plane on which at least three feature points of the plane calibration body are located. Then, according to the conversion parameters between the total station coordinate system C and the calibration body coordinate system D, the equation coefficients of the plane calibration body in the calibration body coordinate system D are converted into equation coefficients in the total station coordinate system C, thereby obtaining the equation coefficients of the plane calibration body in the total station coordinate system C, that is, the third equation coefficients.
  • Optionally, when the first device obtains the equation coefficients of the plane calibration body in the calibration body coordinate system D, it needs to obtain the conversion parameters between the total station coordinate system C and the calibration body coordinate system D in advance, and then convert the equation coefficients of the plane calibration body in the total station coordinate system C into the equation coefficients expressed in the calibration body coordinate system D, that is, the equation coefficients of the plane calibration body in the calibration body coordinate system D.
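  • The conversion of plane equation coefficients between two coordinate systems used in these steps (for example, calibration body coordinate system D to total station coordinate system C, or C to vehicle coordinate system A) follows the standard rigid-transform identity: if a point transforms as x_target = R·x_source + t, then a plane n·x + d = 0 in the source frame has coefficients (R·n, d - (R·n)·t) in the target frame. A minimal sketch with illustrative function names (not taken from the patent) is:

        import numpy as np

        def transform_plane(coeffs_src, R, t):
            """Transform plane coefficients [a, b, c, d] (a*x + b*y + c*z + d = 0)
            from a source frame to a target frame, given the rigid transform
            x_target = R @ x_source + t."""
            n_src, d_src = np.asarray(coeffs_src[:3]), coeffs_src[3]
            n_tgt = R @ n_src
            d_tgt = d_src - n_tgt @ np.asarray(t)
            return np.append(n_tgt, d_tgt)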
  • the coordinates of each feature point on the plane calibration body can be represented by the calibration body coordinate system D.
  • For example, the feature points of plane calibration body 1 are represented in the calibration body coordinate system D as "FeaD1_D", as shown in FIG. 3.
  • In addition, a specific implementation of obtaining the equation coefficients of the plane calibration bodies in the total station coordinate system C includes the following. Taking total station 1 as an example, total station 1 establishes the total station coordinate system C1 and scans all the feature points on plane calibration bodies 1 to 5; all the scanned feature points on plane calibration bodies 1 to 5 are then calibrated in the total station coordinate system C1, and the equation coefficients of plane calibration bodies 1 to 5 in the total station coordinate system C1 are obtained based on a plane fitting method, where the plane fitting method includes but is not limited to the least squares method, the eigenvalue method, the robust eigenvalue method, and the like. Finally, total station 1 reports the obtained equation coefficients of plane calibration bodies 1 to 5 in the total station coordinate system C1 to the first device. Correspondingly, the first device receives the equation coefficients, in the total station coordinate system C1, of plane calibration bodies 1 to 5 sent by total station 1.
  • Step 103-1: Obtain the conversion parameters between the total station coordinate system C and the vehicle coordinate system A. A possible implementation is to obtain the conversion parameters by using at least one feature point on the vehicle.
  • Specifically, n feature points on the vehicle are acquired, where n ≥ 1 and n is a positive integer. The n feature points include: a1, a1u, a2, a2u, a3, a3u, a4, a4u, where a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
  • The n feature points include but are not limited to the above examples; other points on the vehicle may also be selected as feature points.
  • At least one feature point a1, . . . , a4u of the above vehicle is calibrated in the vehicle coordinate system A and the total station coordinate system C, respectively.
  • the total station 1 calibrates the coordinates of the feature points a1, a1u, a3, and a3u on the right side of the vehicle in the total station coordinate system C1, and reports these coordinate values to the first device.
  • the total station 2 calibrates the coordinates of the feature points a2, a2u, a4, and a4u on the left side of the vehicle in the total station coordinate system C2, and reports these coordinate values to the first device.
  • Correspondingly, the first device receives the coordinates, in the total station coordinate system C, of all the feature points a1, ..., a4u reported by total station 1 and total station 2. In addition, the first device also obtains the coordinates of all the feature points a1, ..., a4u in the vehicle coordinate system A, and then obtains the conversion parameters between the total station coordinate system C and the vehicle coordinate system A from the coordinates of these feature points in the vehicle coordinate system A and in the total station coordinate system C.
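  • One standard way to recover such a rigid transform from the n matched feature points (here, the vehicle feature points measured in the total station coordinate system C and known in the vehicle coordinate system A) is the SVD-based least-squares alignment. The patent does not prescribe a particular solver, so the sketch below is only one possible implementation and assumes at least three non-collinear points:

        import numpy as np

        def rigid_transform_from_points(src: np.ndarray, dst: np.ndarray):
            """Least-squares rigid transform (R, t) with dst ≈ R @ src + t,
            where src and dst are (n, 3) arrays of matched points,
            n >= 3 and not all collinear."""
            src_c = src - src.mean(axis=0)
            dst_c = dst - dst.mean(axis=0)
            H = src_c.T @ dst_c
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:          # enforce a proper rotation (det = +1)
                Vt[-1, :] *= -1
                R = Vt.T @ U.T
            t = dst.mean(axis=0) - R @ src.mean(axis=0)
            return R, t

        # Feeding the feature point coordinates measured in frame C as src and
        # their coordinates in frame A as dst yields the C -> A conversion
        # parameters; the same routine also applies to the total station /
        # calibration body case described below.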
  • Optionally, for the conversion parameters between the total station coordinate system C and the calibration body coordinate system D, the conversion parameters of these two coordinate systems can be obtained through at least one feature point of the "reference calibration body".
  • Specifically, total station 1 observes the reference calibration body and obtains the coordinates, in the total station coordinate system C1, of each plane of the reference calibration body, and total station 1 sends the coordinate set of the feature points of the reference calibration body to the first device; correspondingly, the first device receives the coordinate set sent by total station 1.
  • the first device also obtains the coordinates of the feature points of the reference calibration body in the calibration body coordinate system D.
  • the coordinates of the feature points in the reference calibration body in the calibration body coordinate system D are known quantities, which can be calibrated and obtained in advance.
  • the first device obtains the conversion parameters between the total station coordinate system C1 and the calibration body coordinate system D according to the coordinates of the feature points of the reference calibration body in the total station coordinate system C1 and the calibration body coordinate system D respectively.
  • Similarly, the conversion parameters between the total station coordinate system C2 and the calibration body coordinate system D can be obtained in the same way; the process is not repeated here.
  • the total station in this embodiment can also be replaced by other instruments or devices that can measure the three-dimensional coordinates of feature points, such as a three-dimensional coordinate measuring instrument, which is not limited in this embodiment.
  • Step 103-2: Convert the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system C and the vehicle coordinate system A.
  • the second equation coefficient is the equation coefficient of the plane calibration body in the vehicle coordinate system A.
  • Step 104: Obtain the conversion parameters between the lidar coordinate system B and the vehicle coordinate system A according to the first equation coefficients and the second equation coefficients. The conversion parameters between the lidar coordinate system B and the vehicle coordinate system A are also called the "LiDAR external parameters".
  • Optionally, the external parameter calculation algorithm of the LiDAR is as follows. In the plane equation of a calibration plane in the vehicle coordinate system A, M, N, O and P represent the plane equation coefficients, that is, the aforementioned second equation coefficients, and X A , Y A and Z A represent the three coordinate axes of the vehicle coordinate system A. In the plane equation of the same calibration plane in the LiDAR coordinate system B, M', N', O' and P' represent the plane equation coefficients, namely the aforementioned first equation coefficients, and X B , Y B and Z B represent the three coordinate axes of the LiDAR coordinate system B. R is the rotation matrix and T is the translation matrix; R is a 3x3 matrix and T is a 3x1 matrix.
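  • The example matrices themselves appear only in the patent figures and are not reproduced in this text; in a generic parameterization (an assumption about the exact form used), they can be written as

        R = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha) =
            \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix},
        \qquad
        T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix},

    so that a point x_B in the LiDAR coordinate system B maps to x_A = R x_B + T in the vehicle coordinate system A; the angles α, β and γ correspond to the Arfa, Beta and Gama quantities evaluated in FIG. 7A.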
  • In the minimization, arg min denotes the values of the variables at which the expression being minimized attains its minimum value; k denotes the k-th plane calibration body, with k ∈ [0, 5] in this embodiment; and R*, T* denote the conversion parameters between the LiDAR coordinate system B and the vehicle coordinate system A, that is, the external parameters of the LiDAR.
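  • The minimized expression itself is given in the patent figures. Under the definitions above, a consistent reconstruction (an assumption, not a verbatim copy of those figures) is that each calibration plane k satisfies M_k X_A + N_k Y_A + O_k Z_A + P_k = 0 in the vehicle coordinate system A and M'_k X_B + N'_k Y_B + O'_k Z_B + P'_k = 0 in the LiDAR coordinate system B, so that with x_A = R x_B + T the external parameters can be obtained as

        (R^{*}, T^{*}) = \arg\min_{R,\,T} \sum_{k}
            \Big( \big\| R^{\mathsf T} [M_k, N_k, O_k]^{\mathsf T} - [M'_k, N'_k, O'_k]^{\mathsf T} \big\|^{2}
            + \big( [M_k, N_k, O_k]\, T + P_k - P'_k \big)^{2} \Big).

    A minimal Python sketch of one possible closed-form solver for this problem (assuming unit, consistently oriented plane normals and at least three calibration planes with linearly independent normals; not taken from the patent) is:

        import numpy as np

        def lidar_extrinsics_from_planes(planes_A, planes_B):
            """Estimate (R, T) with x_A = R @ x_B + T from matched plane
            coefficients: planes_A is (k, 4) of [M, N, O, P] in frame A,
            planes_B is (k, 4) of [M', N', O', P'] in frame B."""
            planes_A = np.asarray(planes_A, dtype=float)
            planes_B = np.asarray(planes_B, dtype=float)
            nA, nB = planes_A[:, :3], planes_B[:, :3]
            # Rotation: least-squares alignment of the normal sets (nA ≈ R @ nB).
            H = nB.T @ nA
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:
                Vt[-1, :] *= -1
                R = Vt.T @ U.T
            # Translation: each plane pair gives nA_k @ T = P'_k - P_k.
            rhs = planes_B[:, 3] - planes_A[:, 3]
            T, *_ = np.linalg.lstsq(nA, rhs, rcond=None)
            return R, T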
  • In this way, in this embodiment, the obtained point cloud coordinate set is processed by point cloud segmentation and plane fitting, and, based on statistical averaging, the equation coefficients of the calibration body in the LiDAR coordinate system are obtained; these equation coefficients are used to calibrate the position of the calibration body in the vehicle coordinate system. By calibrating at least one feature point of the plane calibration body in both the LiDAR coordinate system and the vehicle coordinate system, the conversion parameters from the LiDAR coordinate system to the vehicle coordinate system, that is, the external parameters of the LiDAR, can be obtained, and these external parameters are used to convert the lidar point cloud coordinates into point cloud coordinates in the vehicle coordinate system. This method avoids using a "four-wheel alignment system" as an auxiliary means of obtaining the coordinates of the feature points in the vehicle coordinate system, reduces the cost for 4S shops or auto repair shops of calibrating objects outside the vehicle, and also improves the accuracy of calibration in the vehicle coordinate system.
  • the above-mentioned external parameters of LiDAR are used for simulation to obtain the angle estimation error of the external parameters, as shown in FIG. 7A , and the position estimation error of the external parameters, as shown in FIG. 7B .
  • In FIG. 7A, the abscissa represents the standard deviation of the ranging jitter, denoted Err_B, in meters (m); the ordinate represents the angle estimation error of the external parameters, denoted Err_Ang, in degrees (°). FIG. 7A contains three angle estimation errors, namely Arfa (α), Beta (β) and Gama (γ). It can be seen that the angle estimation error of the external parameters is less than 0.1°; for example, when the abscissa value of Line1 is 0.02 m, the angle estimation errors Err_Arfa, Err_Beta and Err_Gama are all below 0.1°.
  • In FIG. 7B, the abscissa represents the standard deviation of the ranging jitter, denoted Err_B, in meters (m); the ordinate represents the position estimation error, denoted Err_T, in meters (m). FIG. 7B includes three position estimation errors, namely Err_X, Err_Y and Err_Z; when the abscissa value of Line3 is 0.02 m, the position estimation errors Err_X, Err_Y and Err_Z (see the ordinates corresponding to Line4) are all less than 0.025 m.
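  • For completeness, the plotted error metrics can be computed from an estimated and a reference set of external parameters as sketched below; the intrinsic z-y-x Euler convention is an assumption, since the patent does not state which convention defines α, β and γ:

        import numpy as np
        from scipy.spatial.transform import Rotation

        def extrinsic_errors(R_est, T_est, R_ref, T_ref):
            """Angle errors (Err_Arfa, Err_Beta, Err_Gama, in degrees) and
            position errors (Err_X, Err_Y, Err_Z, in metres) between an
            estimated and a reference set of extrinsic parameters."""
            eul_est = Rotation.from_matrix(R_est).as_euler('ZYX', degrees=True)
            eul_ref = Rotation.from_matrix(R_ref).as_euler('ZYX', degrees=True)
            diff = np.abs((eul_est - eul_ref + 180.0) % 360.0 - 180.0)
            err_gama, err_beta, err_arfa = diff        # 'ZYX' returns (z, y, x)
            err_x, err_y, err_z = np.abs(np.asarray(T_est) - np.asarray(T_ref)).ravel()
            return (err_arfa, err_beta, err_gama), (err_x, err_y, err_z)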
  • In addition, the external parameters of the LiDAR obtained in the above step 104 are also used to measure actual data, and two sets of measurement results of the actual data are obtained. Referring to FIG. 8, these are the external parameter calibration results in a laboratory environment, namely Experimental Data 1 and Experimental Data 2. Among them, the estimated value of Gama obtained from Experimental Data 1 is 124.92577875158723°, which is approximately 124.93°.
  • the method for determining conversion parameters provided in this embodiment can also be applied to the calibration of external parameters from other three-dimensional coordinate sensors to vehicles, and the basic idea and operation process can completely adopt the technical solutions in the above method embodiments.
  • FIG. 9 is a schematic structural diagram of an apparatus for determining a conversion parameter according to an embodiment of the present application.
  • the apparatus may be the first apparatus in the foregoing embodiment, and the apparatus may implement the method for determining the conversion parameter in the foregoing embodiment.
  • the apparatus may include: an acquisition unit 901 and a processing unit 902 .
  • the apparatus may also include other units or modules such as a storage module, a sending module, and the like.
  • The acquisition unit 901 is configured to acquire a point cloud coordinate set of a plane calibration body represented in the lidar coordinate system, where the point cloud coordinate set includes point cloud coordinates formed by at least three feature points of the plane calibration body; the processing unit 902 is configured to perform segmentation and plane fitting on the point cloud coordinate set to obtain first equation coefficients, where the first equation coefficients are the equation coefficients of the plane calibration body in the lidar coordinate system.
  • The acquisition unit 901 is further configured to obtain second equation coefficients, where the second equation coefficients are the equation coefficients of the plane calibration body in the vehicle coordinate system; the processing unit 902 is further configured to obtain the conversion parameters between the lidar coordinate system and the vehicle coordinate system according to the first equation coefficients and the second equation coefficients, where the conversion parameters are used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
  • In a possible implementation, the processing unit 902 is further configured to obtain third equation coefficients and the conversion parameters between the total station coordinate system and the vehicle coordinate system, where the third equation coefficients are the equation coefficients of the plane calibration body in the total station coordinate system; the acquisition unit 901 is further configured to convert the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system and the vehicle coordinate system.
  • In another possible implementation, the acquisition unit 901 is further configured to obtain the equation coefficients of the plane calibration body in the calibration body coordinate system and, according to the conversion parameters between the total station coordinate system and the calibration body coordinate system, convert the equation coefficients of the plane calibration body in the calibration body coordinate system into equation coefficients in the total station coordinate system, to obtain the third equation coefficients.
  • In yet another possible implementation, the acquisition unit 901 is further configured to acquire n feature points on the vehicle and the coordinates of each feature point in the vehicle coordinate system and in the total station coordinate system, where n ≥ 1 and n is a positive integer, and to obtain the conversion parameters between the total station coordinate system and the vehicle coordinate system from the coordinates of each of the n feature points in the vehicle coordinate system and in the total station coordinate system.
  • The n feature points include: a1, a1u, a2, a2u, a3, a3u, a4, a4u, where a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
  • Of course, other feature points may also be selected as the feature points on the vehicle, which is not limited in this embodiment.
  • the conversion parameter determination device may be a control module including an arithmetic function, such as a Microcontroller Unit (MCU), integrated in the terminal.
  • FIG. 10 shows a schematic structural diagram of another apparatus for determining conversion parameters.
  • the apparatus may include: at least one processor 11 and an interface circuit 12 , and, optionally, a memory 13 .
  • At least one processor 11 is the control center of the conversion parameter determination device, and can be used to complete the conversion parameter determination method in the foregoing embodiments.
  • At least one processor 11 may be composed of an integrated circuit (Integrated Circuit, IC), for example, may be composed of a single packaged IC, or may be composed of a plurality of packaged ICs connected with the same function or different functions.
  • the processor may include a central processing unit (central processing unit, CPU) or a digital signal processor (digital signal processor, DSP) or the like.
  • the above-mentioned processor may also include a hardware chip, and the hardware chip may be a logic circuit, an application specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the above-mentioned PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general-purpose array logic (generic array logic, GAL) or any combination thereof.
  • the memory 13 is used for storing and exchanging various types of instructions, data and/or software, and the memory 13 may contain multiple storage media, at least one of which may be used to store parameters of each coordinate system, plane equation coefficients, and the like. Furthermore, at least one storage medium can also be used to store computer programs or codes.
  • The above-mentioned memory 13 may include volatile memory, such as random access memory (RAM); it may also include non-volatile memory, such as flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); or the memory may include a combination of the above types of memory.
  • the memory 13 may be integrated into at least one processor 11 as a storage medium, or may be configured outside the processor, which is not limited in this embodiment.
  • the interface circuit 12 includes, but is not limited to, a transceiver interface and/or a transceiver, the transceiver interface is used for the at least one processor to transmit and receive data and/or information, and the transceiver is used for the conversion parameter determination device to communicate with other devices or Network communication, such as Ethernet, wireless access network, WLAN, etc.
  • An embodiment of the present application further provides a terminal, where the terminal may include any conversion parameter determination apparatus in the foregoing embodiments. Further, the terminal may be a vehicle, a robot, a lidar or a PC.
  • the terminal may further include at least one of a mobile communication module, a wireless communication module and the like.
  • the mobile communication module can provide wireless communication solutions including 2G/3G/4G/5G.
  • the mobile communication module may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • at least part of the functional modules of the mobile communication module may be provided in at least one processor 11 .
  • the wireless communication module can provide wireless local area networks (WLAN) (such as WiFi network), Bluetooth (BT), global navigation satellite system (Global Navigation Satellite System, GNSS) and other wireless communication solutions .
  • the above-mentioned terminal may also include other more or less components, such as a display screen, a speaker, a camera, a sensor, etc., and the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the terminal.
  • the components shown in FIG. 10 can be implemented in hardware, software, firmware or any combination thereof.
  • the functions of the acquisition unit 901 and the processing unit 902 in the conversion parameter determination device shown in FIG. 9 may be implemented by at least one processor 11, or by at least one processor 11 and the memory 13, and the function of the storage unit may be Realized by the memory 13 .
  • Specifically, the at least one processor 11 is configured to: obtain a point cloud coordinate set of a plane calibration body represented in the lidar coordinate system; perform segmentation and plane fitting on the point cloud coordinate set to obtain first equation coefficients; obtain second equation coefficients; obtain the conversion parameters between the lidar coordinate system and the vehicle coordinate system according to the first equation coefficients and the second equation coefficients; and, based on the conversion parameters, convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
  • The device provided in this embodiment offers improvements in both calibration accuracy and cost. On the one hand, by matching pairs of plane equation coefficients and relying on statistical averaging, the accuracy of the external parameter calibration of the LiDAR coordinate system is improved by an order of magnitude, and the errors caused by the angular resolution and ranging jitter of the LiDAR are avoided. On the other hand, the plane equation coefficients in the vehicle coordinate system are obtained with only a limited increase in calibration time, the four-wheel alignment station required by existing external parameter calibration schemes is no longer needed, and the external parameter calibration cost is reduced.
  • this embodiment also provides an external parameter calibration system.
  • the structure of the system can be the same as that shown in FIG. 3.
  • The system includes: a conversion parameter determination device, a vehicle, a laser or laser scanner, at least one of plane calibration bodies 1 to 5, and a total station.
  • The laser or laser scanner is used to scan plane calibration bodies 1 to 5, represent the point clouds obtained from the scanned plane calibration bodies 1 to 5 in the LiDAR coordinate system, and send the point cloud coordinates of plane calibration bodies 1 to 5 represented in the LiDAR coordinate system to the conversion parameter determination device.
  • The conversion parameter determination device receives the point cloud coordinates of plane calibration bodies 1 to 5 sent by the laser or laser scanner, executes steps 101 to 105 shown in the aforementioned FIG. 4 to obtain the conversion parameters between the lidar coordinate system and the vehicle coordinate system, and, based on the conversion parameters, converts the point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
  • The total station is used to establish the total station coordinate system C, calibrate all the feature points on plane calibration bodies 1 to 5 in the total station coordinate system C, obtain the equation coefficients of plane calibration bodies 1 to 5 in the total station coordinate system C, and report the obtained equation coefficients to the conversion parameter determination device, so that the conversion parameter determination device obtains the equation coefficients, in the total station coordinate system C, of at least one feature point of plane calibration bodies 1 to 5.
  • The total station is also used to represent the coordinates of at least one feature point on the vehicle in the total station coordinate system C and send the coordinates of these vehicle feature points to the conversion parameter determination device, so that the conversion parameter determination device obtains the conversion parameters between the total station coordinate system C and the vehicle coordinate system A according to the coordinates of the vehicle feature points in the total station coordinate system C.
  • Embodiments of the present application also provide a computer program product, where the computer program product includes one or more computer program instructions. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer program instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one network node, computer, server or data center to another node by wire or wirelessly.

Abstract

A method and apparatus for determining conversion parameters, applied to the field of autonomous driving or intelligent driving. The method includes: acquiring a point cloud coordinate set of a plane calibration body represented in a lidar coordinate system (101), the point cloud coordinate set including point cloud coordinates formed by feature points of the plane calibration body; segmenting the point cloud coordinate set and performing plane fitting to obtain first equation coefficients (102), and obtaining second equation coefficients (103); and obtaining conversion parameters between the lidar coordinate system and a vehicle coordinate system according to the first equation coefficients and the second equation coefficients (104), the conversion parameters being used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system. This reduces the influence of lidar ranging jitter and low angular resolution on the accuracy of lidar external parameter calibration.

Description

Method and apparatus for determining conversion parameters
This application claims priority to the Chinese patent application No. 202110102230.9, filed with the China National Intellectual Property Administration on January 26, 2021 and entitled "Method and apparatus for determining conversion parameters", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the technical field of autonomous driving, and in particular to a method and apparatus for determining conversion parameters.
Background
Laser radar (LiDAR) is an important sensor in the field of autonomous driving. It perceives the surrounding environment, detects objects in that environment, and presents the detected objects on a display interface as three-dimensional coordinates, for example as point clouds, where the three-dimensional coordinates or point cloud of a detected object are usually expressed in the lidar (LiDAR) coordinate system established by the LiDAR.
Referring to FIG. 1, in an autonomous driving solution, the position and size of an object in the vehicle coordinate system need to be obtained. The position and size of the object are outlined by the point cloud coordinates in the LiDAR coordinate system, and the three-dimensional point cloud coordinates in the LiDAR coordinate system are then converted into the vehicle coordinate system, which facilitates planning the vehicle's driving route based on the position and size of the measured object during path planning. In this process, the point cloud coordinates of the calibrated "object" in the LiDAR coordinate system need to be converted into point cloud coordinates in the vehicle coordinate system; that is, obtaining the conversion parameters between the LiDAR coordinate system and the vehicle coordinate system is a key step in the calibration process. This process is also known as the external parameter calibration process, or "external parameter calibration" for short.
At present, external parameter calibration methods for the LiDAR coordinate system are generally based on feature point information collected by on-board sensors: the coordinates of the feature points describing an object's contour are expressed in both the LiDAR coordinate system and the vehicle coordinate system, and the transformation parameters from the LiDAR coordinate system to the vehicle coordinate system are then obtained from the two sets of feature point coordinates under the least-squares criterion. In this method, because the accuracy with which the laser scans the object and calibrates its feature points in the LiDAR coordinate system is limited, and jitter occurs during ranging, the accuracy of the feature point coordinates in the LiDAR coordinate system only reaches the order of 5 cm. In this case, when a LiDAR coordinate system of 5 cm accuracy is used for external parameter calibration of the coordinate system conversion, the calculated angles contain errors. For example, FIG. 2 shows the point cloud coordinates of a rectangular plane of a measured object in the LiDAR coordinate system; it can be seen that, due to the limited resolution of the LiDAR coordinate system and the jitter of the point cloud coordinates, the extracted coordinates of the four corner points of the rectangular plane are inaccurate, so the calculated angle only reaches an accuracy on the order of one degree and can hardly meet precision requirements of a smaller order such as 0.1°. Therefore, the current external parameter calibration method cannot meet the high-precision requirements of autonomous driving solutions.
Summary
This application provides a method and apparatus for determining conversion parameters, which are used to improve the accuracy of external parameter calibration of a lidar coordinate system and thereby meet the high-precision requirements of autonomous driving. Specifically, this application discloses the following technical solutions:
In a first aspect, this application provides a method for determining conversion parameters, which can be applied to the field of autonomous driving or intelligent driving. The method includes: acquiring a point cloud coordinate set of a plane calibration body represented in a lidar coordinate system, the point cloud coordinate set including the point cloud coordinates formed by feature points of the plane calibration body; segmenting the point cloud coordinate set and performing plane fitting to obtain first equation coefficients, the first equation coefficients being the equation coefficients of the plane calibration body in the lidar coordinate system; obtaining second equation coefficients, the second equation coefficients being the equation coefficients of the plane calibration body in the vehicle coordinate system; and obtaining conversion parameters between the lidar coordinate system and the vehicle coordinate system according to the first equation coefficients and the second equation coefficients, where the conversion parameters are used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
In this aspect, the obtained point cloud coordinate set is processed by point cloud segmentation and plane fitting, and, based on statistical averaging, the equation coefficients of the calibration body in the LiDAR coordinate system are obtained; these equation coefficients are used to calibrate the position of the calibration body in the vehicle coordinate system. After the segmented point cloud is processed by plane fitting, the accuracy of the external parameter calibration of the LiDAR coordinate system is improved by an order of magnitude, and the influence of ranging jitter is reduced at the same time.
In addition, by calibrating at least one feature point of the plane calibration body in both the LiDAR coordinate system and the vehicle coordinate system, the conversion parameters from the LiDAR coordinate system to the vehicle coordinate system, that is, the external parameters of the LiDAR, can be obtained, and these external parameters are used to convert the lidar point cloud coordinates into point cloud coordinates in the vehicle coordinate system. This method avoids using a "four-wheel alignment system" as an auxiliary means of obtaining the coordinates of the feature points in the vehicle coordinate system, reduces the cost for 4S shops or auto repair shops of calibrating objects outside the vehicle, and also improves the accuracy of calibration in the vehicle coordinate system.
With reference to the first aspect, in a possible implementation of the first aspect, obtaining the second equation coefficients includes: obtaining third equation coefficients and the conversion parameters between the total station coordinate system and the vehicle coordinate system, where the third equation coefficients are the equation coefficients of the plane calibration body in the total station coordinate system; and converting the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system and the vehicle coordinate system.
With reference to the first aspect, in another possible implementation of the first aspect, obtaining the third equation coefficients includes: obtaining the equation coefficients of the plane calibration body in the calibration body coordinate system; and converting the equation coefficients of the plane calibration body in the calibration body coordinate system into equation coefficients in the total station coordinate system according to the conversion parameters between the total station coordinate system and the calibration body coordinate system, to obtain the third equation coefficients.
With reference to the first aspect, in yet another possible implementation of the first aspect, acquiring the conversion parameters between the total station coordinate system and the vehicle coordinate system includes: acquiring n feature points on the vehicle and the coordinates of each feature point in the vehicle coordinate system and in the total station coordinate system, where n ≥ 1 and n is a positive integer; and obtaining the conversion parameters between the total station coordinate system and the vehicle coordinate system from the coordinates of each of the n feature points in the vehicle coordinate system and in the total station coordinate system.
The n feature points include: a1, a1u, a2, a2u, a3, a3u, a4, a4u, where a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
第二方面,本申请提供了一种转换参数确定装置,该装置包括:获取单元,用于获取利用激光雷达坐标系表示的平面定标体的点云坐标集,所述点云坐标集中包括所述平面定标体的特征点形成的点云坐标;处理单元,用于对所述点云坐标集进行分割和平面拟合处理得到第一方程系数,所述第一方程系数为所述平面定标体在所述激光雷达坐标系中的方程系数;所述获取单元,还用于获取第二方程系数,所述第二方程系数为所述平面定标体在车辆坐标系的方程系数;所述处理单元,还用于根据所述第一方程系数和所述第二方程 系数得到所述激光雷达坐标系与所述车辆坐标系的转换参数,所述转换参数用于将激光雷达坐标系中的点云坐标转换成所述车辆坐标系中的点云坐标。
结合第二方面,在第二方面的一种可能的实现方式中,所述处理单元,还用于获取第三方程系数,和,全站仪坐标系与所述车辆坐标系之间的转换参数,其中所述第三方程系数为所述平面定标体在全站仪坐标系中的方程系数;所述获取单元,还用于根据所述全站仪坐标系与所述车辆坐标系之间的转换参数,将所述第三方程系数转换成所述第二方程系数。
结合第二方面,在第二方面的另一种可能的实现方式中,所述获取单元,还用于获得所述平面定标体在定标体坐标系中的方程系数;根据所述全站仪坐标系与所述定标体坐标系之间的转换参数,将所述平面定标体在定标体坐标系中的方程系数转换为在所述全站仪坐标系中的方程系数,得到所述第三方程系数。
结合第二方面,在第二方面的又一种可能的实现方式中,所述获取单元,还用于获取车辆上的n个特征点,以及每个所述特征点在所述车辆坐标系和所述全站仪坐标系中的坐标,n≥1且为正整数,根据所述n个特征点中的每个特征点在所述车辆坐标系和所述全站仪坐标系中的坐标,得到所述全站仪坐标系与所述车辆坐标系之间的转换参数。
其中,所述n个特征点包括但不限于:a1、a1u、a2、a2u、a3、a3u、a4、a4u;进一步地,a1为车辆右侧后轮的中心,a1u为车辆右侧后轮轮毂的最高点,a2为车辆左侧后轮的中心,a2u为车辆左侧后轮轮毂的最高点,a3为车辆右侧前轮的中心,a3u为车辆右侧前轮轮毂的最高点,a4为车辆左侧前轮的中心,a4u为车辆左侧前轮轮毂的最高点。
第三方面,本申请还提供了一种转换参数确定装置,该装置包括至少一个处理器和接口电路,其中,所述接口电路,用于为所述至少一个处理器提供指令和/或数据;所述至少一个处理器,用于执行所述指令,以实现前述第一方面及第一方面各种实现方式中的方法。
此外,所述终端中还包括存储器,所述存储器用于存储所述指令,和/或,数据。
可选的,所述至少一个处理器和所述接口电路可以集成在一个处理芯片或者芯片电路中。
可选的,所述转换参数确定装置为一种终端,所述终端包括但不限于车辆、激光雷达、机器人或PC。
第四方面,本申请还提供了一种计算机可读存储介质,该存储介质中存储有指令,使得当指令在计算机或处理器上运行时,可以用于执行前述第一方面以及第一方面各种实现方式中的方法。
另外,本申请还提供了一种计算机程序产品,该计算机程序产品包括计算机指令,当该指令被计算机或处理器执行时,可实现前述第一方面以及第一方面各种实现方式中的方法。
第五方面,本申请还提供了一种终端,所述终端包括前述第二方面以及第二方面各种实现方式中的装置,或者,包括前述第三方面中的装置,用于实现前述第一方面以及第一方面各种实现方式中的方法。
可选的,所述终端包括但不限于车辆、机器人、激光雷达或PC。
需要说明的是,上述第二方面至第五方面的各种实现方式的技术方案所对应的有益效果与前述第一方面以及第一方面的各种实现方式的有益效果相同,具体参见上述第一方面 以及第一方面的各种实现方式中的有益效果描述,不再赘述。
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a LiDAR coordinate system and a vehicle coordinate system provided by this application;
FIG. 2 is a schematic diagram of the point cloud coordinates of a right-angled plane expressed in the LiDAR coordinate system provided by this application;
FIG. 3 is a schematic structural diagram of a positioning system provided by an embodiment of this application;
FIG. 4 is a flowchart of a method for determining conversion parameters provided by an embodiment of this application;
FIG. 5 is a flowchart of a method for acquiring second equation coefficients provided by an embodiment of this application;
FIG. 6 is a schematic diagram of feature points on a vehicle provided by an embodiment of this application;
FIG. 7A is a schematic diagram of the angular estimation error of the extrinsic parameters provided by an embodiment of this application;
FIG. 7B is a schematic diagram of the position estimation error of the extrinsic parameters provided by an embodiment of this application;
FIG. 8 is a schematic diagram of extrinsic calibration results in a laboratory environment provided by an embodiment of this application;
FIG. 9 is a schematic structural diagram of a conversion parameter determining apparatus provided by an embodiment of this application;
FIG. 10 is a schematic structural diagram of a terminal provided by an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described in detail below with reference to the accompanying drawings. Before the technical solutions are described, the application scenarios of the embodiments are first described with reference to the drawings.
The technical solution of this application can be applied to application scenarios in the field of autonomous driving, and specifically relates to a method for offline calibration of the extrinsic parameters from a lidar coordinate system to a vehicle coordinate system. The method can be applied to a positioning system or parameter calibration system that includes: a first apparatus, a vehicle, a laser or laser scanner, a calibration body and a total station. The system may further include other devices or instruments, which is not limited in this embodiment.
The first apparatus may be a terminal, such as a smartphone, a smart-screen TV, a laptop, a tablet, a personal computer (PC), a personal digital assistant (PDA), a foldable terminal, a vehicle, a robot, an unmanned aerial vehicle, a lidar, a wearable device with wireless communication capability (for example a smart watch or band), a user device or user equipment (UE), or an augmented reality (AR) or virtual reality (VR) device. The embodiments of this application do not limit the specific form of the terminal. The above terminals include, but are not limited to, terminals running Apple (iOS), Android, Microsoft or other operating systems.
FIG. 3 is a schematic structural diagram of a positioning system provided by this embodiment. The system includes: a vehicle, planar calibration bodies 1 to 5, a reference calibration body, total station 1 and total station 2, and the first apparatus. The vehicle contains an on-board processor and various sensors, such as a gyroscope sensor, an acceleration sensor and a shaft rotation sensor. The terminal is not shown in FIG. 3. Optionally, the terminal may be arranged in the vehicle, for example a PC installed on the vehicle.
In addition, a laser or laser scanner is arranged on the vehicle to emit laser light outward, scan the external environment, and establish a lidar coordinate system, referred to as the "LiDAR coordinate system", from the scanned environment. Optionally, the laser or laser scanner may be mounted at a position on the vehicle roof.
To facilitate the description of the technical solution of this embodiment, the functions of the above components and the coordinate system established by each component are first described.
In this embodiment, the calibration and positioning of the vehicle's external environment and internal components can be expressed in the following coordinate systems: the vehicle coordinate system, the LiDAR coordinate system, the total station coordinate system and the calibration body coordinate system. Specifically, with reference to FIG. 3, the notation and function of each coordinate system are introduced as follows.
A: the vehicle coordinate system, which may be established by the on-board processor and characterizes the positional relationships of the vehicle's internal components and external objects. Optionally, the origin of the vehicle coordinate system A is the center of the vehicle's rear axle, and the three axes are denoted X_A, Y_A, Z_A, where the X_A axis points to the right along the rear axle and the Z_A axis points upward, perpendicular to the chassis.
B: the lidar (LiDAR) coordinate system, which may be established by the laser or laser scanner and is used to measure the environment outside the vehicle. The laser or laser scanner emits a light pulse that strikes an object and is reflected back to a receiver; the receiver accurately measures the propagation time of the pulse from emission to return, and the measured distance is then obtained from the speed of light and the propagation time. Finally, combining the height of the laser and the scanning angle, the coordinates of every object in the measured area can be computed accurately; the measurement accuracy can reach the centimeter level, and the measurement range can reach 300 m on both sides of the road. In the LiDAR coordinate system B, the origin may be the center of the laser or laser scanner, and the three axes are denoted X_B, Y_B, Z_B.
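The ranging principle described above is the standard time-of-flight relation, distance = c·t/2, after which a range/angle return is converted into Cartesian LiDAR coordinates. The following is a minimal sketch of that computation only; the function names and example numbers are illustrative and are not part of the disclosed embodiments.

```python
# Minimal time-of-flight range sketch: range = c * t / 2, then convert a
# (range, azimuth, elevation) return into Cartesian LiDAR coordinates (X_B, Y_B, Z_B).
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range(round_trip_time_s: float) -> float:
    """Distance to the target from the measured round-trip time of the pulse."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def polar_to_lidar_xyz(r: float, azimuth_rad: float, elevation_rad: float):
    """Convert a range/angle measurement into (X_B, Y_B, Z_B)."""
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return x, y, z

if __name__ == "__main__":
    r = tof_range(2.0e-7)  # a 200 ns round trip corresponds to roughly 30 m
    print(r, polar_to_lidar_xyz(r, math.radians(10.0), math.radians(2.0)))
```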
C: the total station coordinate system, established by a total station; its three axes may be denoted X_C, Y_C, Z_C.
A total station, also called an electronic total station, is a high-technology surveying instrument integrating optical, mechanical and electronic components; it combines the measurement of horizontal angles, vertical angles, distances (slope and horizontal) and height differences into a single surveying system. Total stations are widely used in precision engineering surveying and deformation monitoring, such as the construction of large above-ground buildings and underground tunnels. Because all measurement work at a station can be completed with a single setup of the instrument, it is called a total station.
This embodiment uses two total stations, one on each door side of the vehicle. The total station at the right door is called "total station 1"; the coordinate system it establishes is the total station coordinate system C1, with axes X_C1, Y_C1, Z_C1. Similarly, the total station at the left door is called "total station 2"; the coordinate system it establishes is the total station coordinate system C2, with axes X_C2, Y_C2, Z_C2.
D: the calibration body coordinate system, established with feature points on the reference calibration body as the datum. It is used to calibrate at least one calibration body, such as a two-dimensional planar calibration body or the three-dimensional reference calibration body. The origin of the calibration body coordinate system D may be set at the center of a face of the reference calibration body, and its three axes are denoted X_D, Y_D, Z_D.
A two-dimensional planar calibration body, also called a "calibration plane", serves as the target for extrinsic calibration. Optionally, it includes, but is not limited to, a metal plate or a diffuse reflection plate. The three-dimensional reference calibration body is used to establish the calibration body coordinate system D and is generally a cuboid.
In this embodiment, at least one feature point of planar calibration body 1 expressed in the calibration body coordinate system D is denoted "FeaD1_D"; similarly, the feature points of planar calibration bodies 2 to 5 expressed in the calibration body coordinate system D are denoted "FeaD2_D", "FeaD3_D", "FeaD4_D" and "FeaD5_D", respectively.
The reference calibration body contains several two-dimensional faces; at least one feature point on one of these faces, expressed in the calibration body coordinate system D, is denoted "FeaD_D".
Among the coordinate systems A to D above, the vehicle coordinate system A and the lidar coordinate system B are both right-handed coordinate systems.
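All of the coordinate system conversions used below are rigid transforms of the form p_target = R·p_source + T between the frames A, B, C and D just defined. The helper below is a sketch, not anything disclosed in the patent; it only shows how such transforms are applied, composed and inverted.

```python
# Minimal rigid-transform helper: p_target = R @ p_source + T.
# Composing and inverting such transforms is all that is needed to move between
# the vehicle (A), LiDAR (B), total station (C) and calibration body (D) frames.
import numpy as np
from dataclasses import dataclass

@dataclass
class RigidTransform:
    R: np.ndarray  # 3x3 rotation matrix
    T: np.ndarray  # translation vector, shape (3,)

    def apply(self, points: np.ndarray) -> np.ndarray:
        """Transform an (N, 3) array of points into the target frame."""
        return points @ self.R.T + self.T.reshape(1, 3)

    def compose(self, other: "RigidTransform") -> "RigidTransform":
        """self ∘ other: first apply `other`, then `self` (e.g. D->C then C->A gives D->A)."""
        return RigidTransform(self.R @ other.R, self.R @ other.T + self.T)

    def inverse(self) -> "RigidTransform":
        """Invert the transform (e.g. turn C->A into A->C)."""
        return RigidTransform(self.R.T, -self.R.T @ self.T)
```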
To solve the above problem of low accuracy when calibrating the point cloud coordinates of a right-angled plane in the LiDAR coordinate system, this embodiment provides a method for determining conversion parameters; referring to FIG. 4, the method includes the following.
First, the laser or laser scanner establishes the lidar coordinate system B and expresses at least one scanned feature point of each calibration body in the LiDAR coordinate system B.
The calibration bodies scanned in this embodiment include planar calibration bodies 1 to 5, each at a fixed position and each containing at least one feature point; these feature points can be expressed as a three-dimensional point cloud. For example, the feature points of planar calibration body 1 expressed in the LiDAR coordinate system B form point cloud coordinate set 1, those of planar calibration body 2 form point cloud coordinate set 2, and so on.
The laser or laser scanner sends the point cloud coordinate sets of all calibrated calibration bodies to the first apparatus. The first apparatus may be a conversion parameter determining apparatus configured to carry out the method for determining conversion parameters provided by this embodiment. Specifically, the method includes the following steps.
101: Obtain a point cloud coordinate set of a planar calibration body expressed in the lidar coordinate system. The point cloud coordinate set includes the point cloud coordinates formed by feature points of the planar calibration body, for example one or more of point cloud coordinate sets 1 to 5 above.
102: Perform segmentation and plane fitting on the point cloud coordinate set to obtain first equation coefficients, the first equation coefficients being the equation coefficients of the planar calibration body in the lidar coordinate system.
Segmentation of the point cloud coordinate set by the first apparatus, referred to as "point cloud segmentation", partitions the point cloud according to spatial, geometric and texture features so that points within one partition share similar features. One possible segmentation approach is feature boundary extraction based on point cloud slices: the point cloud density is used as a threshold to separate the point clouds of different continuous feature curves, and plane fitting is then applied to each segmented point set.
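The patent leaves the segmentation algorithm open. As one minimal illustration of the density/distance-threshold idea, the sketch below grows clusters from neighbouring points and keeps only clusters large enough to be a calibration plane; the radius and size thresholds are assumptions, not values from the disclosure.

```python
# Illustrative distance-based point cloud segmentation: grow clusters by
# connecting points closer than `radius`, then keep clusters large enough to
# plausibly be a calibration plane.
import numpy as np

def segment_point_cloud(points, radius=0.10, min_cluster_size=50):
    """points: (N, 3) float array in the LiDAR frame.
    Returns a list of index arrays, one per cluster."""
    n = len(points)
    visited = np.zeros(n, dtype=bool)
    clusters = []
    for seed in range(n):
        if visited[seed]:
            continue
        visited[seed] = True
        frontier = [seed]
        members = [seed]
        while frontier:
            idx = frontier.pop()
            # Unvisited points within `radius` of the current point join the cluster.
            dists = np.linalg.norm(points - points[idx], axis=1)
            neighbours = np.where((dists < radius) & ~visited)[0]
            visited[neighbours] = True
            frontier.extend(neighbours.tolist())
            members.extend(neighbours.tolist())
        if len(members) >= min_cluster_size:
            clusters.append(np.array(sorted(members)))
    return clusters
```

This brute-force version is O(N²); a practical implementation would use a spatial index (for example a k-d tree) for the neighbour search, but the clustering idea is the same.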
For example, a robust plane fitting method for point cloud data may be used. It is based on the eigenvalue method and removes outliers from the point cloud data according to certain criteria, yielding robust estimates of the plane parameters. In specific experiments, the least-squares method, the eigenvalue method and the robust eigenvalue method may each be used to fit a plane to the point cloud data and obtain reliable plane equation coefficients, i.e. the first equation coefficients. This embodiment does not restrict the specific point cloud segmentation and plane fitting procedures.
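As an illustration of the eigenvalue and robust eigenvalue fits mentioned above, the sketch below estimates the plane normal from the covariance of the points (averaging over all points of the cluster) and then re-fits after trimming the worst residuals. The iteration count and trimming quantile are assumptions, not values from the patent.

```python
# Illustrative eigenvalue (PCA) plane fit with a simple outlier-trimming loop,
# one way to realise the "robust eigenvalue method" mentioned above.
import numpy as np

def fit_plane_eig(points):
    """Fit M*x + N*y + O*z + P = 0 to an (N, 3) point array.
    Returns [M, N, O, P] with a unit normal (M, N, O)."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # The eigenvector of the smallest eigenvalue of the covariance matrix is
    # the plane normal; using every point averages out the ranging jitter.
    cov = centered.T @ centered / len(points)
    _, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending eigenvalues
    normal = eigvecs[:, 0]
    d = -normal @ centroid
    return np.append(normal, d)

def fit_plane_robust(points, n_iter=3, keep=0.9):
    """Re-fit after discarding the worst-fitting points each iteration."""
    pts = np.asarray(points, dtype=float)
    coeffs = fit_plane_eig(pts)
    for _ in range(n_iter):
        residuals = np.abs(pts @ coeffs[:3] + coeffs[3])
        cutoff = np.quantile(residuals, keep)
        pts = pts[residuals <= cutoff]
        coeffs = fit_plane_eig(pts)
    return coeffs
```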
103: Acquire second equation coefficients, the second equation coefficients being the equation coefficients of the planar calibration body in the vehicle coordinate system.
Specifically, one method of acquiring the second equation coefficients, shown in FIG. 5, includes the following.
103-1: Acquire third equation coefficients and the conversion parameters between the total station coordinate system C and the vehicle coordinate system A, where the third equation coefficients are the equation coefficients of the planar calibration body in the total station coordinate system C.
Optionally, the conversion parameters between the total station coordinate system C and the vehicle coordinate system A are also called the "transformation matrix".
Specifically, one implementation of acquiring the third equation coefficients includes: the first apparatus obtains the equation coefficients of the planar calibration body expressed in the calibration body coordinate system D, i.e. the coefficients of the plane on which at least three feature points of the planar calibration body lie; then, according to the conversion parameters between the total station coordinate system C and the calibration body coordinate system D, the coefficients of the planar calibration body in the calibration body coordinate system D are converted into coefficients in the total station coordinate system C, giving the equation coefficients of the planar calibration body in the total station coordinate system C, i.e. the third equation coefficients.
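Converting plane coefficients from one frame to another, as described above, follows a standard identity: if p_target = R·p_source + T, then the plane normal rotates with R and the offset shifts by the translation. The sketch below shows that identity only; the function name is illustrative.

```python
# Illustrative conversion of plane coefficients [M, N, O, P] from a source frame
# into a target frame, given the source->target rigid transform
# p_target = R @ p_source + T (e.g. calibration body frame D -> total station frame C).
import numpy as np

def transform_plane(coeffs, R, T):
    """coeffs: length-4 array [M, N, O, P] of the plane M*x + N*y + O*z + P = 0
    in the source frame. Returns the coefficients of the same plane in the target frame."""
    normal = np.asarray(coeffs[:3], dtype=float)
    p = float(coeffs[3])
    new_normal = R @ normal            # the normal rotates with the frame
    new_p = p - new_normal @ T         # the offset shifts by the translation
    return np.append(new_normal, new_p)
```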
To obtain the equation coefficients of the planar calibration body in the calibration body coordinate system D, the first apparatus needs to obtain in advance the conversion parameters between the total station coordinate system C and the calibration body coordinate system D, and then converts the equation coefficients of the planar calibration body in the total station coordinate system C into coefficients expressed in the calibration body coordinate system D, i.e. the equation coefficients of the planar calibration body in the calibration body coordinate system D. The coordinates of each feature point on a planar calibration body can be expressed in the calibration body coordinate system D; for example, the feature points of planar calibration body 1 in the calibration body coordinate system D are denoted "FeaD1_D" in FIG. 3.
In addition, a specific implementation of acquiring the equation coefficients of the planar calibration bodies in the total station coordinate system C is as follows, taking total station 1 as an example: total station 1 establishes the total station coordinate system C1, scans all feature points on planar calibration bodies 1 to 5, calibrates the scanned feature points in the total station coordinate system C1, and obtains the equation coefficients of planar calibration bodies 1 to 5 in the total station coordinate system C1 by a plane fitting method including, but not limited to, the least-squares method, the eigenvalue method or the robust eigenvalue method. Finally, total station 1 reports the equation coefficients of planar calibration bodies 1 to 5 in the total station coordinate system C1 to the first apparatus, and the first apparatus receives them.
In step 103-1, one possible way to obtain the conversion parameters between the total station coordinate system C and the vehicle coordinate system A is to use at least one feature point on the vehicle. Specifically, as shown in FIG. 6, n feature points on the vehicle are acquired, with n ≥ 1 and n a positive integer, the n feature points including a1, a1u, a2, a2u, a3, a3u, a4 and a4u.
Here a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
It should be understood that the n feature points are not limited to the above examples; other points on the vehicle may also be selected as feature points.
The vehicle feature points a1, ..., a4u are calibrated in the vehicle coordinate system A and in the total station coordinate system C. Specifically: total station 1 calibrates the coordinates of the feature points a1, a1u, a3 and a3u on the right side of the vehicle in the total station coordinate system C1 and reports these coordinate values to the first apparatus. Similarly, total station 2 calibrates the coordinates of the feature points a2, a2u, a4 and a4u on the left side of the vehicle in the total station coordinate system C2 and reports them to the first apparatus. The first apparatus receives the coordinates of all vehicle feature points a1, ..., a4u in the total station coordinate system C reported by total station 1 and total station 2. The first apparatus also acquires the coordinates of all vehicle feature points a1, ..., a4u in the vehicle coordinate system A. Finally, from the coordinates of these feature points in the vehicle coordinate system A and in the total station coordinate system C, the first apparatus obtains the conversion parameters between the total station coordinate system C and the vehicle coordinate system A.
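The patent does not specify the numerical procedure used to recover the transform from these matched feature points. A common choice is the SVD-based (Kabsch) least-squares alignment, sketched below under that assumption, with `src` the point coordinates in the total station coordinate system C and `dst` the same points in the vehicle coordinate system A.

```python
# Illustrative SVD-based (Kabsch) estimation of R, T from matched point pairs,
# one common way to turn per-point correspondences (a1 ... a4u in frames C and A)
# into the total-station-to-vehicle transform.
import numpy as np

def estimate_rigid_transform(src, dst):
    """src, dst: (n, 3) arrays of matched points. Returns (R, T) with dst ≈ src @ R.T + T."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
    T = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, T
```

The same routine can serve for the total station to calibration body conversion described next, since that step also starts from matched point coordinates in two frames.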
In the process of "obtaining the conversion parameters between the total station coordinate system C and the calibration body coordinate system D" mentioned above, the conversion parameters between these two coordinate systems can be obtained from at least one feature point of the reference calibration body. Specifically, total station 1 observes the reference calibration body and obtains the coordinates of each face of the reference calibration body in the total station coordinate system C1; total station 1 sends the coordinate set of the feature points of the reference calibration body to the first apparatus, and the first apparatus receives it. The first apparatus also obtains the coordinates of the feature points of the reference calibration body in the calibration body coordinate system D; these coordinates are known quantities that can be calibrated and obtained in advance. From the coordinates of the feature points of the reference calibration body in the total station coordinate system C1 and in the calibration body coordinate system D, the first apparatus obtains the conversion parameters between the total station coordinate system C1 and the calibration body coordinate system D.
Similarly, the conversion parameters between the total station coordinate system C2 and the calibration body coordinate system D can be obtained in the same way; that process is not repeated here.
It should be noted that the total station in this embodiment may be replaced by another instrument or apparatus capable of measuring the three-dimensional coordinates of feature points, such as a three-dimensional coordinate measuring machine, which is not limited in this embodiment.
103-2: Convert the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system C and the vehicle coordinate system A, the second equation coefficients being the equation coefficients of the planar calibration body in the vehicle coordinate system A.
104: Obtain the conversion parameters between the lidar coordinate system and the vehicle coordinate system from the first equation coefficients and the second equation coefficients, the conversion parameters being used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
The conversion parameters between the lidar coordinate system B and the vehicle coordinate system A are also called the "LiDAR extrinsic parameters".
Specifically, in one example, the LiDAR extrinsic parameter computation is as follows.
The calibration plane equation of at least one feature point of the planar calibration body in the vehicle coordinate system A is equation (1):

$$M X_A + N Y_A + O Z_A + P = 0 \tag{1}$$

where M, N, O and P are the plane equation coefficients, i.e. the second equation coefficients described above, and X_A, Y_A, Z_A are the three coordinate axes of the vehicle coordinate system A.
The calibration plane equation of at least one feature point of the planar calibration body in the LiDAR coordinate system is equation (2):

$$M' X_B + N' Y_B + O' Z_B + P' = 0 \tag{2}$$

where M′, N′, O′ and P′ are the plane equation coefficients, i.e. the first equation coefficients described above, and X_B, Y_B, Z_B are the three coordinate axes of the LiDAR coordinate system B.
The transformation from the LiDAR coordinate system B to the vehicle coordinate system A is equation (3):

$$\begin{bmatrix} X_A \\ Y_A \\ Z_A \end{bmatrix} = R \begin{bmatrix} X_B \\ Y_B \\ Z_B \end{bmatrix} + T \tag{3}$$

where R is the rotation matrix and T is the translation matrix. Optionally, R is a 3×3 matrix and T is a 3×1 matrix, for example:

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}, \qquad T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$$

Substituting equation (3) into equation (1) gives the plane equation of the planar calibration body in the LiDAR coordinate system, equation (4):

$$\begin{bmatrix} M & N & O \end{bmatrix} R \begin{bmatrix} X_B \\ Y_B \\ Z_B \end{bmatrix} + \begin{bmatrix} M & N & O \end{bmatrix} T + P = 0 \tag{4}$$

Comparing equation (2) with equation (4) gives the relation between the equation coefficients of the planar calibration body in the LiDAR coordinate system and in the vehicle coordinate system, equation (5):

$$\begin{bmatrix} M' & N' & O' \end{bmatrix} = \begin{bmatrix} M & N & O \end{bmatrix} R, \qquad P' = \begin{bmatrix} M & N & O \end{bmatrix} T + P \tag{5}$$

(With the plane coefficients normalized so that (M, N, O) and (M′, N′, O′) are unit normals with consistent signs, the two sets of coefficients can be equated directly; otherwise they agree up to a common scale factor.)
Based on equation (5), solving for the extrinsic parameters is equivalent to the following optimization, equation (6):

$$(R^{*}, T^{*}) = \arg\min_{R,\,T} \sum_{k} \left( \left\| \begin{bmatrix} M'_k & N'_k & O'_k \end{bmatrix} - \begin{bmatrix} M_k & N_k & O_k \end{bmatrix} R \right\|^2 + \left( P'_k - \begin{bmatrix} M_k & N_k & O_k \end{bmatrix} T - P_k \right)^2 \right) \tag{6}$$

where arg min denotes the values of the optimization variables, here the rotation R and the translation T, at which the expression that follows attains its minimum; k indexes the k-th planar calibration body, with k ∈ [0, 5] in this embodiment; and R*, T* are the conversion parameters between the LiDAR coordinate system B and the vehicle coordinate system A, i.e. the LiDAR extrinsic parameters.
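One way to carry out optimization (6) in code, sketched below, is to recover the rotation by least-squares alignment of the matched plane normals (an SVD fit) and then obtain the translation from the linear relation P′_k = [M_k N_k O_k]·T + P_k of equation (5). This is an illustrative solver, not the implementation disclosed in the patent, and it assumes unit plane normals with consistent signs and at least three non-parallel calibration planes.

```python
# Illustrative closed-form solution of optimization (6): rotation from aligning
# plane normals, translation from the linear relation in equation (5).
import numpy as np

def solve_extrinsics(coeffs_vehicle, coeffs_lidar):
    """coeffs_vehicle, coeffs_lidar: (k, 4) arrays of plane coefficients
    [M, N, O, P] per calibration plane, with unit normals and consistent signs.
    Returns (R, T) such that p_A = R @ p_B + T, as in equation (3)."""
    cV = np.asarray(coeffs_vehicle, dtype=float)
    cL = np.asarray(coeffs_lidar, dtype=float)
    nA, pA = cV[:, :3], cV[:, 3]          # vehicle-frame normals and offsets
    nB, pB = cL[:, :3], cL[:, 3]          # LiDAR-frame normals and offsets

    # Rotation: least-squares alignment n_A ≈ R @ n_B (Kabsch on directions).
    H = nB.T @ nA
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T

    # Translation: equation (5) gives P'_k - P_k = n_Ak . T, i.e. nA @ T = pB - pA.
    T, *_ = np.linalg.lstsq(nA, pB - pA, rcond=None)
    return R, T
```

A nonlinear least-squares refinement over all six extrinsic parameters could be layered on top of this closed-form initial estimate if tighter agreement with cost (6) is wanted.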
In this embodiment, the acquired point cloud coordinate set is processed by point cloud segmentation and plane fitting; based on statistical averaging, the equation coefficients of the calibration body in the LiDAR coordinate system are obtained, and these coefficients are used to calibrate the position of the calibration body in the vehicle coordinate system. After the segmented point cloud has been plane-fitted, the accuracy of extrinsic calibration of the LiDAR coordinate system is improved by an order of magnitude, while the influence of ranging jitter is reduced.
In addition, by calibrating at least one feature point of the planar calibration body in both the LiDAR coordinate system and the vehicle coordinate system, the conversion parameters from the LiDAR coordinate system to the vehicle coordinate system, i.e. the LiDAR extrinsic parameters, can be obtained, and these extrinsic parameters are used to convert lidar point cloud coordinates into point cloud coordinates in the vehicle coordinate system. The method avoids using a "four-wheel alignment system" as an auxiliary means of obtaining the coordinates of feature points in the vehicle coordinate system, reduces the cost for 4S stores or repair shops of calibrating objects outside the vehicle, and also improves the calibration accuracy in the vehicle coordinate system.
The improvement in extrinsic calibration accuracy achieved by this method is described below from two perspectives: simulation and measurement of real data.
Simulation with the above LiDAR extrinsic parameters yields the angular estimation error of the extrinsic parameters, shown in FIG. 7A, and the position estimation error, shown in FIG. 7B. In the simulation results of FIG. 7A, the horizontal axis is the standard deviation of the ranging jitter, denoted Err_B, in meters (m); the vertical axis is the angular estimation error of the extrinsic parameters, denoted Err_Ang, in degrees (°). FIG. 7A contains three angular estimation errors, Arfa (α), Beta (β) and Gama (γ). When the standard deviation of LiDAR ranging jitter is less than 0.02 m, the angular estimation error of the extrinsic parameters is less than 0.1°; in FIG. 7A, at the horizontal coordinate 0.02 m of Line1, the values of Err_Arfa, Err_Beta and Err_Gama (see the vertical coordinates at Line2) all lie below 0.1°.
Similarly, in FIG. 7B, the horizontal axis is the standard deviation of the ranging jitter, denoted Err_B, in meters (m), and the vertical axis is the position estimation error, denoted Err_T, in meters (m). FIG. 7B contains three position estimation errors, Err_X, Err_Y and Err_Z. At the horizontal coordinate 0.02 m of Line3, the values of Err_X, Err_Y and Err_Z (see the vertical coordinates at Line4) are all less than 0.025 m.
It can be seen from the simulation results of FIG. 7A and FIG. 7B that when the standard deviation of LiDAR ranging jitter (Err_B) is less than 0.02 m, both the angular estimation error (Err_Ang) and the position estimation error (Err_T) of the extrinsic parameters remain small, so the method can meet the high-accuracy requirements of autonomous driving solutions.
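The patent does not disclose the simulation setup behind FIG. 7A and FIG. 7B. Purely as an illustration of how ranging jitter propagates into the fitted plane normal, and hence into the extrinsic angles, the following Monte Carlo sketch perturbs points sampled on a known plane with Gaussian range noise along the line of sight and measures the angular error of the re-fitted normal; the plate size, sensor position and trial counts are assumptions.

```python
# Illustrative Monte Carlo check of how ranging jitter affects the fitted plane
# normal: sample points on a known plane, add Gaussian range noise along the
# line of sight, re-fit by the eigenvalue method and record the angular error.
import numpy as np

rng = np.random.default_rng(0)

def fit_normal(points):
    centered = points - points.mean(axis=0)
    _, vecs = np.linalg.eigh(centered.T @ centered)
    return vecs[:, 0]  # eigenvector of the smallest eigenvalue

def mean_angular_error_deg(jitter_std, n_points=2000, n_trials=200):
    true_normal = np.array([0.0, 0.0, 1.0])     # a horizontal 2 m x 2 m plate
    sensor = np.array([10.0, 0.0, 1.5])         # assumed sensor position
    errors = []
    for _ in range(n_trials):
        xy = rng.uniform(-1.0, 1.0, size=(n_points, 2))
        pts = np.column_stack([xy, np.zeros(n_points)])
        rays = pts - sensor
        rays /= np.linalg.norm(rays, axis=1, keepdims=True)
        pts_noisy = pts + rays * rng.normal(0.0, jitter_std, size=(n_points, 1))
        cosang = abs(fit_normal(pts_noisy) @ true_normal)
        errors.append(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
    return float(np.mean(errors))

if __name__ == "__main__":
    for sigma in (0.005, 0.01, 0.02, 0.05):
        print(f"jitter std {sigma:.3f} m -> mean normal error {mean_angular_error_deg(sigma):.3f} deg")
```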
In this embodiment, real data measurements were also made with the LiDAR extrinsic parameters of step 104, giving two sets of measurement results shown in FIG. 8, the extrinsic calibration results in a laboratory environment, namely experimental data 1 and experimental data 2. In experimental data 1 the Gama estimate is 124.92577875158723°, approximately 124.93°. In the laboratory environment the true value of the Gama extrinsic parameter is known; in experimental data 1 the true value is 125°, so the difference between the estimate and the true value is Δ1 = 125° − 124.93° = 0.07°, an angular estimation error of less than 0.1°.
Similarly, in experimental data 2 the Gama estimate is 93.89807604536011°, approximately 93.90°, and the true value under laboratory conditions is 94°, so the difference between the estimate and the true value is Δ2 = 94° − 93.90° = 0.1°, an angular estimation error of 0.1°. From experimental data 1 and 2, the error between the extrinsic calibration values obtained by the method of the above embodiments and the true values does not exceed 0.1°, so the calibration method is assessed as highly accurate.
It should also be noted that the method for determining conversion parameters provided by this embodiment can be applied to the extrinsic calibration of other three-dimensional coordinate sensors to a vehicle; the basic idea and procedure can follow the technical solutions of the above method embodiments.
FIG. 9 is a schematic structural diagram of a conversion parameter determining apparatus provided by an embodiment of this application. The apparatus may be the first apparatus of the foregoing embodiments and can implement the method for determining conversion parameters described above.
Specifically, as shown in FIG. 9, the apparatus may include an acquisition unit 901 and a processing unit 902, and may further include other units or modules such as a storage module and a sending module.
The acquisition unit 901 is configured to acquire a point cloud coordinate set of a planar calibration body expressed in the lidar coordinate system, the point cloud coordinate set including point cloud coordinates formed by at least three feature points of the planar calibration body; the processing unit 902 is configured to perform segmentation and plane fitting on the point cloud coordinate set to obtain first equation coefficients, the first equation coefficients being the equation coefficients of the planar calibration body in the lidar coordinate system.
The acquisition unit 901 is further configured to acquire second equation coefficients, the second equation coefficients being the equation coefficients of the planar calibration body in the vehicle coordinate system; the processing unit 902 is further configured to obtain, from the first equation coefficients and the second equation coefficients, the conversion parameters between the lidar coordinate system and the vehicle coordinate system, the conversion parameters being used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
Optionally, in one possible implementation of this embodiment, the processing unit 902 is further configured to acquire third equation coefficients and the conversion parameters between the total station coordinate system and the vehicle coordinate system, where the third equation coefficients are the equation coefficients of the planar calibration body in the total station coordinate system; the acquisition unit 901 is further configured to convert the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system and the vehicle coordinate system.
Optionally, in another possible implementation of this embodiment, the acquisition unit 901 is further configured to obtain the equation coefficients of the planar calibration body in the calibration body coordinate system, and to convert them into equation coefficients in the total station coordinate system according to the conversion parameters between the total station coordinate system and the calibration body coordinate system, thereby obtaining the third equation coefficients.
Optionally, in yet another possible implementation of this embodiment, the acquisition unit 901 is further configured to acquire n feature points on the vehicle and the coordinates of each feature point in the vehicle coordinate system and in the total station coordinate system, where n ≥ 1 and n is a positive integer, and to obtain the conversion parameters between the total station coordinate system and the vehicle coordinate system from the coordinates of each of the n feature points in the two coordinate systems.
The n feature points include: a1, a1u, a2, a2u, a3, a3u, a4 and a4u, where a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
It should be understood that other points on the vehicle may also be selected as feature points, which is not limited in this embodiment.
Optionally, the conversion parameter determining apparatus may be a control module with computing capability, such as a microcontroller unit (MCU), integrated in a terminal.
FIG. 10 is a schematic structural diagram of another conversion parameter determining apparatus. The apparatus may include at least one processor 11 and an interface circuit 12, and optionally a memory 13.
The at least one processor 11 is the control center of the conversion parameter determining apparatus and can be used to carry out the method for determining conversion parameters of the foregoing embodiments.
Optionally, the at least one processor 11 may consist of integrated circuits (ICs), for example a single packaged IC, or several packaged ICs with the same or different functions connected together. For example, the processor may include a central processing unit (CPU) or a digital signal processor (DSP).
In addition, the processor may include a hardware chip, which may be a logic circuit, an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
The memory 13 is used to store and exchange various instructions, data and/or software and may contain several storage media, at least one of which can store the parameters of the coordinate systems and the plane equation coefficients; at least one storage medium may also store computer programs or code.
Specifically, the memory 13 may include volatile memory, such as random access memory (RAM), and non-volatile memory, such as flash memory, a hard disk drive (HDD) or a solid-state drive (SSD), or a combination of the above types of memory.
Optionally, the memory 13 may be integrated in the at least one processor 11 as a storage medium or arranged outside the processor, which is not limited in this embodiment.
The interface circuit 12 includes, but is not limited to, a transceiver interface and/or a transceiver; the transceiver interface is used by the at least one processor to send and receive data and/or information, and the transceiver is used by the conversion parameter determining apparatus to communicate with other devices or with communication networks, such as Ethernet, a radio access network or a WLAN.
An embodiment of this application further provides a terminal, which may include any conversion parameter determining apparatus of the foregoing embodiments. Further, the terminal may be a vehicle, a robot, a lidar or a PC.
Further optionally, the terminal may include at least one of a mobile communication module, a wireless communication module, and the like. The mobile communication module can provide wireless communication solutions including 2G/3G/4G/5G and may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on; in some embodiments, at least some functional modules of the mobile communication module may be arranged in the at least one processor 11. The wireless communication module can provide wireless communication solutions such as wireless local area networks (WLAN, e.g. a WiFi network), Bluetooth (BT) and the global navigation satellite system (GNSS).
It should be understood that the terminal may include more or fewer components, such as a display, a speaker, a camera and sensors; the structures illustrated in the embodiments of this application do not constitute a specific limitation on the terminal. The components shown in FIG. 10 may be implemented in hardware, software, firmware, or any combination thereof.
When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. For example, the functions of the acquisition unit 901 and the processing unit 902 of the conversion parameter determining apparatus shown in FIG. 9 may be realized by the at least one processor 11, or by the at least one processor 11 and the memory 13, and the function of the storage unit may be realized by the memory 13.
Specifically, the at least one processor 11 is configured to acquire a point cloud coordinate set of a planar calibration body expressed in the lidar coordinate system, perform segmentation and plane fitting on the point cloud coordinate set to obtain first equation coefficients, acquire second equation coefficients, obtain the conversion parameters between the lidar coordinate system and the vehicle coordinate system from the first and second equation coefficients, and convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system based on the conversion parameters.
The apparatus provided by this embodiment improves both calibration accuracy and cost. By matching pairs of plane equation coefficients and relying on statistical averaging, it improves the accuracy of extrinsic calibration of the LiDAR coordinate system by an order of magnitude and avoids the errors caused by the lidar's angular resolution and ranging jitter. In addition, by performing several coordinate conversions with total stations, the plane equation coefficients in the vehicle coordinate system are obtained; with only a limited increase in calibration time, the need for a four-wheel alignment bay is avoided, saving extrinsic calibration costs.
This embodiment further provides an extrinsic calibration system whose structure may be the same as that shown in FIG. 3, the system including at least one of: a conversion parameter determining apparatus, a vehicle, a laser or laser scanner, planar calibration bodies 1 to 5, and a total station.
The laser or laser scanner is configured to scan planar calibration bodies 1 to 5, express the resulting point clouds in the LiDAR coordinate system, and send the point cloud coordinates of planar calibration bodies 1 to 5 expressed in the LiDAR coordinate system to the conversion parameter determining apparatus.
The conversion parameter determining apparatus receives the point cloud coordinates of planar calibration bodies 1 to 5 sent by the laser or laser scanner, performs steps 101 to 104 shown in FIG. 4 to obtain the conversion parameters between the lidar coordinate system and the vehicle coordinate system, and converts point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system based on these conversion parameters.
The total station is configured to establish the total station coordinate system C, calibrate all feature points on planar calibration bodies 1 to 5 in the total station coordinate system C, obtain the equation coefficients of planar calibration bodies 1 to 5 in the total station coordinate system C by a plane fitting method, and report these coefficients to the conversion parameter determining apparatus, so that the apparatus obtains the equation coefficients, in the total station coordinate system C, of the planes formed by at least one feature point of planar calibration bodies 1 to 5.
The total station is further configured to express the coordinates of at least one feature point on the vehicle in the total station coordinate system C and to send these vehicle feature point coordinates to the conversion parameter determining apparatus, so that the apparatus obtains the conversion parameters between the total station coordinate system C and the vehicle coordinate system A from the coordinates of the vehicle's feature points in the total station coordinate system C.
An embodiment of this application further provides a computer program product that includes one or more computer program instructions. When the computer program instructions are loaded and executed by a computer, the procedures or functions described in the above embodiments are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus.
The computer program instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one network node, computer, server or data center to another node by wire or wirelessly.
In the description of this application, unless otherwise stated, "a plurality of" means two or more. In addition, to describe the technical solutions of the embodiments clearly, the terms "first", "second" and the like are used to distinguish identical or similar items with substantially the same functions and effects. Those skilled in the art will understand that "first", "second" and the like do not limit quantity or execution order, and do not necessarily indicate a difference.
The embodiments of this application described above do not limit the protection scope of this application.

Claims (14)

  1. A method for determining conversion parameters, characterized in that the method comprises:
    acquiring a point cloud coordinate set of a planar calibration body expressed in a lidar coordinate system, the point cloud coordinate set comprising point cloud coordinates formed by feature points of the planar calibration body;
    performing segmentation and plane fitting on the point cloud coordinate set to obtain first equation coefficients, the first equation coefficients being equation coefficients of the planar calibration body in the lidar coordinate system;
    acquiring second equation coefficients, the second equation coefficients being equation coefficients of the planar calibration body in a vehicle coordinate system; and
    obtaining conversion parameters between the lidar coordinate system and the vehicle coordinate system according to the first equation coefficients and the second equation coefficients, the conversion parameters being used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
  2. The method according to claim 1, characterized in that acquiring the second equation coefficients comprises:
    acquiring third equation coefficients and conversion parameters between a total station coordinate system and the vehicle coordinate system, wherein the third equation coefficients are equation coefficients of the planar calibration body in the total station coordinate system; and
    converting the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system and the vehicle coordinate system.
  3. The method according to claim 1 or 2, characterized in that acquiring the third equation coefficients comprises:
    obtaining equation coefficients of the planar calibration body in a calibration body coordinate system; and
    converting the equation coefficients of the planar calibration body in the calibration body coordinate system into equation coefficients in the total station coordinate system according to conversion parameters between the total station coordinate system and the calibration body coordinate system, to obtain the third equation coefficients.
  4. The method according to claim 2, characterized in that acquiring the conversion parameters between the total station coordinate system and the vehicle coordinate system comprises:
    acquiring n feature points on a vehicle and coordinates of each of the feature points in the vehicle coordinate system and in the total station coordinate system, where n ≥ 1 and n is a positive integer; and
    obtaining the conversion parameters between the total station coordinate system and the vehicle coordinate system according to the coordinates of each of the n feature points in the vehicle coordinate system and in the total station coordinate system.
  5. The method according to claim 4, characterized in that the n feature points comprise: a1, a1u, a2, a2u, a3, a3u, a4 and a4u;
    wherein a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
  6. An apparatus for determining conversion parameters, characterized in that the apparatus comprises:
    an acquisition unit configured to acquire a point cloud coordinate set of a planar calibration body expressed in a lidar coordinate system, the point cloud coordinate set comprising point cloud coordinates formed by feature points of the planar calibration body;
    a processing unit configured to perform segmentation and plane fitting on the point cloud coordinate set to obtain first equation coefficients, the first equation coefficients being equation coefficients of the planar calibration body in the lidar coordinate system;
    the acquisition unit being further configured to acquire second equation coefficients, the second equation coefficients being equation coefficients of the planar calibration body in a vehicle coordinate system; and
    the processing unit being further configured to obtain conversion parameters between the lidar coordinate system and the vehicle coordinate system according to the first equation coefficients and the second equation coefficients, the conversion parameters being used to convert point cloud coordinates in the lidar coordinate system into point cloud coordinates in the vehicle coordinate system.
  7. The apparatus according to claim 6, characterized in that:
    the processing unit is further configured to acquire third equation coefficients and conversion parameters between a total station coordinate system and the vehicle coordinate system, wherein the third equation coefficients are equation coefficients of the planar calibration body in the total station coordinate system; and
    the acquisition unit is further configured to convert the third equation coefficients into the second equation coefficients according to the conversion parameters between the total station coordinate system and the vehicle coordinate system.
  8. The apparatus according to claim 6 or 7, characterized in that:
    the acquisition unit is further configured to obtain equation coefficients of the planar calibration body in a calibration body coordinate system, and to convert the equation coefficients of the planar calibration body in the calibration body coordinate system into equation coefficients in the total station coordinate system according to conversion parameters between the total station coordinate system and the calibration body coordinate system, to obtain the third equation coefficients.
  9. The apparatus according to claim 7, characterized in that:
    the acquisition unit is further configured to acquire n feature points on a vehicle and coordinates of each of the feature points in the vehicle coordinate system and in the total station coordinate system, where n ≥ 1 and n is a positive integer, and to obtain the conversion parameters between the total station coordinate system and the vehicle coordinate system according to the coordinates of each of the n feature points in the vehicle coordinate system and in the total station coordinate system.
  10. The apparatus according to claim 9, characterized in that the n feature points comprise: a1, a1u, a2, a2u, a3, a3u, a4 and a4u;
    wherein a1 is the center of the right rear wheel of the vehicle, a1u is the highest point of the right rear wheel hub, a2 is the center of the left rear wheel, a2u is the highest point of the left rear wheel hub, a3 is the center of the right front wheel, a3u is the highest point of the right front wheel hub, a4 is the center of the left front wheel, and a4u is the highest point of the left front wheel hub.
  11. An apparatus for determining conversion parameters, characterized by comprising at least one processor and an interface circuit,
    the interface circuit being configured to provide instructions and/or data to the at least one processor; and
    the at least one processor being configured to execute the instructions to implement the method according to any one of claims 1 to 5.
  12. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer program instructions which, when run, implement the method according to any one of claims 1 to 5.
  13. A terminal, characterized in that the terminal comprises the apparatus according to any one of claims 6 to 10, or the apparatus according to claim 11.
  14. The terminal according to claim 13, characterized in that the terminal is a vehicle, a robot, a lidar or a personal computer (PC).
PCT/CN2021/131608 2021-01-26 2021-11-19 一种转换参数的确定方法和装置 WO2022160879A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110102230.9 2021-01-26
CN202110102230.9A CN114791610A (zh) 2021-01-26 2021-01-26 一种转换参数的确定方法和装置

Publications (1)

Publication Number Publication Date
WO2022160879A1 true WO2022160879A1 (zh) 2022-08-04

Family

ID=82459593

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/131608 WO2022160879A1 (zh) 2021-01-26 2021-11-19 一种转换参数的确定方法和装置

Country Status (2)

Country Link
CN (1) CN114791610A (zh)
WO (1) WO2022160879A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115994955A (zh) * 2023-03-23 2023-04-21 深圳佑驾创新科技有限公司 相机外参标定方法、装置和车辆

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103711050A (zh) * 2013-12-31 2014-04-09 中交第二公路勘察设计研究院有限公司 一种激光雷达道路改扩建勘测设计方法
CN105203023A (zh) * 2015-07-10 2015-12-30 中国人民解放军信息工程大学 一种车载三维激光扫描系统安置参数的一站式标定方法
CN109696663A (zh) * 2019-02-21 2019-04-30 北京大学 一种车载三维激光雷达标定方法和系统
CN110221603A (zh) * 2019-05-13 2019-09-10 浙江大学 一种基于激光雷达多帧点云融合的远距离障碍物检测方法
CN112068108A (zh) * 2020-08-11 2020-12-11 南京航空航天大学 一种基于全站仪的激光雷达外部参数标定方法
US20200401823A1 (en) * 2019-06-19 2020-12-24 DeepMap Inc. Lidar-based detection of traffic signs for navigation of autonomous vehicles

Also Published As

Publication number Publication date
CN114791610A (zh) 2022-07-26

Similar Documents

Publication Publication Date Title
CN112654886B (zh) 外参标定方法、装置、设备及存储介质
US9659378B2 (en) Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and program therefor
CN110988849B (zh) 雷达系统的标定方法、装置、电子设备及存储介质
CN110501712B (zh) 无人驾驶中用于确定位置姿态数据的方法、装置和设备
CN113074727A (zh) 基于蓝牙与slam的室内定位导航装置及其方法
CN113655453B (zh) 用于传感器标定的数据处理方法、装置及自动驾驶车辆
JP2016109650A (ja) 位置推定装置、位置推定方法、位置推定プログラム
CN110873883A (zh) 融合激光雷达和imu的定位方法、介质、终端和装置
CN112415494B (zh) Agv双激光雷达位置标定方法、装置、设备和存储介质
CN112051575B (zh) 一种毫米波雷达与激光雷达的调整方法及相关装置
CN111913169B (zh) 激光雷达内参、点云数据的修正方法、设备及存储介质
CN111435163A (zh) 地面点云数据过滤方法、装置、探测系统及存储介质
WO2022179094A1 (zh) 车载激光雷达外参数联合标定方法、系统、介质及设备
WO2021016854A1 (zh) 一种标定方法、设备、可移动平台及存储介质
WO2020258217A1 (zh) 可移动平台状态估计方法、系统、可移动平台及存储介质
CN114488099A (zh) 一种激光雷达系数标定方法、装置、电子设备及存储介质
WO2022160879A1 (zh) 一种转换参数的确定方法和装置
CN113759348A (zh) 一种雷达标定方法、装置、设备及存储介质
CN113281777A (zh) 一种货物体积动态测量方法及其测量装置
CN116929343A (zh) 位姿估计方法、相关设备及存储介质
WO2020215296A1 (zh) 可移动平台的巡线控制方法、设备、可移动平台及系统
CN113495281B (zh) 可移动平台的实时定位方法及装置
CN116047481A (zh) 矫正点云数据畸变方法、装置、设备及存储介质
WO2022037370A1 (zh) 一种运动估计方法及装置
WO2023065110A1 (zh) 基站标定方法、计算机设备以及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21922469

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21922469

Country of ref document: EP

Kind code of ref document: A1