CN111142091B - Automatic driving system laser radar online calibration method fusing vehicle-mounted information - Google Patents


Info

Publication number
CN111142091B
CN111142091B (application CN202010026379.9A)
Authority
CN
China
Prior art keywords
vehicle
data
laser radar
automatic driving
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN202010026379.9A
Other languages
Chinese (zh)
Other versions
CN111142091A (en)
Inventor
秦晓辉
谢国涛
王晓伟
边有钢
徐彪
胡满江
杨泽宇
胡展溢
钟志华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University
Priority claimed from CN202010026379.9A
Publication of CN111142091A
Application granted
Publication of CN111142091B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 17/00
    • G01S 7/497: Means for monitoring or calibrating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods
    • G06N 3/084: Backpropagation, e.g. using gradient descent

Abstract

The invention discloses an online laser radar calibration method for automatic driving systems that fuses vehicle-mounted information. Its aims are: to provide an end-to-end online calibration method for the laser radar external parameters that avoids complex mathematical model derivation and optimization as well as extra equipment; to use the data-analysis capability of a deep convolutional neural network to process the laser radar point cloud data and vehicle ECU (Electronic Control Unit) data and estimate the laser radar external-parameter error online in real time; and thereby to realize real-time correction of the laser radar external parameters, improve the accuracy and stability of the environment-sensing function of the automatic driving system, and ensure its driving safety.

Description

Automatic driving system laser radar online calibration method fusing vehicle-mounted information
Technical Field
The invention relates to a laser radar calibration method for an automatic driving system, in particular to an automatic driving system laser radar online calibration method integrating vehicle-mounted information.
Background
The automatic driving technology that has emerged in recent years is expected to address the persistently high number of traffic accidents, and may even change how people travel and influence the design of cities; it has therefore attracted attention from many countries and become a research hotspot of the automobile industry.
Autonomous driving systems need to interact with the environment and thus need to sense the state of the environment. The environmental perception sensors commonly used in current autopilot systems include: laser radar, camera, millimeter wave radar and ultrasonic radar. Among them, laser radar is favored because it can accurately measure distance, has a wide visual field, and does not rely on visible light, and is an indispensable sensor in the existing automatic driving system.
In the environment-sensing process, a target detected by the laser radar can be used by the decision module of the automatic driving system only after conversion into the vehicle coordinate system, and this coordinate conversion requires the external parameters of the laser radar. In general, the automatic driving system performs static calibration after the laser radar is installed, measuring the translation distance and rotation angle of the laser radar relative to the vehicle coordinate system (that is, the external parameters) with common measurement means such as a tape measure or a laser range finder. The external parameters are then written into the configuration file of the automatic driving system for the environment-perception algorithm to call, so as to correct the positions of obstacles detected by the laser radar when converting them into the vehicle coordinate system.
However, during operation of the automatic driving system, problems such as unstable tire pressure, varying passenger numbers and uneven load deform the suspension, temporarily affecting the external parameters of the laser radar and causing temporary external-parameter errors. Meanwhile, during long-term use, the mounting bracket of the equipment can deform, causing permanent errors in the external parameters. All of these affect the environment-perception system of the automatic driving system, reduce the accuracy and stability of obstacle perception, and may even threaten the driving safety of the system.
Periodic recalibration of the external parameters can alleviate the above problems, but: 1) it is time-consuming and labor-intensive; 2) mass-produced automatic driving systems cannot be recalled to apply recalibrated external parameters; 3) temporary changes of the laser radar external parameters cannot be identified in time. It is therefore necessary to make full use of the laser radar point cloud information, combine it with the other vehicle-mounted sensor information, estimate the laser radar external parameters online in real time, and discover external-parameter errors in time, so as to ensure the accuracy and robustness of the environment-perception system and protect the normal operation of the automatic driving system.
The existing laser radar external-parameter calibration methods have the following defects: 1) most are static calibration methods that require special sites and special procedures and cannot be run online; 2) they require data modeling and calculation, which increases algorithm complexity and introduces model-mismatch errors; 3) they usually require iterative optimization to find the optimal solution, which easily falls into local optima and cannot obtain the truly accurate external parameters.
The deep convolutional neural network technology that has emerged in recent years has strong nonlinear fitting and learning capabilities and readily extracts key features from big data automatically, enabling end-to-end pattern recognition. Combining deep convolutional neural networks with the laser radar calibration scenario, it can be expected to learn the complex pose-transformation relationship from big data and to fuse laser point cloud data with vehicle-mounted sensor data online, realizing online identification of laser radar external-parameter errors. However, no such work has been reported to date.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide an automatic driving system laser radar online calibration method fusing vehicle-mounted information, which utilizes the data analysis capability of a deep convolutional neural network to process the point cloud data of the laser radar and the vehicle-mounted sensor data and estimates the external parameter error of the laser radar online in real time, thereby realizing the real-time correction of the external parameter of the laser radar and ensuring the normal work of the automatic driving system.
In order to achieve the purpose, the invention provides the following technical scheme: an automatic driving system laser radar online calibration method fusing vehicle-mounted information comprises the following steps:
step 1, building a big data engine, wherein the big data engine comprises a vehicle-mounted part and a server part, the vehicle-mounted part comprises a sensor, preprocessing, data synchronization and data storage, and the sensor comprises:
a laser radar providing environmental point cloud data around the autonomous vehicle;
the vehicle-mounted ECU provides necessary state information of the vehicle, namely the vehicle speed, the steering wheel angle and the INS information;
after receiving the environment point cloud data and the necessary vehicle state information, preprocessing parses the data: the laser radar point cloud data is parsed, i.e., the raw laser radar data is converted into the common format required by the algorithm, represented as the spatial three-dimensional coordinates of the laser points; and the vehicle-mounted ECU data is parsed, i.e., the raw ECU data from the CAN bus is converted into the common format required by the algorithm, namely the vehicle speed value, the steering-wheel angle value and the three-axis acceleration values in the vehicle coordinate system;
step 2, building and saving an initial deep convolutional neural network model using C++ programming; deploying the built deep convolutional neural network model on a server, inputting the data generated by the big data engine into the model, performing model training, and saving the trained deep convolutional neural network model;
step 3, loading the trained deep convolution neural network model into an automatic driving system, and connecting laser radar data, vehicle-mounted sensor data and the deep convolution neural network model in software; when the automatic driving system is normally operated, the deep convolution neural network model can collect laser radar data and vehicle-mounted sensor data in real time, and compares the deviation between the laser radar data and the vehicle-mounted sensor data in real time to give out an external parameter error value of the laser radar on line.
As a further improvement of the present invention, the specific steps of building the big data engine in step 1 are as follows:
the method comprises the steps that a laser radar is installed for a target automatic driving system, a vehicle-mounted sensor interface is connected, and a precise instrument is used for completing static calibration of the laser radar to obtain external parameters of the laser radar;
driving a vehicle carrying a calibrated automatic driving system to run in a target working area and a scene, keeping the running speed below 30km/h, ensuring that the vehicle does not bump violently, collecting laser point cloud and vehicle-mounted sensor data, and recording the laser point cloud and the vehicle-mounted sensor data through software;
and step three, applying artificial interference to the acquired data, simulating laser radar point cloud data and vehicle-mounted sensor data under the condition of violent vehicle movement, and recording the data through software.
As a further improvement of the present invention, in the step 1, the data acquired by the vehicle-mounted ECU is processed into a vehicle state trajectory, wherein the vehicle state trajectory is derived from a vehicle kinematic model, and the vehicle kinematic model is derived by the following steps:
step 11, when the vehicle is moving steadily and at low speed, the centroid slip angle and the tire slip angle are both negligible, and the following relationship holds.
$$\dot{\gamma} = \frac{v\tan\delta}{L}$$
In the above formula, δ is the average turning angle of the front wheels of the vehicle, v is the vehicle speed, and L is the wheelbase;
step 12, according to the kinematic relation shown in step 11, if the current pose of the vehicle is T_k, then the vehicle pose at the next moment can be predicted as shown in the following formula:
T_{k+1} = T_k T_Δ
in the above formula, T_Δ is the relative pose change of the vehicle from moment k to moment k+1;
step 13, the T_Δ in step 12 is calculated as follows:
$$T_\Delta = \begin{bmatrix} \exp(\phi^{\wedge}) & \rho \\ 0^{T} & 1 \end{bmatrix}, \qquad \phi = (0 \;\; 0 \;\; \Delta\gamma)^{T}, \qquad \rho = J\,\Delta t$$
where φ^ denotes the skew-symmetric matrix of the vector φ, and Δt is the time difference between moments k and k+1.
As a further improvement of the present invention, the data synchronization in step 1 specifically comprises the following steps:
step 14, marking system time for the laser point cloud data;
and step 15, interpolating in the vehicle state data stream according to the system time of the laser point cloud to obtain the vehicle state data approximate value of the moment of the laser point cloud.
As a further improvement of the present invention, the specific contents of the interference in step three of building the big data engine are as follows:
simulating the increase of the vehicle speed, and performing numerical simulation by using the existing data to generate data of a high-speed driving working condition;
simulating vehicle jolting, and applying preset short-time large pitching and roll angle transformation to the point cloud and the vehicle state track together by using the existing data to simulate the vibration working condition of the vehicle;
simulating a shielding working condition, and artificially filtering laser point cloud data of a plurality of angles, so that the deep convolutional neural network can cope with the working condition that the point cloud is rare;
and simulating different external parameter deviation values, and acquiring data with external parameter deviation by artificially adding preset external parameter deviation.
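The occlusion and extrinsic-bias interferences listed above can be sketched in Python; the function names, the (x, y, z) point format and a yaw-only bias are illustrative assumptions, not details taken from the patent:

```python
import math

def occlude(points, az_min, az_max):
    """Drop laser points whose azimuth falls in [az_min, az_max] (rad),
    simulating the occlusion condition so the network sees sparse clouds.
    Hypothetical sketch; each point is an (x, y, z) tuple."""
    kept = []
    for x, y, z in points:
        az = math.atan2(y, x)
        if not (az_min <= az <= az_max):
            kept.append((x, y, z))
    return kept

def add_yaw_bias(points, dyaw):
    """Apply a preset extrinsic yaw bias (rad) to every point, producing
    training data with a known external-parameter deviation label."""
    c, s = math.cos(dyaw), math.sin(dyaw)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]
```

A preset roll or pitch bias would be applied analogously with the corresponding rotation matrix.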
The method has the following advantages: 1) no mathematical model of the laser radar external-parameter deviation needs to be derived manually, so the method is easy to engineer and apply in practice; 2) the only third-party equipment required for external-parameter error estimation is the vehicle-mounted sensors, which are already necessary in an autonomous vehicle, so no additional equipment is added to the automatic driving system and the method is easy to apply to existing systems; 3) the method realizes real-time online calibration of the laser radar external parameters and can provide real-time external-parameter correction for the automatic driving system, thereby improving the environment-perception accuracy and reliability of the system and further improving its safety.
Drawings
FIG. 1 is a hardware topology diagram of the present invention;
FIG. 2 is a schematic diagram of the big data engine architecture of the present invention;
FIG. 3 is a schematic view of a kinematic model of a vehicle;
FIG. 4 is a schematic diagram of the model training architecture of the present invention;
FIG. 5 is a schematic diagram of a deep convolutional neural network model of the present invention;
FIG. 6 is a schematic diagram of the model deployment application architecture of the present invention.
Detailed Description
The invention will now be described in further detail with reference to the examples and the accompanying drawings.
As shown in fig. 1, the automatic-driving hardware involved in the invention comprises a laser radar, a vehicle-mounted ECU (Electronic Control Unit) and a computing unit. The laser radar is used for obstacle detection in the automatic driving system: it feeds the detected point cloud back to the system in real time, and the obstacle-detection algorithm analyzes obstacle types and positions from this point cloud. The vehicle-mounted ECU belongs to the original vehicle equipment and transmits the current vehicle speed, steering-wheel angle and INS (Inertial Navigation System) information over the CAN (Controller Area Network) bus; the INS information consists of the three-axis (x, y and z) acceleration values in the vehicle coordinate system. The laser radar is the research object of the invention, and the external-parameter error calculated by the invention is the pose calibration deviation of the laser radar. The vehicle-mounted ECU provides auxiliary information for the algorithm of the invention; the essence of calculating the laser radar external-parameter error with a deep convolutional neural network is that the vehicle trajectory implied by the change of consecutive multi-frame laser point clouds must be consistent with the trajectory implied by the change of the vehicle-mounted ECU information. The vehicle trajectory implied by the point cloud change is strongly nonlinear, and an analytic solution is not easily obtained by data modeling. The computing unit plays two main roles in the invention: 1) collecting and recording data when the big data engine is built; 2) running the deep network model during deployment to estimate the laser radar external-parameter error in real time for use by the automatic driving algorithm.
Note that in practical use, the automatic driving algorithm is also run in the calculation unit.
As shown in fig. 2, the big data engine of the present invention mainly comprises: an onboard portion and a server portion. The parameters of each neuron in the deep convolutional neural network can be converged to an effective value only through big data training, so that a big data engine needs to be built for collecting, processing and storing necessary training data for the deep convolutional neural network training. The vehicle-mounted part of the big data engine runs on an automatic driving vehicle and is responsible for collecting effective data; after the effective data are collected from the vehicle-mounted part, the data are artificially biased on the server so as to enrich the data quantity and the data types.
The on-board portion of the big data engine includes: sensor, preprocessing, data synchronization and data storage, for a total of 4 sub-parts. Wherein, the sensor mainly includes: 1) the laser radar is a research object of the invention and provides environmental point cloud data around the automatic driving vehicle; 2) and an in-vehicle ECU providing necessary state information of the vehicle, i.e., vehicle speed, steering wheel angle, and INS information. The pretreatment part mainly comprises the following steps: 1) analyzing point cloud data of the laser radar, namely converting original data of the laser radar into a common format required by an algorithm, and representing the space three-dimensional coordinates of a laser point; 2) and (3) analyzing the data of the vehicle-mounted ECU, and converting the original data of the ECU into common formats required by an algorithm from a CAN bus, namely a vehicle speed value, a steering wheel turning angle value and a triaxial acceleration value under a vehicle coordinate system (hereinafter, the three are collectively referred to as vehicle state data). The synchronization part is mainly used for aligning the laser point cloud data and the vehicle-mounted ECU data to the same time stamp and comprises the following steps: 1) marking the system time for the laser point cloud data; 2) and according to the system time of the laser point cloud, interpolating in the vehicle state data stream to obtain the vehicle state data approximate value of the moment of the laser point cloud. The storage section stores the aligned laser point cloud data and vehicle state data in a hard disk of the computing unit in time series.
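The synchronization step above, interpolating the vehicle-state stream at each laser point cloud's system time, can be sketched with a minimal linear interpolation (the function name and tuple-based state format are assumptions for illustration):

```python
from bisect import bisect_left

def interpolate_state(times, states, t_query):
    """Linearly interpolate the vehicle-state stream (speed, steering
    angle, accelerations, ...) at a laser point cloud timestamp.
    'times' must be sorted ascending; 'states' are equal-length tuples.
    Queries outside the stream clamp to the first/last sample."""
    i = bisect_left(times, t_query)
    if i == 0:
        return states[0]
    if i >= len(times):
        return states[-1]
    t0, t1 = times[i - 1], times[i]
    w = (t_query - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(states[i - 1], states[i]))
```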
The server part of the big data engine mainly comprises: an artificial bias adding part and a storage part. The purpose of artificial biasing is to generate more effective data by utilizing the existing data and enhance the robustness and the prediction precision of the deep convolutional neural network through specific data biasing contents. The biased content mainly comprises: 1) simulating the increase of the vehicle speed, and performing numerical simulation by using the existing data to generate data of a high-speed driving working condition; 2) simulating vehicle bump, and utilizing the existing data to jointly apply preset short-time large pitch and roll angle transformation to the point cloud and the vehicle state track (the term will be explained in figure 3) so as to simulate the vibration working condition of the vehicle; 3) simulating a shielding working condition, and artificially filtering laser point cloud data of a plurality of angles, so that the deep convolutional neural network can cope with the working condition that the point cloud is rare; 4) and simulating different external parameter deviation values, and acquiring data with external parameter deviation by artificially adding preset external parameter deviation. And a storage part, namely, storing the newly generated data in a time sequence for subsequent training. Note that before the artificial bias, the vehicle state data is converted into a vehicle state trajectory for subsequent processing. In the storage phase of the big data engine server part, the original vehicle state is not stored, but the calculated vehicle state track is stored.
The vehicle-mounted part of the big data engine runs in the automatic driving vehicle and collects the target working environment data of the automatic driving system. Before data acquisition each time, static calibration is carried out on external parameters of the laser radar through high-precision equipment to obtain accurate external parameter values which are used as references during model training. In the acquisition process, the speed of the vehicle is controlled to be below 30km/h, and the vehicle is kept to run stably as much as possible, so that the change of external parameters of equipment due to violent movement of the vehicle body is avoided. And after the acquisition is finished, copying the data of the vehicle-mounted part into a server for subsequent processing.
The server part of the big data engine runs in a background server, and generates more effective data after artificial biasing according to vehicle-mounted collected data so as to support big data training of the deep convolutional neural network. Wherein, the step of artificially biasing can be as follows: 1) the working condition types of the data are increased, and when the data are collected by a real vehicle, only the low-speed and stable running working condition exists, and the data under the high-speed and violent driving working condition can be obtained by artificially biasing; 2) the data volume is increased, the deep convolutional neural network is easy to overfit, and only when the data volume is large enough, the deep convolutional neural network can improve the generalization capability on the basis of ensuring the precision and enhance the system robustness.
Note that during data acquisition and storage, the external reference values measured by the high-precision device are also recorded as reference values during model training.
Fig. 3 shows the vehicle kinematics model used in this embodiment. In the figure, δ_o and δ_i are the turning angles of the left and right front wheels of the vehicle, O is the current rotation center of the vehicle, L is the wheelbase, L_w is the track width, and r is the turning radius of the vehicle center of mass. When the vehicle does not move violently, its state change conforms to the vehicle kinematic model shown in fig. 3: the center of mass moves along an arc centered at O with radius r, and its circumferential speed is the longitudinal vehicle speed v. The vehicle state data obtained from the vehicle-mounted ECU cannot be input to the deep convolutional neural network directly; it must first be processed into a vehicle state trajectory, i.e., a sequence of vehicle poses T_i (each comprising x, y, z, α, β and γ, that is, the longitudinal, lateral and vertical positions and the roll, pitch and yaw angles). Since the motion of the vehicle conforms to the vehicle kinematics model, the vehicle state trajectory can be derived from the vehicle state data together with the model of fig. 3. When the vehicle moves steadily at low speed, the centroid slip angle and the tire slip angles are both negligible, and the following relationship holds.
$$\dot{\gamma} = \frac{v\tan\delta}{L}$$
In the above formula, δ is the average steering angle of the front wheels of the vehicle, i.e., the mean of δ_o and δ_i; v is the vehicle speed; and β is the vehicle pitch angle, taken as zero under the flat-ground assumption. A fixed proportional relationship exists between the average front-wheel steering angle δ and the steering-wheel angle δ_w, i.e., δ_w = λδ, where λ is a coefficient. Although λ varies with the steering-wheel angle, this variation is fixed, does not change with time or environment, and can be measured in advance. According to the kinematic relation above, if the current pose of the vehicle is T_k, the vehicle pose at the next moment (i.e., moment k+1) can be predicted as shown in the following equation.
T_{k+1} = T_k T_Δ
In the above formula, T_Δ is the relative pose change of the vehicle from moment k to moment k+1. In this embodiment, the pose of the vehicle is represented as a spatial transformation matrix T, whose structure is shown in the following formula, where R is the rotation matrix equivalent to the vehicle's α, β and γ, and t is the translation vector, i.e., the vehicle's x, y and z.
$$T = \begin{bmatrix} R & t \\ 0^{T} & 1 \end{bmatrix}$$
The foregoing T_Δ is calculated as follows, where φ^ denotes the skew-symmetric matrix of the vector φ and Δt is the time difference between moments k and k+1.
$$T_\Delta = \begin{bmatrix} \exp(\phi^{\wedge}) & \rho \\ 0^{T} & 1 \end{bmatrix}$$
$$\phi = (0 \;\; 0 \;\; \Delta\gamma)^{T}$$
$$\rho = J\,\Delta t$$
$$\Delta\gamma = \frac{v\tan\delta}{L}\,\Delta t$$
$$\phi^{\wedge} = \begin{bmatrix} 0 & -\Delta\gamma & 0 \\ \Delta\gamma & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$$
$$\exp(\phi^{\wedge}) = I + \frac{\sin\lVert\phi\rVert}{\lVert\phi\rVert}\,\phi^{\wedge} + \frac{1-\cos\lVert\phi\rVert}{\lVert\phi\rVert^{2}}\,(\phi^{\wedge})^{2}$$
$$J = (v \;\; 0 \;\; 0)^{T}$$
Therefore, the poses of the vehicle at several future moments can be predicted from the vehicle state (i.e., the steering-wheel angle and vehicle-speed information), and linking these poses forms the vehicle state trajectory.
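A minimal sketch of building the vehicle state trajectory by chaining T_{k+1} = T_k T_Δ under the planar kinematic bicycle model discussed above; the function names, the (v, δ, Δt) control format and the zero roll/pitch simplification are illustrative assumptions, not the patent's exact formulation:

```python
import math
import numpy as np

def step_matrix(v, delta, wheelbase, dt):
    """Homogeneous 4x4 pose increment T_delta for one step of the planar
    kinematic model (vertical motion, roll and pitch held at zero).
    v: speed [m/s], delta: average front-wheel angle [rad]."""
    dyaw = v * math.tan(delta) / wheelbase * dt   # yaw change over dt
    if abs(dyaw) < 1e-9:                          # straight-line limit
        dx, dy = v * dt, 0.0
    else:
        r = v * dt / dyaw                         # arc radius
        dx, dy = r * math.sin(dyaw), r * (1.0 - math.cos(dyaw))
    T = np.eye(4)
    c, s = math.cos(dyaw), math.sin(dyaw)
    T[0, 0], T[0, 1], T[1, 0], T[1, 1] = c, -s, s, c
    T[0, 3], T[1, 3] = dx, dy
    return T

def trajectory(T0, controls, wheelbase):
    """Chain T_{k+1} = T_k @ T_delta over (v, delta, dt) samples to form
    the vehicle state trajectory that is fed to the network."""
    poses = [T0]
    for v, delta, dt in controls:
        poses.append(poses[-1] @ step_matrix(v, delta, wheelbase, dt))
    return poses
```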
Note that the INS data is not used in the above process; it is mainly used to judge whether the vehicle is driving steadily on flat ground. When the vehicle is in a flat-ground steady driving state, the z-axis acceleration value of the INS is close to the gravitational acceleration (error less than 2 m/s²), and the x- and y-axis acceleration values remain in a small range (between -2 m/s² and 2 m/s²). When the INS data exceeds this range, the vehicle is considered to be in violent motion and the data is not recorded.
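The flat-ground smooth-driving gate on the INS data follows directly from the stated thresholds (the function name and g value are illustrative assumptions):

```python
def is_smooth_driving(ax, ay, az, g=9.81):
    """Return True when the INS readings indicate flat-ground steady
    driving per the thresholds above: z-axis acceleration within
    2 m/s^2 of gravity, x- and y-axis within +-2 m/s^2. Data frames
    failing this check are not recorded."""
    return abs(az - g) < 2.0 and abs(ax) <= 2.0 and abs(ay) <= 2.0
```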
As shown in FIG. 4, the model training of the invention uses the backward-correction mechanism of the deep convolutional neural network. Big data is first read from storage and parsed into the formats commonly used by the algorithm, i.e., the spatial three-dimensional coordinates of the laser points and the spatial 6-degree-of-freedom poses of the vehicle state trajectory. The laser point cloud and vehicle-state-trajectory data are then input to the deep convolutional neural network model, which computes a deviation value of the external parameters. This predicted deviation is compared with the external-parameter reference value stored in memory to obtain the prediction residual of the network. Differentiating the prediction residual with respect to the neuron parameters of the model yields the backward correction values of the neuron parameters. Finally, the neuron parameters of the deep convolutional neural network are corrected with these values, the next iteration begins, and the process repeats to continuously optimize the neuron parameters so that the network acquires the ability to predict laser radar external-parameter errors.
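The backward-correction cycle just described (forward prediction, residual against the calibrated reference, derivative of the residual with respect to the parameters, parameter update) can be illustrated with a single linear layer standing in for the deep network. This is a didactic sketch under a squared-error loss, not the patent's actual model:

```python
import numpy as np

def train_step(W, x, y_true, lr=0.05):
    """One backward-correction iteration on a linear stand-in network.
    W: parameter matrix, x: input features, y_true: reference
    external-parameter deviation from static calibration."""
    y_pred = W @ x                 # forward pass: predict extrinsic deviation
    residual = y_pred - y_true     # prediction residual vs. stored reference
    grad = np.outer(residual, x)   # d(0.5*||residual||^2)/dW
    return W - lr * grad           # backward correction of the parameters
```

Iterating this step drives the prediction residual toward zero, which is the convergence behavior the big-data training relies on.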
Fig. 5 shows the specific structure of the deep convolutional neural network model of this embodiment, which mainly includes: data-stream input, full convolution layers, pooling layers, a pyramid pooling layer and fully connected layers; in the figure, W denotes the depth-map width in pixels and H the depth-map height in pixels. Taking the data processing of a 16-line Velodyne laser radar as an example in fig. 5, each frame of laser point cloud contains 16 lines of data, and each line contains thousands of data points. In the invention, each frame of point cloud data is first converted into a depth map of 1800 × 16 × 1 pixels (width × height × grayscale). Meanwhile, to use the vehicle state trajectory data as input to the neural network, the vehicle pose at the moment corresponding to the laser point cloud is extracted from the vehicle state trajectory and converted into a 1800 × 16 × 1 depth map whose pixel values are the vehicle's spatial 6-degree-of-freedom pose values. The point-cloud depth map and the vehicle-pose depth map are superposed to form a 1800 × 16 × 2 mixed depth map. Only the motion across multiple frames produces a vehicle motion path from which the laser radar external parameters can be estimated, so the invention caches 9 frames of historical mixed depth maps and splices them with the current frame into one large depth map of 1800 × 160 × 2 pixels, which serves as the input of the deep convolutional neural network. As shown in fig. 5, after the large depth map is input, it passes sequentially through the full convolution, pooling, pyramid pooling and fully connected layers to obtain the predicted laser radar external parameters, described as a 6-dimensional Lie algebra vector. Differencing this prediction with the accurate external parameters stored in the database yields the external-parameter error, which is used to compute the neuron correction terms; the neuron parameter values are then adjusted through back-propagation of these correction terms.
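A heavily simplified PyTorch stand-in for the architecture just described, taking the 1800 × 160 × 2 large depth map and emitting a 6-dimensional extrinsic estimate; the channel counts are invented, and the pyramid pooling layer is approximated here by a single adaptive average pool:

```python
import torch
import torch.nn as nn

class ExtrinsicNet(nn.Module):
    """Sketch of the patent's pipeline: convolution, pooling, a
    pyramid-style global pooling (approximated by AdaptiveAvgPool2d),
    and a fully connected head emitting the 6-dim Lie-algebra
    external-parameter estimate. Layer sizes are illustrative only."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),  # 2 input channels: range + pose maps
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),               # crude pyramid-pooling stand-in
        )
        self.head = nn.Linear(32 * 4 * 4, 6)            # 6-dim extrinsic output

    def forward(self, x):                               # x: (N, 2, H, W) mixed depth map
        f = self.features(x)
        return self.head(f.flatten(1))
```

A forward pass on a zero tensor of the stated input size yields one 6-dimensional prediction per batch element.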
Fig. 6 shows the deployment application system architecture of the deep neural network model of the present invention, which mainly comprises a sensor part and a software part. After training is completed, the deep convolutional neural network is deployed in the automatic driving system of a real vehicle to calculate the extrinsic parameter deviation of the laser radar in real time. The sensor part of the deployment architecture includes: 1) the laser radar, which is the subject of the extrinsic parameter deviation; 2) the vehicle-mounted ECU, which provides auxiliary information so that the deep convolutional neural network can estimate accurate extrinsic parameters. The software part of the deployment architecture includes three sections: preprocessing, synchronization, and the model. The preprocessing section mainly performs: 1) parsing of the laser radar point cloud data, i.e., converting the raw laser radar data into the common format required by the algorithm, represented by the spatial three-dimensional coordinates of the laser points; 2) parsing of the vehicle-mounted ECU data, i.e., converting the raw CAN data of the ECU into the format required by the algorithm, namely the vehicle state trajectory, using the vehicle kinematic model of fig. 3. The synchronization section mainly aligns the laser point cloud data and the vehicle-mounted ECU data to the same time stamp, and comprises: 1) stamping the laser point cloud data with the system time; 2) interpolating in the vehicle state trajectory data stream according to the system time of the laser point cloud to obtain approximate vehicle pose data at the moment of the laser point cloud.
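The synchronization step 2) above can be sketched as follows. Purely as an illustrative assumption: the patent does not specify the interpolation scheme, so this sketch interpolates linearly between the two bracketing trajectory samples; in practice the rotational pose components would need spherical (e.g., quaternion) interpolation rather than the element-wise blend shown here.

```python
def interpolate_pose(traj, t):
    """traj: time-sorted list of (timestamp, pose6) from the vehicle state trajectory.
    Returns the pose linearly interpolated at laser point cloud system time t."""
    if t <= traj[0][0]:
        return traj[0][1]          # clamp before the first sample
    if t >= traj[-1][0]:
        return traj[-1][1]         # clamp after the last sample
    for (t0, p0), (t1, p1) in zip(traj, traj[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)   # interpolation weight in [0, 1]
            return tuple((1 - a) * u + a * v for u, v in zip(p0, p1))
```

For a lidar timestamp halfway between two trajectory samples, the returned pose is the midpoint of the two recorded poses.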
The model section mainly performs: 1) converting the laser point cloud data into a depth map; 2) converting the vehicle pose into a depth map; 3) stacking the laser depth map and the vehicle pose depth map to form a mixed depth map; 4) caching 9 frames of historical mixed depth maps and splicing them with the current frame's mixed depth map into a large depth map; 5) feeding the large depth map into the deep convolutional neural network to obtain the estimated laser radar extrinsic parameters; 6) subtracting the extrinsic parameters of the original static calibration from the estimated extrinsic parameters to obtain the extrinsic parameter error.
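The glue logic of steps 4)–6) can be sketched as below. This is a hypothetical harness, not the patented software: `model` stands in for the trained network (any callable mapping the large map to a 6-vector), and the final subtraction treats the Lie-algebra extrinsics element-wise, which is only a small-deviation approximation of composition on SE(3).

```python
from collections import deque

class CalibrationPipeline:
    """Buffers 9 historical mixed maps; once full, splices them with the current
    frame, queries the (stubbed) network, and subtracts the static calibration."""
    def __init__(self, model, static_extrinsic):
        self.history = deque(maxlen=9)     # 9 historical mixed depth maps
        self.model = model
        self.static = static_extrinsic     # extrinsics from static calibration (6-vector)

    def step(self, mixed):
        if len(self.history) < 9:
            self.history.append(mixed)
            return None                    # not enough frames to estimate motion yet
        # splice history + current frame along the height dimension
        big = [row for m in list(self.history) + [mixed] for row in m]
        est = self.model(big)              # estimated laser radar extrinsics (6-DOF)
        self.history.append(mixed)         # deque drops the oldest frame automatically
        return tuple(e - s for e, s in zip(est, self.static))
```

The first nine calls return `None`; every subsequent call yields an extrinsic parameter error for the automatic driving system.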
The process shown in fig. 6 runs online to provide real-time laser radar extrinsic parameter errors to the automatic driving system, thereby improving its environmental perception capability.
The above description is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above embodiment; all technical solutions within the concept of the present invention belong to its protection scope. It should be noted that modifications and refinements made by those skilled in the art without departing from the principle of the present invention are also considered to be within the protection scope of the present invention.

Claims (5)

1. An automatic driving system laser radar online calibration method fusing vehicle-mounted information, characterized by comprising the following steps:
step 1, building a big data engine, wherein the big data engine comprises a vehicle-mounted part and a server part; the vehicle-mounted part comprises sensors, preprocessing, data synchronization and data storage, and the sensors comprise:
a laser radar providing environmental point cloud data around the autonomous vehicle;
a vehicle-mounted ECU, which provides the necessary vehicle state information, namely the vehicle speed, the steering wheel angle and the INS information;
after the environmental point cloud data and the necessary vehicle state information are received, the laser radar point cloud data are parsed through preprocessing, i.e., the raw laser radar data are converted into the common format required by the algorithm, represented by the spatial three-dimensional coordinates of the laser points; the vehicle-mounted ECU data are likewise parsed, converting the raw ECU data from the CAN bus into the common format required by the algorithm, namely the vehicle speed value, the steering wheel angle value and the three-axis acceleration values in the vehicle coordinate system;
step 2, building an initial deep convolutional neural network model using C++ programming and storing it; deploying the built deep convolutional neural network model on a server, inputting the data generated by the big data engine into the model for model training, and storing the trained deep convolutional neural network model;
step 3, loading the trained deep convolutional neural network model into the automatic driving system, and connecting the laser radar data, the vehicle-mounted sensor data and the deep convolutional neural network model in software; during normal operation of the automatic driving system, the deep convolutional neural network model collects the laser radar data and the vehicle-mounted sensor data in real time, compares their deviation in real time, and gives the extrinsic parameter error value of the laser radar online.
2. The automatic driving system laser radar online calibration method fusing vehicle-mounted information according to claim 1, characterized in that the specific steps of building the big data engine in step 1 are as follows:
step one, installing a laser radar for the target automatic driving system, connecting the vehicle-mounted sensor interfaces, and completing the static calibration of the laser radar with precision instruments to obtain the extrinsic parameters of the laser radar;
step two, driving a vehicle carrying the calibrated automatic driving system in the target working area and scene, keeping the driving speed below 30 km/h and ensuring that the vehicle does not jolt violently, while collecting laser point cloud and vehicle-mounted sensor data and recording them through software;
step three, applying artificial interference to the collected data to simulate laser radar point cloud data and vehicle-mounted sensor data under violent vehicle motion, and recording the data through software.
3. The automatic driving system laser radar online calibration method fusing vehicle-mounted information according to claim 2, characterized in that: in step 1, the data of the vehicle-mounted ECU are parsed by processing them into a vehicle state trajectory, wherein the vehicle state trajectory is derived from a vehicle kinematic model, and the vehicle kinematic model is derived through the following steps:
step 11, when the vehicle moves steadily at low speed, the centroid side-slip angle and the tire slip angle are both negligible, so that the following relationship holds:
Figure FDA0003354398230000021
in the above formula, δ is the average turning angle of the front wheels of the vehicle, and v is the vehicle speed;
step 12, according to the kinematic relationship in step 11, if the current pose of the vehicle is T_k, the vehicle pose at the next moment can be predicted as shown in the following formula:
T_{k+1} = T_k T_Δ
in the above formula, T_Δ is the relative pose change of the vehicle from time k to time k+1;
step 13, T_Δ in step 12 is calculated as follows:
Figure FDA0003354398230000022
where φ^ denotes the antisymmetric matrix of the vector φ, and Δt denotes the time difference between moments k and k+1.
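The pose prediction T_{k+1} = T_k T_Δ of step 12 can be illustrated with a planar (SE(2)) sketch. The claim's formula images are not reproduced above, so this sketch assumes the standard kinematic bicycle model with yaw rate ω = v·tan δ / L, where the wheelbase L does not appear in the claim text and is an assumption of this illustration; it is not the patented derivation.

```python
import math

def matmul3(A, B):
    """3x3 matrix product for homogeneous planar poses."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def pose_delta(v, delta, L, dt):
    """Relative motion T_delta over dt at speed v with mean front-wheel angle delta
    and wheelbase L, under the negligible-slip assumption of step 11."""
    omega = v * math.tan(delta) / L            # yaw rate (bicycle-model assumption)
    th = omega * dt
    if abs(omega) < 1e-9:                      # straight-line limit
        dx, dy = v * dt, 0.0
    else:                                      # exact arc for constant v, omega
        dx = v / omega * math.sin(th)
        dy = v / omega * (1.0 - math.cos(th))
    c, s = math.cos(th), math.sin(th)
    return [[c, -s, dx], [s, c, dy], [0.0, 0.0, 1.0]]

def predict(T_k, v, delta, L, dt):
    """T_{k+1} = T_k * T_delta, the planar analogue of claim 3, step 12."""
    return matmul3(T_k, pose_delta(v, delta, L, dt))
```

Driving straight (δ = 0) at 1 m/s for 1 s from the identity pose advances the vehicle 1 m along its heading, as expected.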
4. The automatic driving system laser radar online calibration method fusing vehicle-mounted information according to claim 3, characterized in that the data synchronization in step 1 specifically comprises the following steps:
step 14, marking system time for the laser point cloud data;
step 15, interpolating in the vehicle state data stream according to the system time of the laser point cloud to obtain the approximate vehicle state data at the moment of the laser point cloud.
5. The automatic driving system laser radar online calibration method fusing vehicle-mounted information according to claim 3 or 4, characterized in that the specific contents of the interference in step three are as follows:
simulating an increase in vehicle speed: performing numerical simulation with the existing data to generate data for high-speed driving conditions;
simulating vehicle jolting: applying preset short-duration large pitch and roll angle transformations jointly to the point cloud and the vehicle state trajectory in the existing data to simulate vehicle vibration conditions;
simulating occlusion conditions: artificially filtering out the laser point cloud data of several angular sectors, so that the deep convolutional neural network can cope with conditions where the point cloud is sparse;
simulating different extrinsic parameter deviation values: obtaining data with extrinsic parameter deviation by artificially adding preset extrinsic parameter deviations.
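Three of the interference types above can be sketched as simple data transforms. The magnitudes, the angular sector, and the timestamp-compression trick for emulating higher speed are illustrative assumptions of this sketch, not the patented procedure; the jolt simulation would apply a pitch/roll rotation to both the point cloud and the trajectory in the same spirit.

```python
import math

def simulate_speedup(traj, factor):
    """Compress timestamps to emulate higher driving speed: (t, pose) -> (t/factor, pose)."""
    return [(t / factor, pose) for t, pose in traj]

def simulate_occlusion(points, az_lo, az_hi):
    """Drop laser points whose azimuth (radians) falls in [az_lo, az_hi],
    mimicking an occluded angular sector."""
    return [p for p in points if not az_lo <= math.atan2(p[1], p[0]) <= az_hi]

def add_extrinsic_bias(points, dx, dy, dz):
    """Shift the whole cloud by a preset translation to fake an extrinsic deviation
    (a rotational bias would be applied analogously)."""
    return [(x + dx, y + dy, z + dz, *rest) for x, y, z, *rest in points]
```

Each transform leaves the rest of the recorded data untouched, so perturbed and clean copies of the same drive can both feed the training set.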
CN202010026379.9A 2020-01-10 2020-01-10 Automatic driving system laser radar online calibration method fusing vehicle-mounted information Expired - Fee Related CN111142091B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010026379.9A CN111142091B (en) 2020-01-10 2020-01-10 Automatic driving system laser radar online calibration method fusing vehicle-mounted information

Publications (2)

Publication Number Publication Date
CN111142091A CN111142091A (en) 2020-05-12
CN111142091B (en) 2021-12-24

Family

ID=70524389

Country Status (1)

Country Link
CN (1) CN111142091B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111983616A (en) * 2020-07-31 2020-11-24 嘉善新石器智牛科技有限公司 Automatic adjusting system and method for unmanned vehicle radar and unmanned vehicle
CN112212872B (en) * 2020-10-19 2022-03-11 合肥工业大学 End-to-end automatic driving method and system based on laser radar and navigation map
CN112287557B (en) * 2020-11-09 2023-04-07 东风汽车集团有限公司 Radar point cloud data loop playback method and system for assisting driving simulation test
CN112379353B (en) * 2020-11-10 2022-10-25 上海交通大学 Combined calibration method and system among multiple target laser radars
CN112396664B (en) * 2020-11-24 2022-03-25 华南理工大学 Monocular camera and three-dimensional laser radar combined calibration and online optimization method
CN112731320A (en) * 2020-12-29 2021-04-30 福瑞泰克智能系统有限公司 Method, device and equipment for estimating error data of vehicle-mounted radar and storage medium
CN112693466A (en) * 2021-01-29 2021-04-23 重庆长安汽车股份有限公司 System and method for evaluating performance of vehicle environment perception sensor

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103091667A (en) * 2013-01-10 2013-05-08 清华大学 Vehicle-borne radar calibration device and calibration method
CN103226833A (en) * 2013-05-08 2013-07-31 清华大学 Point cloud data partitioning method based on three-dimensional laser radar
CN107044856A (en) * 2016-12-30 2017-08-15 袁重德 A kind of Centimeter Level tuning on-line method of highway driving vehicle
CN108227707A (en) * 2017-12-25 2018-06-29 清华大学苏州汽车研究院(吴江) Automatic Pilot method based on laser radar and end-to-end deep learning method
CN109472831A (en) * 2018-11-19 2019-03-15 东南大学 Obstacle recognition range-measurement system and method towards road roller work progress
CN109949371A (en) * 2019-03-18 2019-06-28 北京智行者科技有限公司 A kind of scaling method for laser radar and camera data
CN110097620A (en) * 2019-04-15 2019-08-06 西安交通大学 High-precision map creation system based on image and three-dimensional laser
CN110161485A (en) * 2019-06-13 2019-08-23 同济大学 A kind of outer ginseng caliberating device and scaling method of laser radar and vision camera

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106599827A (en) * 2016-12-09 2017-04-26 浙江工商大学 Small target rapid detection method based on deep convolution neural network
CN107300863B (en) * 2017-07-12 2020-01-10 吉林大学 Longitudinal acceleration control method based on MAP graph and online calibration
US10436885B2 (en) * 2017-10-19 2019-10-08 DeepMap Inc. Calibrating sensors mounted on an autonomous vehicle
CN109444911B (en) * 2018-10-18 2023-05-05 哈尔滨工程大学 Unmanned ship water surface target detection, identification and positioning method based on monocular camera and laser radar information fusion

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Calibration method between dual 3D lidar sensors for autonomous vehicles; Taehyeong Kim et al.; 2017 56th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE); 2017-11-13; full text *
Extrinsic parameter calibration of a 3D lidar on an unmanned ground platform; Cheng Ziyang et al.; 《应用激光》 (Applied Laser); 2019-02-28; Vol. 39, No. 1; full text *

Similar Documents

Publication Publication Date Title
CN111142091B (en) Automatic driving system laser radar online calibration method fusing vehicle-mounted information
CN110658531B (en) Dynamic target tracking method for port automatic driving vehicle
KR102581263B1 (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN113819914A (en) Map construction method and device
JP2022019642A (en) Positioning method and device based upon multi-sensor combination
CN112083726B (en) Park-oriented automatic driving double-filter fusion positioning system
CN111257853B (en) Automatic driving system laser radar online calibration method based on IMU pre-integration
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN117836653A (en) Road side millimeter wave radar calibration method based on vehicle-mounted positioning device
CN110987463A (en) Multi-scene-oriented intelligent driving autonomous lane change performance test method
CN112977411A (en) Intelligent chassis control method and device
CN113238251B (en) Target level semantic positioning method based on vehicle-mounted laser radar
CN112099378B (en) Front vehicle lateral motion state real-time estimation method considering random measurement time lag
CN111452786B (en) Obstacle avoidance method and system for unmanned vehicle
CN113252022A (en) Map data processing method and device
CN113947639A (en) Self-adaptive online estimation calibration system and method based on multi-radar-point cloud line characteristics
CN113252051A (en) Map construction method and device
CN111649740A (en) Method and system for high-precision positioning of vehicle based on IMU
CN115046540A (en) Point cloud map construction method, system, equipment and storage medium
CN114942642A (en) Unmanned automobile track planning method
CN114879207A (en) Ultrasonic obstacle avoidance method for L4-level automatic driving vehicle
CN111829514B (en) Road surface working condition pre-aiming method suitable for vehicle chassis integrated control
CN111103578B (en) Laser radar online calibration method based on deep convolutional neural network
CN111708010B (en) Mobile equipment positioning method, device and system and mobile equipment
Wang et al. Extraction of preview elevation information based on terrain mapping and trajectory prediction in real-time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20211224