CN109459750B - Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision - Google Patents
- Publication number: CN109459750B
- Application number: CN201811219589.9A
- Authority
- CN
- China
- Prior art keywords
- millimeter wave
- wave radar
- track
- vehicle
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
- G01S13/726—Multiple target tracking
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention relates to a front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision. The millimeter wave radar acquires data on the area ahead; invalid returns are eliminated according to echo reflection intensity and width, so that only information on vehicles ahead is retained. By fusing the millimeter wave radar with the camera, motion tracks are generated and track association is performed using filtered radar information and an online tracking model. Preceding vehicles whose tracks have been associated are recorded and numbered. For vehicles whose tracks have already been generated and numbered, the data of the next period are simply processed by repeating the above steps and subjected to a consistency check, so that the new data are appended to the numbered tracks. For newly appearing vehicles, track generation, track association and numbering are performed according to the initial steps. The invention combines the advantages of millimeter wave radar and visual deep learning, and can effectively improve the accuracy and robustness of tracking multiple vehicle targets ahead.
Description
Technical Field
The invention belongs to the technical field of multi-target tracking and relates to a driver-assistance method for intelligent vehicles, in particular to an information-fusion method for tracking multiple vehicles ahead that fuses millimeter wave radar with deep learning vision.
Background
Unmanned vehicles have become a research hotspot, and environmental perception is an important element in achieving intelligent driving. Tracking, as an important part of environmental perception, is receiving increasing attention from researchers. When a single sensor is used for perception and tracking, low precision, poor stability and a high false alarm rate are persistent problems, so fusing multiple sensors to achieve tracking has become an active research direction. Millimeter wave radar offers high operating stability, works reliably in a variety of environments and has a long detection range, but its target recognition capability is poor. Multi-target tracking based on deep learning has risen in recent years; trained on large numbers of samples, it has good object recognition capability and can accurately identify the category of an object ahead and generate its motion track. A deep neural network with many layers recognizes well but computes slowly, and its performance degrades at long range. Combining millimeter wave radar with a deep learning network of fewer layers to track multiple vehicles ahead is therefore a new attempt with broad application value.
Disclosure of Invention
The invention aims to provide a front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision that addresses the defects and shortcomings of single-sensor approaches in the prior art.
The invention aims at realizing the following technical scheme:
a forward multi-vehicle tracking method integrating millimeter wave radar and deep learning vision comprises the following steps:
A. establishing a coordinate conversion relation between a millimeter wave radar coordinate system and a vision sensor (camera) coordinate system, unifying the coordinate systems of the millimeter wave radar coordinate system and the vision sensor (camera) coordinate system, and sampling by using the lowest sampling frequency of the millimeter wave radar and the camera so as to keep the consistency in time;
B. receiving data of the millimeter wave radar, resolving according to a certain rule, and performing corresponding processing, so that a front vehicle is screened out, and an invalid target is removed;
C. filtering the millimeter wave radar data received in the step B, tracking a front vehicle by using Kalman filtering, and generating a track to number the track; when a new vehicle appears, after the data acquired by the radar are calculated and determined, tracking is performed to generate a new track and a number is added;
D. preprocessing an image acquired by a camera;
E. training the deep learning neural network offline to identify the vehicles in front;
F. the image preprocessed in step D is sent to the deep learning neural network pre-trained in step E; the vehicle targets ahead are detected and localized, yielding the positions of the multiple vehicles ahead in the image together with detection confidences (vehicle detections with confidence lower than M percent are deleted), and the vehicles are numbered;
G. f, tracking the vehicle detected in the step F by using an online tracking model, generating a running track of the vehicle and numbering;
H. the data processing centers of the millimeter wave radar and the camera each send their local tracks to a data fusion center; the fusion center performs data fusion on the distance and coordinate relations of the radar and camera outputs, and track association is carried out between the millimeter wave radar tracks and the camera tracks;
I. repeating the steps, and updating the track to obtain a tracking result.
Further, in the step a, the step of unifying the coordinate system includes:
a1, establishing a conversion relation between a millimeter wave radar and a world three-dimensional coordinate system, wherein the millimeter wave radar coordinate system is a two-dimensional coordinate system of a horizontal plane;
a2, establishing a conversion relation between a camera coordinate system and a three-dimensional world coordinate system, wherein the camera coordinate system is a two-dimensional coordinate system of a vertical plane;
A3. combining the radar-to-world and camera-to-world coordinate relations derived in steps A1 and A2, the coordinate relation between the millimeter wave radar and the camera image is deduced as follows:
further, the step A1 specifically includes the following steps:
A11、X 0 O 0 z is the coordinate system of millimeter wave radar, the coordinate plane of Z is parallel to the XOZ plane of O-XYZ of the three-dimensional world coordinate system, and X 0 O 0 The Z plane is located at a Y1 position below the XOZ plane, Y1 is the installation height of the millimeter wave radar, and the XOZ plane of the three-dimensional world coordinate system O-XYZ is projected to the millimeter wave radar coordinate system X 0 O 0 On Z, OX axis and O 0 X 0 Distance Z between 0 O is the origin of the world coordinate system, and O is the origin of the millimeter wave radar coordinate system, namely the installation position of the millimeter wave radar;
A12. suppose a preceding vehicle M is found within the scanning range of the millimeter wave radar, at relative distance R and relative angle α from the radar, i.e. |MO₀| = R (in mm) and ∠MO₀Z = α (in degrees);
A13. transferring the vehicle target from the millimeter wave radar coordinate system to the three-dimensional world coordinate system gives X = R·sin α and Z = Z₀ + R·cos α.
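The polar-to-Cartesian conversion of step A13 can be sketched as follows (Python is chosen purely for illustration; the patent specifies no implementation language):

```python
import math

def radar_to_world(r_mm: float, alpha_deg: float, z0_mm: float) -> tuple:
    """Project a radar target (range R, azimuth alpha) onto the XOZ plane
    of the three-dimensional world coordinate system.
    z0_mm is the offset Z0 between the OX axis and the radar's O0X0 axis."""
    alpha = math.radians(alpha_deg)
    x = r_mm * math.sin(alpha)          # X = R * sin(alpha)
    z = z0_mm + r_mm * math.cos(alpha)  # Z = Z0 + R * cos(alpha)
    return x, z
```

A target dead ahead (α = 0) maps to X = 0 and Z = Z₀ + R, as expected.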
Further, the step A2 specifically includes the following steps:
A21. the camera image coordinate system is a two-dimensional coordinate system X₁O₁Y₁ in a vertical plane, with origin O₁; its coordinate plane is parallel to the XOY plane of the three-dimensional world coordinate system O-XYZ. Here O, the origin of the world coordinate system, is also the center of the camera lens, so |OO₁| = f, where f is the effective focal length of the camera in mm.
A22. the camera is mounted with its optical axis parallel to the ground, i.e. the Y value in the three-dimensional world coordinate system remains constant: Y = Y₀, where Y₀ is the mounting height of the camera in mm.
A23. the vehicle target M(X, Y₀, Z) is converted to the image plane of the camera coordinate system; the conversion relationship is as follows:
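The conversion relation itself appears only as an image in the original publication. Under the stated assumptions (optical axis parallel to the ground, effective focal length f), a standard pinhole projection would give:

```latex
x = \frac{f \, X}{Z}, \qquad y = \frac{f \, Y_0}{Z}
```

where (x, y) are the image-plane coordinates of the target M(X, Y₀, Z). This is a reconstruction from the surrounding definitions, not the patent's own formula.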
Further, the step B specifically includes the following steps, as shown in fig. 4:
B1. the forward data received by the millimeter wave radar comprise the distance (range), angle, relative speed, reflection intensity (power) and width of each target ahead;
b2, calculating the received data by utilizing a calculation protocol specified by the millimeter wave radar, and removing the static target and the invalid target;
B3. target screening is performed according to the reflection intensity and width of the targets ahead: a reflection intensity threshold u₀ and a width threshold v₀ are set, and when the reflection intensity is greater than or equal to u₀ and the width is greater than or equal to v₀, the target is confirmed as a vehicle.
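The screening rule of step B3 can be sketched as below. The field names and the threshold values u₀, v₀ are illustrative; the patent leaves the calibration values unspecified:

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:
    distance: float   # mm
    angle: float      # degrees
    rel_speed: float  # mm/s
    power: float      # echo reflection intensity
    width: float      # mm

def screen_targets(targets, u0: float, v0: float):
    """Keep only targets whose reflection intensity >= u0 and width >= v0;
    weaker or narrower returns are treated as invalid (non-vehicle) targets."""
    return [t for t in targets if t.power >= u0 and t.width >= v0]
```

Static-target removal (step B2) would run before this filter, using the radar's own resolution protocol.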
Further, the step C specifically includes the following steps:
C1. the vehicle state of the next period is predicted using Kalman filtering, the actual millimeter wave radar measurements of the next period are read, and the predicted state is matched against the measured state (four frames are taken as one period, with a step size of two frames);
and C2, repeating the step C1 for the newly-appearing vehicle, renumbering the newly-appearing vehicle, and generating a new tracking track.
Further, the step C1 specifically includes the following steps:
c12, predicting the state of the next period of the detected front multi-vehicle target by using a Kalman filtering algorithm;
c13, comparing the actual measured value of the next period of the front multi-vehicle target with the predicted value of the previous period, and performing consistency test;
C14. for targets meeting the consistency requirement, the target's data are updated and the prediction for the next period is carried out; when two consecutive periods meet the consistency requirement, a motion track is generated. Targets that do not meet the consistency requirement are regarded as newly appearing vehicles and temporarily retained; vehicles not detected in the next period are regarded as having disappeared.
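Steps C12–C14 (predict, consistency test, update) can be sketched with a minimal constant-velocity Kalman filter. The state layout, noise values and the 3-sigma gate are illustrative assumptions; the patent does not give them:

```python
import numpy as np

class TrackKF:
    """Minimal 1-D constant-velocity Kalman filter for one radar target.
    State: [position, velocity]; only position is measured."""
    def __init__(self, pos, vel, dt=1.0):
        self.x = np.array([pos, vel], dtype=float)
        self.P = np.eye(2) * 10.0                   # initial uncertainty
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # motion model
        self.H = np.array([[1.0, 0.0]])             # measurement model
        self.Q = np.eye(2) * 0.1                    # process noise
        self.R = np.array([[1.0]])                  # measurement noise

    def predict(self):
        """C12: predict the state of the next period."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[0]

    def consistent(self, z, gate=3.0):
        """C13: consistency test -- innovation within `gate` sigma."""
        y = z - (self.H @ self.x)[0]
        s = (self.H @ self.P @ self.H.T + self.R)[0, 0]
        return abs(y) <= gate * np.sqrt(s)

    def update(self, z):
        """C14: fold the consistent measurement into the track."""
        y = np.array([z]) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

A measurement failing `consistent()` would be held back as a candidate new vehicle rather than merged into the existing track, per step C14.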
Further, the step G specifically includes the following steps:
G1. the Euclidean distance between vehicles detected in consecutive frames is calculated; the closest candidate targets are assigned weights in order of increasing distance as 1, 0.9, 0.8, …, 0, recorded as w₁;
G2. the intersection-over-union (IoU) of each bounding box between the two frames is calculated; weights are assigned in order of decreasing overlap as 1, 0.9, 0.8, …, 0, recorded as w₂;
G3. w₁ and w₂ are added and recorded; the candidate with the maximum sum is the most probable target;
G4. when the same vehicle is detected in three consecutive frames, or in three out of four consecutive frames, its motion track is generated, numbered and its ID recorded; when the vehicle is not found for two or three consecutive frames, the record is retained, and if it is not found for more than five frames, the vehicle is deemed to have left the field of view and its motion track information is deleted.
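The rank-based weighting of steps G1–G3 can be sketched as follows. The exact weight schedule beyond "1, 0.9, 0.8, …, 0" is an assumption (weights clamp at 0 after ten candidates):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def rank_weights(values, descending=False):
    """Assign weights 1, 0.9, 0.8, ..., 0 by rank, as in steps G1-G2."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=descending)
    weights = [0.0] * len(values)
    for rank, i in enumerate(order):
        weights[i] = max(1.0 - 0.1 * rank, 0.0)
    return weights

def best_match(dists, ious):
    """Step G3: sum the two weights; the maximum is the most probable target."""
    w1 = rank_weights(dists, descending=False)  # smaller distance -> higher weight
    w2 = rank_weights(ious, descending=True)    # larger IoU -> higher weight
    scores = [a + b for a, b in zip(w1, w2)]
    return scores.index(max(scores))
```

The candidate that is both nearest in Euclidean distance and has the greatest box overlap receives weight 2.0 and wins the association.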
Further, the step H specifically includes the following steps:
H1. the distance and coordinate relations obtained by the millimeter wave radar and the camera are compared; when the two results are consistent or the difference is insignificant (the distance difference does not exceed a threshold Q₁ and the pixel difference does not exceed 24×24), they are fused and the number is re-recorded (the difference in the XOZ plane is measured as the distance difference, and the difference in the XOY plane as the pixel difference);
H2. when the difference between the two is significant, the cases are distinguished according to the longitudinal distance.
Further, the step H2 specifically includes the following steps:
H21. when the longitudinal distance to the vehicle measured by the millimeter wave radar is smaller than d₁, the tracking track obtained by the camera is taken as the basis, and the track obtained by the millimeter wave radar serves as a check on the camera track; the point-trace plots of the two tracking tracks are compared, and if their shapes are similar they are considered the same target. If they are inconsistent, independent tracking is maintained; if the data cannot be fused for more than m frames, they are considered two targets and handled as newly appearing targets;
H22. when the longitudinal distance to the vehicle measured by the millimeter wave radar is between d₁ and d₂, and the longitudinal distance difference between the two does not exceed a threshold Q₂ while the coordinate difference on the image does not exceed 48×48 pixels, the median is taken for fusion; if the longitudinal distance difference exceeds Q₂, the track is deleted directly;
H23. when the longitudinal distance to the vehicle measured by the millimeter wave radar is between d₂ and d₃, the track formed by the millimeter wave radar is taken as the basis, and the track obtained by the camera serves as a check on the radar track; the point-trace plots of the two tracking tracks are compared, and if their shapes are similar they are considered the same target. If they are inconsistent, independent tracking is maintained; if the data cannot be fused for more than m frames, they are considered two targets and handled as newly appearing targets.
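The distance-banded decision logic of steps H1–H23 can be sketched as a single dispatch function. The band edges d₁ < d₂ < d₃ and thresholds Q₁, Q₂ are calibration values the patent leaves unspecified, and the return labels are illustrative:

```python
def fuse_tracks(radar_z, camera_z, pixel_diff, d1, d2, d3, q1, q2):
    """Decision-level fusion sketch: decide how to combine the radar
    and camera tracks for one target, based on longitudinal distance.
    All distances in mm; pixel_diff is the squared-pixel coordinate difference."""
    if abs(radar_z - camera_z) <= q1 and pixel_diff <= 24 * 24:
        return "fuse"               # H1: results agree -> fuse and renumber
    if radar_z < d1:
        return "camera-primary"     # H21: near range, camera track is the basis
    if radar_z <= d2:
        # H22: mid range -- fuse by median if within Q2, otherwise drop the track
        return "median" if abs(radar_z - camera_z) <= q2 else "delete"
    if radar_z <= d3:
        return "radar-primary"      # H23: far range, radar track is the basis
    return "radar-only"             # beyond the camera's useful range (assumed)
```

In the "camera-primary" and "radar-primary" cases the secondary sensor's point-trace plot is still compared for shape similarity before the tracks are declared identical.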
Compared with the prior art, the invention has the beneficial effects that:
1. unlike previous approaches that use millimeter wave radar merely to form regions of interest on the image, the present method fuses millimeter wave radar and deep learning vision at the decision level; different fusion strategies are designed according to the characteristics of the different sensors, making full use of their respective advantages and improving the accuracy of tracking multiple vehicles ahead;
2. the strong feature learning capability of deep learning avoids the manual feature selection of traditional machine learning; the extracted features are richer and more expressive, and the results obtained are more accurate.
3. the deep learning model used has relatively few layers, so real-time tracking performance is better achieved, making the method well suited to the unmanned driving field.
Drawings
FIG. 1 is a flow chart of a front multi-vehicle tracking method of the present invention incorporating millimeter wave radar and deep learning vision;
fig. 2 is a diagram showing a conversion relation between a millimeter wave radar and a vehicle coordinate system;
FIG. 3 is a graph of the conversion relationship between the camera and the vehicle coordinate system;
FIG. 4 is a flow chart of generating a tracking trajectory using millimeter wave radar data;
fig. 5 is a flowchart of fusing a millimeter wave radar track and a camera track.
Detailed Description
The invention is further described below with reference to the accompanying drawings:
The method tracks multiple vehicles ahead by fusing millimeter wave radar and deep learning in the automated driving field. The millimeter wave radar acquires forward data including distance, angle, speed, echo reflection intensity and width. Invalid returns are eliminated from these data according to echo reflection intensity and width, retaining only information on vehicles ahead. Then, by combining the millimeter wave radar with deep learning (camera), motion tracks are generated and track association is performed using filtered radar information and an online tracking model, improving the accuracy and robustness of multi-target tracking and reducing the false alarm rate. Next, preceding vehicles whose tracks have been associated are recorded and numbered. For vehicles whose tracks have been generated and numbered, the data of the next period are processed by repeating the above steps and subjected to a consistency check, so that they are appended to the numbered tracks. For newly appearing vehicles, track generation, track association and numbering are performed according to the initial steps.
As shown in fig. 1, 2 and 3, the front multi-vehicle tracking method with the fusion of millimeter wave radar and deep learning vision of the invention comprises the following steps:
A. establishing a coordinate conversion relation between a millimeter wave radar coordinate system and a camera coordinate system:
the millimeter wave radar coordinate system is a two-dimensional coordinate system of a horizontal plane, the camera coordinate system is a two-dimensional coordinate system of a vertical plane, and the conversion relation between the millimeter wave radar coordinate system and the camera coordinate system is found out by establishing the conversion relation between the millimeter wave radar and the world three-dimensional coordinate system and the conversion relation between the camera and the world three-dimensional coordinate system, so that the conversion is performed.
A1, establishing a conversion relation between a millimeter wave radar coordinate system and a world three-dimensional coordinate system, wherein the specific process comprises the following steps:
A11. the millimeter wave radar coordinate system is a two-dimensional coordinate system of the horizontal plane. As shown in the figure, X₀O₀Z is the radar coordinate system; its coordinate plane is parallel to the XOZ plane of the three-dimensional world coordinate system O-XYZ, and the X₀O₀Z plane lies a distance Y₁ below the XOZ plane, where Y₁ is the mounting height of the millimeter wave radar. Projecting the XOZ plane of O-XYZ onto the radar coordinate system X₀O₀Z, the distance between the OX axis and the O₀X₀ axis is Z₀; O is the origin of the world coordinate system and O₀ is the origin of the millimeter wave radar coordinate system, i.e. the mounting position of the radar.
A12. suppose a preceding vehicle M is found within the scanning range of the millimeter wave radar, at relative distance R and relative angle α from the radar, i.e. |MO₀| = R (in mm) and ∠MO₀Z = α (in degrees).
A13. transferring the vehicle target from the millimeter wave radar coordinate system to the three-dimensional world coordinate system gives X = R·sin α and Z = Z₀ + R·cos α.
A2, establishing a conversion relation between a camera coordinate system and a three-dimensional world coordinate system, wherein the specific process comprises the following steps of:
A21. the camera image coordinate system is a two-dimensional coordinate system X₁O₁Y₁ in a vertical plane, with origin O₁; its coordinate plane is parallel to the XOY plane of the three-dimensional world coordinate system O-XYZ. Here O, the origin of the world coordinate system, is also the center of the camera lens, so |OO₁| = f, where f is the effective focal length of the camera in mm.
A22. the camera is mounted with its optical axis parallel to the ground, i.e. the Y value in the three-dimensional world coordinate system remains constant: Y = Y₀, where Y₀ is the mounting height of the camera in mm.
A23. the vehicle target M(X, Y₀, Z) is converted to the image plane of the camera coordinate system; the conversion relationship is as follows:
A3, combining the coordinate relationship between the millimeter wave radar and the three-dimensional world and the coordinate relationship between the camera and the three-dimensional world, which are deduced in the step A1 and the step A2, and deducing the coordinate relationship between the millimeter wave radar and the camera image as follows:
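The combined relation appears only as an image in the original. Substituting the radar-to-world result of A13 into a standard pinhole projection (an assumption consistent with A21–A23) would yield:

```latex
x = \frac{f \, R \sin\alpha}{Z_0 + R\cos\alpha}, \qquad
y = \frac{f \, Y_0}{Z_0 + R\cos\alpha}
```

mapping a radar detection (R, α) directly to image coordinates (x, y). This reconstruction is offered for orientation only, not as the patent's exact formula.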
B. the millimeter wave radar data are received, resolved and processed so as to screen out the vehicles ahead and remove invalid targets, by the following steps:
B1. the forward data received by the millimeter wave radar comprise the distance (range), angle, relative speed, reflection intensity (power) and width of each target ahead.
And B2, calculating the received data by utilizing a calculation protocol specified by the millimeter wave radar, and removing the static target and the invalid target.
B3. target screening is performed according to the reflection intensity and width of the targets ahead: a reflection intensity threshold u₀ and a width threshold v₀ are set, and when the reflection intensity is greater than or equal to u₀ and the width is greater than or equal to v₀, the target is confirmed as a vehicle.
C. B, filtering the millimeter wave radar data received in the step B, tracking a front vehicle by using Kalman filtering, generating a track and numbering the track; when a new vehicle appears, the new track is generated by tracking after the data acquired by the radar are calculated and determined, and the number is added. The method comprises the following specific steps:
C1. the vehicle state of the next period is predicted using Kalman filtering, the actual millimeter wave radar measurements of the next period are read, and the predicted state is matched against the measured state (four frames are taken as one period, with a step size of two frames).
C12, predicting the state of the next period of the detected front multi-vehicle target by using a Kalman filtering algorithm;
c13, comparing the actual measured value of the next period of the front multi-vehicle target with the predicted value of the previous period, and performing consistency test;
C14. for targets meeting the consistency requirement, the target's data are updated and the prediction for the next period is carried out; when two consecutive periods meet the consistency requirement, a motion track is generated. Targets that do not meet the consistency requirement are regarded as newly appearing vehicles and temporarily retained; vehicles not detected in the next period are regarded as having disappeared;
and C2, repeating the step C1 for the newly-appearing vehicle, renumbering the newly-appearing vehicle, and generating a new tracking track.
D. The images acquired by the camera are preprocessed. To ensure temporal consistency when fusing the millimeter wave radar and deep learning (camera), sampling is performed at the lower of the two sampling frequencies and grayscale processing is applied, so that image acquisition and pre-screening are carried out during the camera's image capture.
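A minimal sketch of the grayscale step in D, written in plain NumPy with ITU-R BT.601 luminance weights (the patent names neither a library nor a conversion formula, so both are assumptions):

```python
import numpy as np

def to_grayscale(frame_bgr):
    """Convert a BGR frame to 8-bit grayscale using BT.601 luminance weights.
    Frame-rate alignment to the slower sensor is assumed to happen upstream."""
    b = frame_bgr[..., 0].astype(float)
    g = frame_bgr[..., 1].astype(float)
    r = frame_bgr[..., 2].astype(float)
    return np.rint(0.114 * b + 0.587 * g + 0.299 * r).astype(np.uint8)
```

A production system would more likely call an imaging library's converter; the point here is only that detection operates on a single-channel image.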
E. The deep learning neural network is trained offline. The network is trained and tested using the ImageNet database.
F. The image preprocessed in step D is sent to the deep learning neural network pre-trained in step E; the vehicle targets ahead are detected and localized, finally yielding the positions of the multiple vehicles in the image together with detection confidences (vehicle detections with confidence lower than M percent are deleted), and the vehicles are numbered.
G. The multiple vehicles detected in step F are tracked using the online tracking model; their running tracks are generated, and track association and numbering are completed, as follows:
G1. the Euclidean distance between vehicles detected in consecutive frames is calculated; the closest candidate targets are assigned weights in order of increasing distance as 1, 0.9, 0.8, …, 0, recorded as w₁.
G2. the intersection-over-union (IoU) of each bounding box between the two frames is calculated; weights are assigned in order of decreasing overlap as 1, 0.9, 0.8, …, 0, recorded as w₂.
G3. w₁ and w₂ are added; the candidate with the maximum sum is the most probable target, and it is recorded.
G4. when the same vehicle is detected in three consecutive frames, or in three out of four consecutive frames, its motion track is generated, numbered and its ID recorded; when the vehicle is not found for two or three consecutive frames, the record is retained, and if it is not found for more than five frames, the vehicle is deemed to have left the field of view and its motion track information is deleted.
H. The data processing centers of the millimeter wave radar and the camera respectively send the local tracks to the data fusion center, and the fusion center performs decision-level data fusion on the distance and the coordinate relation of the output data of the millimeter wave radar and the camera, and track correlation is performed based on the tracks of the millimeter wave radar and the camera, as shown in fig. 5.
H1. the distance and coordinate relations obtained by the millimeter wave radar and the camera are compared; when the two results are consistent or the difference is insignificant (the distance difference does not exceed a threshold Q₁ and the pixel difference does not exceed 24×24), they are fused and the number is re-recorded (the difference in the XOZ plane is measured as the distance difference, and the difference in the XOY plane as the pixel difference);
H2. when the difference between the two is significant, the cases are distinguished according to the longitudinal distance.
H21. when the longitudinal distance to the vehicle measured by the millimeter wave radar is smaller than d₁, the tracking track obtained by the camera is taken as the basis, and the track obtained by the millimeter wave radar serves as a check on the camera track; the point-trace plots of the two are compared, and if their shapes are similar they are considered the same target. If they are inconsistent, independent tracking is maintained; if the data cannot be fused for more than m frames, they are considered two targets and handled as newly appearing targets;
H22. when the longitudinal distance to the vehicle measured by the millimeter wave radar is between d₁ and d₂, and the longitudinal distance difference between the two does not exceed a threshold Q₂ while the coordinate difference on the image does not exceed 48×48 pixels, the median is taken for fusion; if the longitudinal distance difference exceeds Q₂, the track is deleted directly;
H23. when the longitudinal distance to the vehicle measured by the millimeter wave radar is between d₂ and d₃, the track formed by the millimeter wave radar is taken as the basis, and the track obtained by the camera serves as a check on the radar track; the point-trace plots of the two tracking tracks are compared, and if their shapes are similar they are considered the same target. If they are inconsistent, independent tracking is maintained; if the data cannot be fused for more than m frames, they are considered two targets and handled as newly appearing targets.
I. Repeating the steps, and updating the track to obtain a tracking result.
In summary, the invention provides a front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision. It combines the millimeter wave radar's high ranging precision and insensitivity to environmental change with the detection and tracking accuracy of visual deep learning, improving the accuracy and robustness of tracking multiple vehicle targets ahead.
Claims (5)
1. The front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision is characterized by comprising the following steps of:
A. establishing a coordinate conversion relation between the millimeter wave radar coordinate system and the camera coordinate system, unifying the two coordinate systems, and sampling at the lower of the two sampling frequencies of the millimeter wave radar and the camera so as to keep consistency in time;
B. receiving the millimeter wave radar data, resolving it according to the specified rules, and performing the corresponding processing so as to screen out front vehicles and remove invalid targets;
C. filtering the millimeter wave radar data received in the step B, tracking a front vehicle by using Kalman filtering, and generating a track to number the track; when a new vehicle appears, after the data acquired by the radar are calculated and determined, tracking is performed to generate a new track and a number is added; the method specifically comprises the following steps:
c1, predicting the vehicle state of the next period by using Kalman filtering, reading the millimeter wave radar measurement of the next period, and matching the predicted state with the measured state, taking four frames as one period with a step length of two frames;
the step C1 specifically comprises the following steps:
c12, predicting the state of the next period of the detected front multi-vehicle target by using a Kalman filtering algorithm;
c13, comparing the actual measured value of the next period of the front multi-vehicle target with the predicted value of the previous period, and performing consistency test;
c14, for the target meeting the consistency requirement, updating the data information of the target, and predicting the next period; when two continuous periods meet the consistency requirement, generating a track; for the targets which do not meet the consistency requirement, the targets are regarded as emerging vehicles, temporarily reserved, and for the vehicles which are not detected in the next period, the targets are regarded as disappeared;
c2, repeating the step C1 for the newly-appearing vehicle, renumbering the newly-appearing vehicle, and generating a new track;
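The Kalman prediction-and-gating loop of steps C1–C2 can be sketched as follows. This is an illustrative implementation only, not the patent's: it assumes a constant-velocity state model, a simple Euclidean consistency gate, and a two-period confirmation counter; the state layout, noise covariances, and gate value are placeholder assumptions.

```python
import numpy as np

class RadarTrack:
    """Constant-velocity Kalman track for one front vehicle (illustrative sketch)."""
    def __init__(self, x, z, dt=0.2):
        self.s = np.array([x, z, 0.0, 0.0])       # state: [x, z, vx, vz]
        self.P = np.eye(4)                        # state covariance
        self.F = np.eye(4)                        # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                     # we measure position only
        self.Q = 0.01 * np.eye(4)                 # process noise (assumed)
        self.R = 0.10 * np.eye(2)                 # measurement noise (assumed)
        self.hits = 0                             # consecutive consistent periods

    def predict(self):
        """Step C12: predict the state of the next period."""
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, meas, gate=2.0):
        """Steps C13-C14: consistency test, then Kalman update."""
        if np.linalg.norm(meas - self.s[:2]) > gate:
            self.hits = 0
            return False                          # fails consistency: new/lost vehicle
        y = meas - self.H @ self.s                # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        self.hits += 1
        return True

    def confirmed(self):
        return self.hits >= 2                     # two consistent periods -> track

trk = RadarTrack(x=0.5, z=20.0)
trk.predict()
trk.update(np.array([0.52, 20.3]))
trk.predict()
trk.update(np.array([0.55, 20.6]))
print(trk.confirmed())    # two consistent periods generate a track
```

A target that fails the gate would, as in step C14, be held as a candidate new vehicle rather than updated.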
D. preprocessing an image acquired by a camera;
E. training the deep learning neural network offline to identify the vehicles in front;
F. the image preprocessed in step D is sent to the deep learning neural network pre-trained in step E, and the front vehicle targets are detected and positioned to obtain the positions and detection confidences of the multiple vehicles ahead in the image; vehicle detections with confidence lower than M% are deleted, and the remaining vehicles are numbered;
G. tracking the vehicle detected in the step F by using an online tracking model, generating a track of the vehicle and numbering the track; the method specifically comprises the following steps:
g1, calculating the Euclidean distance of each detected vehicle between the previous and current frame images, and assigning weights to the several nearest targets in order of increasing distance as 1, 0.9, 0.8, ... 0, recorded as w1;
g2, calculating the intersection-over-union of each bounding box between the previous and current frames, and assigning weights in order of decreasing coverage as 1, 0.9, 0.8, ... 0, recorded as w2;
g3, adding w1 and w2 and recording the sum; the maximum value indicates the most probable target;
g4, when the same vehicle is detected in three consecutive frames, or in three frames out of four consecutive frames, a track of the vehicle is generated and numbered, and its ID is recorded; when the detected vehicle cannot be found for two or three consecutive frames, the record is kept; if it cannot be found for more than five frames, the vehicle is determined to have disappeared from the field of view and its track information is deleted;
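The scoring of steps G1–G3 can be sketched as follows; the linear weight ramp (1, 0.9, 0.8, … 0) and the intersection-over-union are written out explicitly. The box format (x1, y1, x2, y2) and the per-previous-box matching loop are assumptions for illustration, not from the patent.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
    return inter / union if union > 0 else 0.0

def rank_weights(scores, descending=True):
    """Assign 1, 0.9, 0.8, ... 0 by rank, as in steps G1/G2."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=descending)
    w = [0.0] * len(scores)
    for rank, i in enumerate(order):
        w[i] = max(0.0, 1.0 - 0.1 * rank)
    return w

def match(prev_box, candidates):
    """Score current-frame candidate boxes against one previous-frame box (G1-G3)."""
    pc = ((prev_box[0]+prev_box[2])/2, (prev_box[1]+prev_box[3])/2)
    dists = [(((b[0]+b[2])/2 - pc[0])**2 + ((b[1]+b[3])/2 - pc[1])**2) ** 0.5
             for b in candidates]
    w1 = rank_weights(dists, descending=False)                  # G1: closer -> larger
    w2 = rank_weights([iou(prev_box, b) for b in candidates])   # G2: higher IoU -> larger
    total = [a + b for a, b in zip(w1, w2)]
    return total.index(max(total))                              # G3: maximum is most probable

prev = (100, 100, 148, 148)
cands = [(104, 102, 150, 150), (300, 120, 360, 170), (90, 250, 140, 300)]
print(match(prev, cands))    # the nearly overlapping first box wins
```

The track confirmation and deletion counters of step G4 would sit on top of this per-frame matching.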
H. the data processing centers of the millimeter wave radar and the camera respectively send the track to a data fusion center, and the fusion center performs data fusion on the distance and the coordinate relation of the output data of the millimeter wave radar and the camera, and track correlation is performed based on the track of the millimeter wave radar and the track of the camera; the method specifically comprises the following steps:
h1, comparing the distance and coordinate relationships obtained by the millimeter wave radar and the camera; when the results are consistent or the difference between the two is not significant, namely the distance difference does not exceed the threshold Q1 and the pixel difference does not exceed 24x24, the tracks are fused and renumbered, wherein the difference on the XOZ plane is measured by the distance difference and the difference on the XOY plane is measured by the pixel difference;
h2, grading according to longitudinal distance when the difference between the two is obvious;
the step H2 specifically comprises the following steps:
h21, when the longitudinal distance from the vehicle measured by the millimeter wave radar is smaller than d1, the track is based on the track obtained by the camera, and the track obtained by the millimeter wave radar serves as a check on the camera track; the track point patterns of the two are compared, and if the shapes are consistent they are regarded as the same target; if they are inconsistent, independent tracking is kept, and if the data cannot be fused for more than m frames, the tracks are regarded as two targets and handled as newly-appearing targets;
h22, when the longitudinal distance from the vehicle measured by the millimeter wave radar is between d1 and d2, if the longitudinal distance difference between the two tracks does not exceed the threshold Q2 and the coordinate difference on the image does not exceed 48x48 pixels, the median value is taken for fusion; if the longitudinal distance difference exceeds Q2, the track is deleted directly;
h23, when the longitudinal distance from the vehicle measured by the millimeter wave radar is between d2 and d3, the track is based on the track formed by the millimeter wave radar, and the track obtained by the camera serves as a check on the millimeter wave radar track; the track point patterns of the two are compared, and if the shapes are consistent they are regarded as the same target; if they are inconsistent, independent tracking is kept, and if the data cannot be fused for more than m frames, the tracks are regarded as two targets and handled as newly-appearing targets;
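The grading in steps H1–H2 amounts to a decision function over the longitudinal distance. The patent leaves d1, d2, d3, Q1, Q2, and m as free parameters, so the values below are placeholders purely for illustration; only the 24x24 and 48x48 pixel gates come from the text.

```python
# Placeholder thresholds -- the patent does not fix d1, d2, d3, Q1, Q2.
D1, D2, D3 = 30.0, 80.0, 150.0     # longitudinal distance bands (m), assumed
Q1, Q2 = 2.0, 5.0                  # distance thresholds (m), assumed
PIX_H1, PIX_H2 = 24, 48            # pixel gates from steps H1 / H22

def fuse(z_radar, z_cam, pix_dx, pix_dy):
    """Return (action, fused_z) for one radar/camera track pair."""
    # H1: small disagreement on both planes -> fuse directly
    if abs(z_radar - z_cam) <= Q1 and pix_dx <= PIX_H1 and pix_dy <= PIX_H1:
        return "fuse", (z_radar + z_cam) / 2
    # H2: otherwise grade by the radar's longitudinal distance
    if z_radar < D1:                              # H21: camera track leads
        return "camera-led", z_cam
    if z_radar < D2:                              # H22: median fusion inside gates
        if abs(z_radar - z_cam) <= Q2 and pix_dx <= PIX_H2 and pix_dy <= PIX_H2:
            return "fuse", (z_radar + z_cam) / 2
        return "delete", None                     # longitudinal gap too large
    if z_radar < D3:                              # H23: radar track leads
        return "radar-led", z_radar
    return "independent", None

print(fuse(25.0, 24.0, 30, 30))    # near band: the camera track leads
print(fuse(50.0, 49.0, 10, 10))    # small gap on both planes: fused under H1
```

The m-frame escalation of H21/H23 (declaring two separate targets) would wrap this per-frame decision in a counter, as in the track confirmation logic.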
I. repeating the steps, and updating the track to obtain a tracking result.
2. The method for tracking a plurality of vehicles in front by fusing millimeter wave radar and deep learning vision according to claim 1, wherein in step A the coordinate systems are unified as follows:
a1, establishing a conversion relation between a millimeter wave radar and a world three-dimensional coordinate system, wherein the millimeter wave radar coordinate system is a two-dimensional coordinate system of a horizontal plane;
a2, establishing a conversion relation between a camera coordinate system and a three-dimensional world coordinate system, wherein the camera coordinate system is a two-dimensional coordinate system of a vertical plane;
a3, combining the coordinate relationship between the millimeter wave radar and the three-dimensional world and the coordinate relationship between the camera and the three-dimensional world, which are deduced in the step A1 and the step A2, and deducing the coordinate relationship between the millimeter wave radar and the camera image, wherein the coordinate relationship is as follows:
3. the method for tracking a plurality of vehicles in front by combining millimeter wave radar with deep learning vision according to claim 2, wherein the step A1 specifically comprises the following steps:
A11, X0O0Z is the millimeter wave radar coordinate system; its coordinate plane is parallel to the XOZ plane of the three-dimensional world coordinate system O-XYZ, and the X0O0Z plane is located at a height Y1 below the XOZ plane, where Y1 is the installation height of the millimeter wave radar; projecting the XOZ plane of the three-dimensional world coordinate system O-XYZ onto the millimeter wave radar coordinate system X0O0Z, the distance between the OX axis and the O0X0 axis is Z0; O is the origin of the world coordinate system, and O0 is the origin of the millimeter wave radar coordinate system, namely the installation position of the millimeter wave radar;
a12, assuming that a front vehicle M is found within the scanning range of the millimeter wave radar, the relative distance between the front vehicle M and the millimeter wave radar is R and the relative angle is α, namely MO0 = R in mm and ∠MO0Z = α in degrees;
a13, transferring the vehicle target from the millimeter wave radar coordinate system to the three-dimensional world coordinate system gives X = R·sin α, Z = Z0 + R·cos α.
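Step A13 is a direct polar-to-Cartesian mapping; a minimal sketch (the numeric values in the usage line are arbitrary, and Z0 is the origin offset described in A11):

```python
import math

def radar_to_world(R, alpha_deg, Z0):
    """Convert a radar return (range R, bearing alpha) to world XZ coordinates.

    X = R*sin(alpha), Z = Z0 + R*cos(alpha), per step A13; the patent
    gives alpha in degrees and distances in mm.
    """
    a = math.radians(alpha_deg)
    return R * math.sin(a), Z0 + R * math.cos(a)

X, Z = radar_to_world(R=20000.0, alpha_deg=10.0, Z0=500.0)   # distances in mm
print(round(X), round(Z))
```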
4. The method for tracking a plurality of vehicles in front by combining millimeter wave radar with deep learning vision according to claim 2, wherein the step A2 specifically comprises the following steps:
a21, the camera coordinate system is a two-dimensional coordinate system XOY in a vertical plane, where O is the origin of the camera coordinate system and the coordinate plane is parallel to the XOY plane of the three-dimensional world coordinate system O-XYZ, whose origin is also the optical center of the camera lens; the distance between the two origins is f, the effective focal length of the camera in mm;
a22, the camera is mounted with its optical axis parallel to the ground, i.e. the Y value in the three-dimensional world coordinate system remains unchanged, Y = Y0, where Y0 is the installation height of the camera in mm;
a23, the vehicle target M(X, Y0, Z) is converted to the image plane in the camera coordinate system, and the conversion relationship is as follows:
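The conversion formula itself is not reproduced in this text. Under the standard pinhole model that steps A21–A22 describe (optical axis parallel to the ground, focal length f, camera height Y0), the projection would be x = f·X/Z and y = f·Y0/Z; treat that formula as an assumption, not the patent's exact expression. A sketch under that assumption:

```python
def world_to_image(X, Y0, Z, f):
    """Project a world point M(X, Y0, Z) onto the image plane.

    Standard pinhole model (an assumption -- the patent text elides the
    exact formula): x = f*X/Z, y = f*Y0/Z, all lengths in mm.
    """
    if Z <= 0:
        raise ValueError("point must be in front of the camera")
    return f * X / Z, f * Y0 / Z

x, y = world_to_image(X=3473.0, Y0=1200.0, Z=20196.0, f=6.0)
print(round(x, 3), round(y, 3))
```

A further pixel-scale factor (mm per pixel) would be needed to reach image pixel coordinates; the patent's pixel gates in steps H1/H22 operate after that scaling.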
5. The method for tracking a plurality of vehicles in front by combining millimeter wave radar with deep learning vision according to claim 1, wherein the step B specifically comprises the following steps:
b1, the front-target data received by the millimeter wave radar includes the distance, angle, relative speed, reflection intensity (power), and width of each front target;
b2, calculating the received data by utilizing a calculation protocol specified by the millimeter wave radar, and removing the static target and the invalid target;
b3, performing target screening according to the reflection intensity and width of the front target: a reflection intensity threshold u0 and a width threshold v0 are set, and when the reflection intensity is greater than or equal to u0 and the width is greater than or equal to v0, a vehicle target is confirmed.
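Steps B1–B3 amount to a per-target filter over the raw radar returns. A sketch with illustrative threshold values for u0 and v0 (the patent does not fix them) and a simple relative-speed test standing in for the static-target removal of step B2:

```python
U0 = 10.0    # reflection-intensity threshold u0, assumed value
V0 = 1.2     # width threshold v0 (m), assumed value

def screen(targets, ego_speed):
    """Keep likely vehicle targets from raw radar returns (step B, illustrative).

    Each target is a dict with distance, angle, rel_speed, power, width.
    A target whose speed over ground is ~0 is treated as static (B2);
    the power and width gates implement the B3 screening.
    """
    kept = []
    for t in targets:
        ground_speed = ego_speed + t["rel_speed"]   # rel_speed < 0 means slower
        if abs(ground_speed) < 0.5:                 # static target, discard (B2)
            continue
        if t["power"] >= U0 and t["width"] >= V0:   # B3 thresholds
            kept.append(t)
    return kept

raw = [
    {"distance": 40.0, "angle": 2.0, "rel_speed": -3.0, "power": 15.0, "width": 1.8},    # moving car
    {"distance": 25.0, "angle": -1.0, "rel_speed": -20.0, "power": 20.0, "width": 1.5},  # static (guardrail)
    {"distance": 60.0, "angle": 0.5, "rel_speed": -5.0, "power": 25.0, "width": 0.4},    # narrow object
]
print(len(screen(raw, ego_speed=20.0)))    # only the moving car survives
```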
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811219589.9A CN109459750B (en) | 2018-10-19 | 2018-10-19 | Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109459750A CN109459750A (en) | 2019-03-12 |
CN109459750B (en) | 2023-05-23
Family
ID=65607929
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811219589.9A Active CN109459750B (en) | 2018-10-19 | 2018-10-19 | Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109459750B (en) |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109960264A (en) * | 2019-03-28 | 2019-07-02 | 潍柴动力股份有限公司 | A kind of target identification method and system |
CN110068818A (en) * | 2019-05-05 | 2019-07-30 | 中国汽车工程研究院股份有限公司 | The working method of traffic intersection vehicle and pedestrian detection is carried out by radar and image capture device |
CN110264586A (en) * | 2019-05-28 | 2019-09-20 | 浙江零跑科技有限公司 | L3 grades of automated driving system driving path data acquisitions, analysis and method for uploading |
CN110288832A (en) * | 2019-07-10 | 2019-09-27 | 南京慧尔视智能科技有限公司 | It is merged based on microwave with the multiple-object information of video and visual presentation method |
CN110422173B (en) * | 2019-07-11 | 2021-01-15 | 惠州市德赛西威智能交通技术研究院有限公司 | Driving environment identification method |
CN110398720A (en) * | 2019-08-21 | 2019-11-01 | 深圳耐杰电子技术有限公司 | A kind of anti-unmanned plane detection tracking interference system and photoelectric follow-up working method |
CN110794392B (en) * | 2019-10-15 | 2024-03-19 | 上海创昂智能技术有限公司 | Vehicle positioning method and device, vehicle and storage medium |
CN110632589B (en) * | 2019-10-17 | 2022-12-06 | 安徽大学 | Radar photoelectric information fusion technology |
CN110794397B (en) * | 2019-10-18 | 2022-05-24 | 北京全路通信信号研究设计院集团有限公司 | Target detection method and system based on camera and radar |
CN110736982B (en) * | 2019-10-28 | 2022-04-05 | 江苏集萃智能传感技术研究所有限公司 | Underground parking lot vehicle tracking method and device based on radar monitoring |
CN111104960B (en) * | 2019-10-30 | 2022-06-14 | 武汉大学 | Sign language identification method based on millimeter wave radar and machine vision |
CN111090095B (en) * | 2019-12-24 | 2023-03-14 | 上海汽车工业(集团)总公司 | Information fusion environment perception system and perception method thereof |
CN113095345A (en) * | 2020-01-08 | 2021-07-09 | 富士通株式会社 | Data matching method and device and data processing equipment |
CN111398923A (en) * | 2020-04-28 | 2020-07-10 | 东风汽车集团有限公司 | Multi-millimeter wave radar combined self-calibration method and system |
CN111731272A (en) * | 2020-06-17 | 2020-10-02 | 重庆长安汽车股份有限公司 | Obstacle collision avoidance method based on automatic parking system |
CN111880196A (en) * | 2020-06-29 | 2020-11-03 | 安徽海博智能科技有限责任公司 | Unmanned mine car anti-interference method, system and computer equipment |
CN111967498A (en) * | 2020-07-20 | 2020-11-20 | 重庆大学 | Night target detection and tracking method based on millimeter wave radar and vision fusion |
CN111862157B (en) * | 2020-07-20 | 2023-10-10 | 重庆大学 | Multi-vehicle target tracking method integrating machine vision and millimeter wave radar |
CN112034445B (en) * | 2020-08-17 | 2022-04-12 | 东南大学 | Vehicle motion trail tracking method and system based on millimeter wave radar |
CN111814769A (en) * | 2020-09-02 | 2020-10-23 | 深圳市城市交通规划设计研究中心股份有限公司 | Information acquisition method and device, terminal equipment and storage medium |
CN112033429B (en) * | 2020-09-14 | 2022-07-19 | 吉林大学 | Target-level multi-sensor fusion method for intelligent automobile |
CN112201040B (en) * | 2020-09-29 | 2022-12-16 | 同济大学 | Traffic data cleaning method and system based on millimeter wave radar data |
CN112380927B (en) * | 2020-10-29 | 2023-06-30 | 中车株洲电力机车研究所有限公司 | Rail identification method and device |
CN112346046B (en) * | 2020-10-30 | 2022-09-06 | 合肥中科智驰科技有限公司 | Single-target tracking method and system based on vehicle-mounted millimeter wave radar |
CN112415517A (en) * | 2020-11-03 | 2021-02-26 | 上海泽高电子工程技术股份有限公司 | Rail identification method based on millimeter wave radar |
CN113030944B (en) * | 2021-04-16 | 2024-02-02 | 深圳市众云信息科技有限公司 | Radar target tracking method |
CN113343849A (en) * | 2021-06-07 | 2021-09-03 | 西安恒盛安信智能技术有限公司 | Fusion sensing equipment based on radar and video |
CN114137512A (en) * | 2021-11-29 | 2022-03-04 | 湖南大学 | Front multi-vehicle tracking method based on fusion of millimeter wave radar and deep learning vision |
CN114299112B (en) * | 2021-12-24 | 2023-01-13 | 萱闱(北京)生物科技有限公司 | Multi-target-based track identification method, device, medium and computing equipment |
CN114518573B (en) * | 2022-04-21 | 2022-07-19 | 山东科技大学 | Vehicle tracking method, equipment and medium for multiple radars |
CN115184917B (en) * | 2022-09-13 | 2023-03-10 | 湖南华诺星空电子技术有限公司 | Regional target tracking method integrating millimeter wave radar and camera |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106461774A (en) * | 2014-02-20 | 2017-02-22 | 御眼视觉技术有限公司 | Advanced driver assistance system based on radar-cued visual imaging |
CN107076842A (en) * | 2014-08-25 | 2017-08-18 | 兰普洛克斯公司 | Positioned using the indoor location of delayed sweep beam reflector |
CN108613679A (en) * | 2018-06-14 | 2018-10-02 | 河北工业大学 | A kind of mobile robot Extended Kalman filter synchronous superposition method |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3918791B2 (en) * | 2003-09-11 | 2007-05-23 | トヨタ自動車株式会社 | Object detection device |
JP2006292621A (en) * | 2005-04-13 | 2006-10-26 | Toyota Motor Corp | Object detection apparatus |
US7460951B2 (en) * | 2005-09-26 | 2008-12-02 | Gm Global Technology Operations, Inc. | System and method of target tracking using sensor fusion |
US8704887B2 (en) * | 2010-12-02 | 2014-04-22 | GM Global Technology Operations LLC | Multi-object appearance-enhanced fusion of camera and range sensor data |
CN102508246B (en) * | 2011-10-13 | 2013-04-17 | 吉林大学 | Method for detecting and tracking obstacles in front of vehicle |
US10848739B2 (en) * | 2012-09-13 | 2020-11-24 | California Institute Of Technology | Coherent camera |
US10565468B2 (en) * | 2016-01-19 | 2020-02-18 | Aptiv Technologies Limited | Object tracking system with radar/vision fusion for automated vehicles |
US20170242117A1 (en) * | 2016-02-19 | 2017-08-24 | Delphi Technologies, Inc. | Vision algorithm performance using low level sensor fusion |
CN107862287A (en) * | 2017-11-08 | 2018-03-30 | 吉林大学 | A kind of front zonule object identification and vehicle early warning method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109459750B (en) | Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision | |
CN109444911B (en) | Unmanned ship water surface target detection, identification and positioning method based on monocular camera and laser radar information fusion | |
CN104183127B (en) | Traffic surveillance video detection method and device | |
CN110675418B (en) | Target track optimization method based on DS evidence theory | |
CN108549084B (en) | Target detection and attitude estimation method based on sparse two-dimensional laser radar | |
CN111781608B (en) | Moving target detection method and system based on FMCW laser radar | |
CN110689562A (en) | Trajectory loop detection optimization method based on generation of countermeasure network | |
CN108592876A (en) | Tunnel appearance Defect inspection robot based on laser scanning imaging principle | |
CN113359097B (en) | Millimeter wave radar and camera combined calibration method | |
GB2317066A (en) | Method of detecting objects for road vehicles using stereo images | |
CN107796373B (en) | Distance measurement method based on monocular vision of front vehicle driven by lane plane geometric model | |
CN114280611A (en) | Road side sensing method integrating millimeter wave radar and camera | |
CN115482195B (en) | Train part deformation detection method based on three-dimensional point cloud | |
CN109100697B (en) | Target condensation method based on ground monitoring radar system | |
CN114879217B (en) | Target pose judgment method and system | |
CN112862858A (en) | Multi-target tracking method based on scene motion information | |
CN111359913A (en) | Method for sorting ores through laser radar | |
CN111999735A (en) | Dynamic and static target separation method based on radial velocity and target tracking | |
CN108983194B (en) | Target extraction and condensation method based on ground monitoring radar system | |
CN114035188A (en) | Ground-based radar glacier flow speed high-precision monitoring algorithm and system | |
CN110490903A (en) | Multiple target fast Acquisition and tracking in a kind of Binocular vision photogrammetry | |
KR20230101560A (en) | Vehicle lidar system and object detecting method thereof | |
CN104537690B (en) | One kind is based on the united moving spot targets detection method of maximum time index | |
CN115453570A (en) | Multi-feature fusion mining area dust filtering method | |
CN111239761B (en) | Method for indoor real-time establishment of two-dimensional map |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||