CN113848545B - Fusion target detection and tracking method based on vision and millimeter wave radar - Google Patents

Fusion target detection and tracking method based on vision and millimeter wave radar

Info

Publication number
CN113848545B
CN113848545B · Application CN202111018876.5A
Authority
CN
China
Prior art keywords
target
millimeter wave
wave radar
clustering
speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111018876.5A
Other languages
Chinese (zh)
Other versions
CN113848545A (en)
Inventor
李曙光
郑珂
李振旭
赵洋
程洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN202111018876.5A priority Critical patent/CN113848545B/en
Publication of CN113848545A publication Critical patent/CN113848545A/en
Application granted granted Critical
Publication of CN113848545B publication Critical patent/CN113848545B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24147Distances to closest patterns, e.g. nearest neighbour classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/251Fusion techniques of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/277Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention belongs to the field of computer vision and pattern recognition, and particularly relates to a fusion target detection and tracking method based on vision and millimeter wave radar. The invention rests on the assumption that the motion characteristics of the same target are identical under different coordinate systems, and treats the image and millimeter wave radar data as two independent branches. The image branch tracks targets with CenterTrack, while the millimeter wave radar branch performs multi-target tracking based on a Kalman filtering algorithm; each branch yields a short-time speed curve for its targets through tracking. A degree of difference between the two speed curves is then defined according to the same assumption, and targets are matched by measuring this degree of difference. Target-level fusion of the millimeter wave radar and the camera is thereby realized without joint calibration of the camera and the millimeter wave radar.

Description

Fusion target detection and tracking method based on vision and millimeter wave radar
Technical Field
The invention belongs to the field of computer vision and pattern recognition, and particularly relates to a fusion target detection and tracking method based on vision and millimeter wave radar.
Background
An autonomous vehicle is usually equipped with several types of sensors; the most common are the camera, the laser radar, and the millimeter wave radar, each with its own advantages and disadvantages. The camera is the simplest sensor on an autonomous vehicle and the one closest to the human eye, offering clear imaging and low cost; its disadvantages are strong sensitivity to the environment and the inability to provide three-dimensional information. The laser radar offers high resolution, good concealment, and strong resistance to active interference, but its greatest drawback is sensitivity to weather and atmosphere: its propagation distance is heavily reduced in heavy rain, dense fog, and similar conditions. Compared with the camera and the laser radar, the millimeter wave radar resists environmental interference well and meets the all-weather adaptability requirements of a vehicle, but its resolution is far inferior to that of the laser radar. Fusing these sensors therefore lets their characteristics complement one another, which can effectively improve an autonomous vehicle's perception of its surroundings.
In recent years, most research on sensor fusion has focused on fusing the laser radar and the camera. This combination has been shown to achieve high-precision detection in most cases, but both sensors are susceptible to rain, snow, and haze in the air, and the large number of scanning points produced by the laser radar poses great challenges for data processing and real-time performance. Compared with the fusion of the laser radar and the camera, the fusion of the millimeter wave radar and the camera is least affected by the environment.
At present, there are three strategies for fusing millimeter wave radar and camera data: feature level, target level, and signal level. Feature-level fusion projects the point targets of the millimeter wave radar onto the image, extracts millimeter wave radar features and image features separately, and uses the radar features to supplement the image features, with tasks such as target detection completed mainly from the image. Target-level fusion usually obtains target detection boxes from the image data alone; the detection points of the millimeter wave radar are associated with the detection boxes directly, or after most noise points have been removed by filtering or other methods, so that the depth and speed information of each target is assigned to the corresponding detection box. Signal-level fusion differs from the above two methods in how the millimeter wave radar detections are handled: the raw millimeter wave radar data are fused with the image data. In all three fusion modes, the two sensors must be jointly calibrated so that the millimeter wave radar data points can be projected onto the image plane to complete the fusion. The joint calibration process is tedious and introduces certain errors; increased workload and fusion difficulty are unavoidable, and the errors introduced during calibration reduce target recognition accuracy. It is therefore necessary to provide a new method for fusing the millimeter wave radar and the camera.
Disclosure of Invention
The invention aims to provide a fusion target detection and tracking method based on vision and millimeter wave radar that realizes fusion of the millimeter wave radar and the camera without jointly calibrating the two sensors.
In order to achieve the purpose, the invention adopts the following technical scheme:
a fusion target detection and tracking method based on vision and millimeter wave radar comprises the following steps:
step 1, respectively obtaining a short-time speed change curve of an image target and a short-time speed change curve of a millimeter wave radar target;
the short-time speed change curve obtaining process of the image target comprises the following steps:
(1) Detecting and tracking the image target by adopting a 3D target tracking network to obtain an image sequence containing target behaviors; marking the depth value of each target and the ID corresponding to each target for each image;
(2) Calculating the speed value of each target at each moment, the speed of a target at time t being the ratio of the depth difference of the same target in two adjacent frames to the time interval, to obtain the multi-target short-time speed change curves in the image;
the acquisition process of the short-time speed change curve of the millimeter wave radar target is as follows:
(1) Acquiring millimeter wave radar point cloud data with the same image timestamp, and filtering out static targets in the point cloud data;
(2) Clustering data points in the point cloud data by adopting a nearest neighbor clustering algorithm to obtain a plurality of clustering centers; regarding each clustering center as a target, and respectively calculating the position coordinate and the speed value of each clustering center at each moment;
(3) Respectively constructing, for each clustering center, a Kalman filter for predicting the position and speed at the next moment t+1; based on position and speed, matching the prediction result with the clustering centers at the corresponding moment by neighbor matching, updating the parameters of the Kalman filter with the clustering center matched at time t+1, and then taking the updated parameters as the initial values of the next prediction; for an unmatched clustering center, directly taking the current prediction result as the initial value of the next prediction, and if no corresponding target is matched for two consecutive frames, judging that the center has left the observable range;
repeating the above process to associate the same target across consecutive frames, obtaining the short-time speed change curve of the millimeter wave radar target;
step 2, carrying out similarity measurement between the image target and the millimeter wave radar target so as to realize target-level fusion of the image and the millimeter wave radar
(1) According to the timestamp information of the image sequence and the point cloud data, time sequence registration is carried out on the image sequence and the point cloud data, and the time consistency of the fused data of the two sensors is ensured;
(2) Based on the assumption that the same target has the same speed under different sensor coordinate systems, defining the speed difference degree V_d between the millimeter wave radar target and the image target:

V_d = (1/Frame_num) · Σ_{i=1}^{Frame_num} |v_radar,i − v_camera,i|

wherein v_radar,i denotes the speed value of the millimeter wave radar target at time i, v_camera,i denotes the speed value of the image target at time i, and Frame_num denotes the number of frames used to calculate the speed difference degree;
(3) Calculating the difference degree between each millimeter wave radar target and each image target at the same moment with the defined speed difference formula, and taking the target corresponding to the minimum value as the matching result, completing the similarity measurement of targets under the two sensor coordinate systems and realizing target-level fusion of the image and the millimeter wave radar.
Further, in obtaining the millimeter wave radar short-time speed curve, the nearest neighbor clustering algorithm clusters the data points in the point cloud data into a plurality of clustering centers as follows:
(1) Setting a speed difference threshold T_v, a distance threshold T_d, and a speed difference allowance Δv_T between two points;
(2) Clustering data points of each frame of the millimeter wave radar to obtain a plurality of clustering centers
Firstly, selecting the first data point as the first clustering center; then selecting a second data point, and calculating the speed difference and the Euclidean distance between the second data point and the first clustering center;
comparing the calculated speed difference and Euclidean distance with the set thresholds, and when both are greater than the set thresholds, judging the second data point to be the second clustering center;
selecting a third data point from the remaining data points, and calculating its speed difference and Euclidean distance from each of the previous two clustering centers; when all the calculated speed differences and Euclidean distances are greater than the set thresholds, judging the third data point to be the third clustering center; repeating this until all data points have been examined, obtaining a plurality of clustering centers;
(3) Categorizing non-cluster-centered data points
Selecting any data point that is not a clustering center, and calculating its speed difference and distance from each clustering center; taking the clustering center closest to the current data point: if the speed difference is less than the allowance Δv_T, the millimeter wave radar data point belongs to the cluster corresponding to that clustering center; if the speed difference is greater than the allowance Δv_T, discarding that clustering center and taking the second-closest one;
and then, circulating until all data points finish clustering, and calculating the average value of the speed and the position of all data points in the cluster as the clustering center of the clustering cluster after obtaining a plurality of clustering clusters.
Further, the 3D target tracking network used for detecting and tracking the image target is CenterTrack.
The invention provides a fusion target detection and tracking method based on vision and millimeter wave radar that fuses millimeter wave radar targets and image targets based on the assumption that the motion characteristics of the same target are identical under different coordinate systems. In the fusion process, the image and the millimeter wave radar data are first treated as two independent parts: the image part tracks targets with a 3D target tracking network and obtains the short-time speed curves of the image targets, while the millimeter wave radar part performs multi-target tracking with a nearest neighbor clustering algorithm combined with Kalman filters and obtains the short-time speed curves of the millimeter wave radar targets. Then, according to the assumption that the motion characteristics of the same target are identical under different coordinate systems, a speed difference degree formula between millimeter wave radar targets and image targets is defined, and targets are matched by computing their difference degrees at the same moment with this formula. Target-level fusion of the millimeter wave radar and the camera is thus realized without joint calibration of the camera and the millimeter wave radar.
Compared with existing fusion methods for millimeter wave radar and vision, the invention provides a different fusion mode: target-level fusion of the millimeter wave radar and the camera is completed without jointly calibrating the two sensors. On the one hand, this removes the preliminary calibration work and reduces the fusion difficulty; on the other hand, it avoids the loss of target recognition accuracy caused by errors introduced during calibration.
Drawings
FIG. 1 is a tracking flow chart of the present invention;
FIG. 2 is a diagram showing CenterTrack tracking results;
FIG. 3 is a diagram showing the clustering results of the example;
FIG. 4 is a graph of the short-time speed curves obtained from the image and from the millimeter wave radar in the embodiment;
fig. 5 is a diagram showing the matching result of the image target and the millimeter wave radar target.
Detailed Description
The invention provides a fusion target detection and tracking method based on vision and millimeter wave radar, as shown in FIG. 1, comprising the following steps:
step 1, detecting and tracking an image target to acquire a short-time speed change curve of the image target
1.1. An image target is detected and tracked with a 3D target tracking network to obtain an image sequence containing target behaviors; the depth value of each target and the ID corresponding to each target are recorded for each image. Because the CenterTrack network directly provides much of the information needed for tracking, such as position, size, and depth values (see FIG. 2), it is used here to simplify the subsequent processing.
1.2. From the average-speed formula v = s/Δt over a time interval, the speed of a target at time t is the ratio of the depth difference of the same target in two adjacent frames to the time interval, that is

v_t = (h_t − h_{t−1}) / Δt

where s denotes the displacement within the time interval, v_t denotes the speed value at time t, h_t denotes the depth at time t, h_{t−1} denotes the depth at time t−1, and Δt denotes the time difference between the two moments.

Substituting the depth value of each target acquired in step 1.1 into this formula gives the speed value of each target in each frame.
1.3. Taking 5 consecutive frames yields the multi-target short-time speed change curves in the image.
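The following minimal Python sketch illustrates this depth-difference speed computation; the data layout (per-frame dictionaries mapping target IDs to depth values) and all names are illustrative assumptions, not part of the patent:

```python
from collections import defaultdict

def short_time_velocity_curves(frames, dt):
    """Compute per-target speed curves from tracked depth values.

    frames: list of per-frame dicts mapping target ID -> depth (m).
    dt:     time interval between adjacent frames (s).
    Returns a dict mapping target ID -> list of speed values v_t.
    """
    curves = defaultdict(list)
    for prev, curr in zip(frames[:-1], frames[1:]):
        for target_id, depth in curr.items():
            if target_id in prev:  # same ID in both frames: v_t = (h_t - h_{t-1}) / dt
                curves[target_id].append((depth - prev[target_id]) / dt)
    return dict(curves)

# Example: target ID 7 approaching the camera over 5 frames at 10 Hz.
seq = [{7: 20.0}, {7: 19.4}, {7: 18.9}, {7: 18.3}, {7: 17.8}]
print(short_time_velocity_curves(seq, dt=0.1))  # speeds of roughly -5 to -6 m/s
```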
Step 2, tracking the millimeter wave radar targets and acquiring their short-time speed change curves
2.1, millimeter wave radar point cloud data with the same image timestamp is obtained, and static targets in the point cloud data are filtered;
2.2. Data points in the point cloud data are clustered with a nearest neighbor clustering algorithm to obtain a plurality of clustering centers; each clustering center is regarded as a target, and the position coordinates and speed value of each clustering center at each moment are calculated. Specifically:
2.2.1. A speed difference threshold T_v, a distance threshold T_d, and a speed difference allowance Δv_T between two points are set.
2.2.2, clustering data points in each frame of the millimeter wave radar to obtain a plurality of clustering centers
Firstly, the first data point is selected as the first clustering center; a second data point is then selected, and its speed difference and Euclidean distance from the first clustering center are calculated. The position distance Δ_d and the speed difference Δ_v are computed as:

Δ_d = ||p_i − p_0||

Δ_v = ||v_i − v_0||

where p_i = (x_i, y_i) denotes the coordinates of the i-th millimeter wave radar data point, p_0 denotes the coordinates of the first clustering center, x_i and y_i denote the x and y coordinates respectively, v_i denotes the speed value of the i-th millimeter wave radar data point, and v_0 denotes the speed value of the first clustering center.
The calculated speed difference and Euclidean distance are compared with the set thresholds; when both are greater than the set thresholds, the second data point is judged to be the second clustering center.

A third data point is selected from the remaining data points, and its speed difference and Euclidean distance from each existing clustering center are calculated; when all the calculated speed differences and Euclidean distances are greater than the set thresholds, the third data point is judged to be the third clustering center. This is repeated until all data points have been examined, giving a plurality of clustering centers.
2.2.3. Data points that are not clustering centers are categorized according to the clustering centers of step 2.2.2
Any data point other than those serving as clustering centers in the same frame is selected, and its speed difference and distance from each clustering center are calculated.

The clustering center closest to the current data point is taken: if the speed difference is less than the allowance Δv_T, the millimeter wave radar data point belongs to the cluster corresponding to that clustering center; if the speed difference is greater than the allowance Δv_T, that clustering center is discarded, the second-closest one is taken, and the judgment is repeated as above. This continues until all data points are clustered; after the clustering clusters are obtained, the mean speed and position of all data points in each cluster are calculated as the clustering center of that cluster.
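As a concrete illustration, the following is a minimal Python sketch of this two-pass nearest neighbor clustering; the array layout ([x, y, v] rows per frame) and the way the thresholds are passed are illustrative assumptions consistent with the description above, not the patent's own code:

```python
import numpy as np

def nn_cluster(points, T_v, T_d, dv_T):
    """Nearest-neighbour clustering of one radar frame.

    points: (N, 3) array of [x, y, v] detections.
    T_v, T_d: speed/distance thresholds for creating a new cluster center.
    dv_T: speed-difference allowance for assigning a point to a cluster.
    Returns an (M, 3) array of cluster centers (mean position and speed).
    """
    center_idx = [0]                              # first point seeds the first center
    for i in range(1, len(points)):               # pass 1: select cluster centers
        far = all(
            np.linalg.norm(points[i, :2] - points[j, :2]) > T_d
            and abs(points[i, 2] - points[j, 2]) > T_v
            for j in center_idx
        )
        if far:
            center_idx.append(i)
    clusters = [[points[j]] for j in center_idx]
    for i in range(len(points)):                  # pass 2: assign remaining points
        if i in center_idx:
            continue
        dists = [np.linalg.norm(points[i, :2] - points[j, :2]) for j in center_idx]
        for k in np.argsort(dists):               # nearest center first, fall back on speed mismatch
            if abs(points[i, 2] - points[center_idx[k], 2]) < dv_T:
                clusters[k].append(points[i])
                break
    return np.array([np.mean(c, axis=0) for c in clusters])

# Example frame with two moving targets.
frame = np.array([[1.0, 2.0, 5.0], [1.2, 2.1, 5.1], [8.0, 3.0, -2.0]])
print(nn_cluster(frame, T_v=1.0, T_d=2.0, dv_T=0.5))  # two cluster centers
```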
2.3. A Kalman filter is constructed for each clustering center to predict its position and speed at the next moment t+1.
based on the position and the speed, matching the prediction result at the t +1 moment with the clustering center at the t +1 moment in the step 2.2 in a neighbor matching mode; then, updating the parameters of the Kalman filter by using the matched clustering center at the time t +1, and then taking the updated parameters as initial values of next prediction; and directly predicting the current prediction result for the unmatched cluster centers next time, and if two continuous frames are unmatched to the corresponding target, determining that the centers are driven out of the observable range.
The present embodiment illustrates the construction of the Kalman filter and its prediction process with a single target.

To simplify the formulas, and following the standard Kalman filter derivation, this embodiment defines an acceleration a as the control input with control matrix B = [Δt²/2  Δt]^T, and a state transition matrix F = [[1, Δt], [0, 1]].

The state of a clustering target at time t, X_t = [x_t  v_t]^T (its position coordinate and speed value), is taken as the input, and a covariance matrix P_t represents the correlation between position and speed from time t to time t+1. Combining the defined acceleration a and control matrix B, the speed value and position at time t+1 are predicted:

X̂_{t+1} = F X_t + B a

P_{t+1} = F P_t F^T + Q

The update step equations in the prediction process are as follows:

K' = P_{t+1} H_{t+1}^T (H_{t+1} P_{t+1} H_{t+1}^T + R)^{-1}

X'_{t+1} = X̂_{t+1} + K' (z_{t+1} − H_{t+1} X̂_{t+1})

P'_{t+1} = P_{t+1} − K' H_{t+1} P_{t+1}

where K' is the Kalman gain, H_{t+1} is the observation matrix mapping the state to the sensor data, z_{t+1} is the observed value, Q is the process noise covariance, and R is the measurement noise covariance.
Note that the observed value refers to the coordinate value of the target at the next time t+1; after the Kalman filter predicts the position of the target at the next moment and obtains the predicted coordinate value, it is matched with the measured value at that moment using the Euclidean distance.

Multi-target tracking is built on single-target tracking by employing several single-target Kalman filters; the calculation is the same as for a single target.
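The following minimal constant-velocity Kalman filter sketch mirrors the prediction and update equations above; the noise covariances Q and R, the time step, and the assumption that the radar measurement z contains both position and speed are illustrative choices, not values given in the patent:

```python
import numpy as np

class TargetKF:
    """Single-target Kalman filter over state X = [position, speed]^T."""

    def __init__(self, x0, v0, dt=0.1, a=0.0):
        self.X = np.array([[x0], [v0]])
        self.P = np.eye(2)                        # position/speed covariance P_t
        self.F = np.array([[1, dt], [0, 1]])      # constant-velocity transition
        self.B = np.array([[0.5 * dt**2], [dt]])  # control matrix for acceleration a
        self.a = a
        self.H = np.eye(2)                        # radar observes position and speed (assumed)
        self.Q = 0.01 * np.eye(2)                 # process noise (assumed)
        self.R = 0.10 * np.eye(2)                 # measurement noise (assumed)

    def predict(self):
        self.X = self.F @ self.X + self.B * self.a
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.X

    def update(self, z):
        """z: matched cluster-center measurement [[position], [speed]]."""
        K = self.P @ self.H.T @ np.linalg.inv(self.H @ self.P @ self.H.T + self.R)
        self.X = self.X + K @ (z - self.H @ self.X)
        self.P = self.P - K @ self.H @ self.P
        return self.X

kf = TargetKF(x0=20.0, v0=-5.0)
kf.predict()                            # predicted state at t+1
kf.update(np.array([[19.6], [-5.2]]))   # correct with the matched cluster center
```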
2.5. Steps 2.3 and 2.4 are repeated to associate the same target across consecutive frames, obtaining the short-time speed change curve of the millimeter wave radar target.
Note the following for the above process: during neighbor matching, a data point that is not matched is judged by its speed value as to whether it is a vehicle target, and a new single-target Kalman filter is created for it. A Kalman filter that is not matched continues to predict for 2 frames; if no matching observation appears within those 2 frames, the filter is abandoned and the target is considered to have exited the observable area. The detection results of 4 consecutive frames are taken to obtain the final short-time speed change curve of the millimeter wave radar target. In this embodiment, the speed curves obtained from the image and from the millimeter wave radar are shown in FIG. 4: the dashed line is the speed curve of the image target, and the solid line is the speed curve of the millimeter wave radar target.
Step 3, carrying out similarity measurement between the image target and the millimeter wave radar target so as to realize target-level fusion of the image and the millimeter wave radar
3.1. According to the timestamp information of the image sequence and the point cloud data, the two are registered in time to ensure temporal consistency of the fused data from the two sensors.
3.2. Based on the assumption that the same target has the same speed under different sensor coordinate systems, the speed difference degree V_d between a millimeter wave radar target and an image target is defined as:

V_d = (1/Frame_num) · Σ_{i=1}^{Frame_num} |v_radar,i − v_camera,i|

where v_radar,i denotes the speed value of the millimeter wave radar target at time i, v_camera,i denotes the speed value of the image target at time i, and Frame_num denotes the number of frames used to calculate the speed difference degree.
3.3. The difference degree between each millimeter wave radar target and each image target at the same moment is calculated with the speed difference formula defined in step 3.2, and the target corresponding to the minimum value is taken as the matching result, completing the similarity measurement of targets under the two sensor coordinate systems and realizing target-level fusion of the image and the millimeter wave radar.
If different millimeter wave radar targets are matched to the same image target, their difference degrees with that image target are compared, and the millimeter wave radar target with the minimum difference degree is selected for the match. The remaining millimeter wave radar targets are then re-matched against the remaining image targets with the already matched image target removed; the matching result is shown in FIG. 5.
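The following minimal sketch illustrates this speed-curve matching, taking the mean absolute difference of the aligned curves as V_d and resolving conflicts greedily by smallest difference as described above; the target IDs and curve values are illustrative assumptions:

```python
import numpy as np

def velocity_difference(v_radar, v_camera):
    """Speed difference degree V_d between two aligned short-time speed curves."""
    return np.mean(np.abs(np.asarray(v_radar) - np.asarray(v_camera)))

def match_targets(radar_curves, image_curves):
    """radar_curves, image_curves: dicts mapping target ID -> speed curve
    (lists of equal length after timestamp registration).
    Returns {radar_id: image_id}, resolving conflicts by smallest V_d."""
    pairs = sorted(
        ((velocity_difference(rv, iv), r_id, i_id)
         for r_id, rv in radar_curves.items()
         for i_id, iv in image_curves.items()),
        key=lambda t: t[0],
    )
    matches, used_r, used_i = {}, set(), set()
    for vd, r_id, i_id in pairs:   # greedy: smallest difference degree first
        if r_id not in used_r and i_id not in used_i:
            matches[r_id] = i_id
            used_r.add(r_id)
            used_i.add(i_id)
    return matches

radar = {"R1": [-6.0, -5.1, -5.9, -5.0], "R2": [2.1, 2.0, 2.2, 1.9]}
image = {"I1": [2.0, 2.1, 2.1, 2.0], "I2": [-5.8, -5.2, -6.0, -5.1]}
print(match_targets(radar, image))  # {'R2': 'I1', 'R1': 'I2'}
```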
It can be seen that the present invention provides a fusion method different from the prior art. Throughout the fusion process, the image and the millimeter wave radar are treated as two independent parts, each producing its own short-time speed curve, so the strengths of both sensors are fully exploited and the accuracy of target recognition is preserved. In the image part, the mature CenterTrack network is used directly for tracking, giving accurate and complete image target information; in the millimeter wave radar part, multi-target tracking is completed by combining the nearest neighbor clustering algorithm with Kalman filters, which reduces the overall computation and avoids losing millimeter wave radar targets during tracking. Target matching is then performed on the tracked short-time speed curves by measuring their degree of difference. Target-level fusion of the millimeter wave radar and the camera is thus realized without joint calibration of the camera and the millimeter wave radar.

Claims (3)

1. A fusion target detection and tracking method based on vision and millimeter wave radar is characterized in that: the method comprises the following steps:
step 1, respectively obtaining a short-time speed change curve of an image target and a short-time speed change curve of a millimeter wave radar target;
the short-time speed change curve obtaining process of the image target comprises the following steps:
(1) Detecting and tracking the image target by adopting a 3D target tracking network to obtain an image sequence containing target behaviors; marking the depth value of each target and the ID corresponding to each target for each image;
(2) Calculating the speed value of each target at each moment, the speed of a target at time t being the ratio of the depth difference of the same target in two adjacent frames to the time interval, to obtain the multi-target short-time speed change curves in the image;
the short-time speed change curve of the millimeter wave radar target is obtained in the following process:
(1) Acquiring millimeter wave radar point cloud data with the same image timestamp, and filtering out static targets in the point cloud data;
(2) Clustering data points in the point cloud data by adopting a nearest neighbor clustering algorithm to obtain a plurality of clustering centers; regarding each clustering center as a target, and respectively calculating the position coordinate and the speed value of each clustering center at each moment;
(3) Respectively constructing, for each clustering center, a Kalman filter for predicting the position and speed at the next moment t+1; based on position and speed, matching the prediction result with the clustering centers at the corresponding moment by neighbor matching, updating the parameters of the Kalman filter with the clustering center matched at time t+1, and then taking the updated parameters as the initial values of the next prediction; for an unmatched clustering center, directly taking the current prediction result as the initial value of the next prediction, and if no corresponding target is matched for two consecutive frames, judging that the center has left the observable range;
repeating the above process to associate the same target across consecutive frames, obtaining the short-time speed change curve of the millimeter wave radar target;
step 2, carrying out similarity measurement between the image target and the millimeter wave radar target so as to realize target-level fusion of the image and the millimeter wave radar
(1) According to the timestamp information of the image sequence and the point cloud data, time sequence registration is carried out on the image sequence and the point cloud data, and the time consistency of the fused data of the two sensors is ensured;
(2) Based on the assumption that the same target has the same speed under different sensor coordinate systems, defining the speed difference degree V_d between the millimeter wave radar target and the image target:

V_d = (1/Frame_num) · Σ_{i=1}^{Frame_num} |v_radar,i − v_camera,i|

wherein v_radar,i denotes the speed value of the millimeter wave radar target at time i, v_camera,i denotes the speed value of the image target at time i, and Frame_num denotes the number of frames used to calculate the speed difference degree;
(3) Calculating the difference degree between each millimeter wave radar target and each image target at the same moment with the defined speed difference formula, and taking the target corresponding to the minimum value as the matching result, completing the similarity measurement of targets under the two sensor coordinate systems and realizing target-level fusion of the image and the millimeter wave radar.
2. The fusion target detection and tracking method based on vision and millimeter wave radar according to claim 1, characterized in that: in obtaining the millimeter wave radar short-time speed curve, the nearest neighbor clustering algorithm clusters the data points in the point cloud data into a plurality of clustering centers as follows:
(1) Setting a speed difference threshold T_v, a distance threshold T_d, and a speed difference allowance Δv_T between two points;
(2) Clustering data points of each frame of the millimeter wave radar to obtain a plurality of clustering centers
Firstly, selecting the first data point as the first clustering center; then selecting a second data point, and calculating the speed difference and the Euclidean distance between the second data point and the first clustering center;
comparing the calculated speed difference and Euclidean distance with the set thresholds, and when both are greater than the set thresholds, judging the second data point to be the second clustering center;
selecting a third data point from the remaining data points, and calculating its speed difference and Euclidean distance from each of the previous two clustering centers; when all the calculated speed differences and Euclidean distances are greater than the set thresholds, judging the third data point to be the third clustering center; repeating this until all data points have been examined, obtaining a plurality of clustering centers;
(3) Categorizing non-cluster-centered data points
Selecting any data point that is not a clustering center, and calculating its speed difference and distance from each clustering center; taking the clustering center closest to the current data point: if the speed difference is less than the allowance Δv_T, the millimeter wave radar data point belongs to the cluster corresponding to that clustering center; if the speed difference is greater than the allowance Δv_T, discarding that clustering center and taking the second-closest one;
and then, circulating until all data points finish clustering, and calculating the average value of the speed and the position of all data points in the cluster as the clustering center of the clustering cluster after obtaining a plurality of clustering clusters.
3. The fusion target detection and tracking method based on vision and millimeter wave radar as claimed in claim 1, wherein the 3D target tracking network adopted for detecting and tracking the image target is CenterTrack.
CN202111018876.5A 2021-09-01 2021-09-01 Fusion target detection and tracking method based on vision and millimeter wave radar Active CN113848545B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111018876.5A CN113848545B (en) 2021-09-01 2021-09-01 Fusion target detection and tracking method based on vision and millimeter wave radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111018876.5A CN113848545B (en) 2021-09-01 2021-09-01 Fusion target detection and tracking method based on vision and millimeter wave radar

Publications (2)

Publication Number Publication Date
CN113848545A CN113848545A (en) 2021-12-28
CN113848545B 2023-04-14

Family

ID=78976637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111018876.5A Active CN113848545B (en) 2021-09-01 2021-09-01 Fusion target detection and tracking method based on vision and millimeter wave radar

Country Status (1)

Country Link
CN (1) CN113848545B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114442101B (en) * 2022-01-28 2023-11-14 南京慧尔视智能科技有限公司 Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
CN116977362A (en) * 2022-04-20 2023-10-31 深圳市普渡科技有限公司 Target tracking method, device, computer equipment and storage medium
CN115542308B (en) * 2022-12-05 2023-03-31 德心智能科技(常州)有限公司 Indoor personnel detection method, device, equipment and medium based on millimeter wave radar

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19746970A1 (en) * 1997-10-24 1999-04-29 Cit Alcatel Obstacle recognition for rail vehicle with automatic guidance
JP2008145295A (en) * 2006-12-11 2008-06-26 Mitsubishi Electric Corp Sensor system
CN107817488A (en) * 2017-09-28 2018-03-20 西安电子科技大学昆山创新研究院 The unmanned plane obstacle avoidance apparatus and barrier-avoiding method merged based on millimetre-wave radar with vision
CN110532896A (en) * 2019-08-06 2019-12-03 北京航空航天大学 A kind of road vehicle detection method merged based on trackside millimetre-wave radar and machine vision
CN111157984A (en) * 2020-01-08 2020-05-15 电子科技大学 Pedestrian autonomous navigation method based on millimeter wave radar and inertial measurement unit
CN111862157A (en) * 2020-07-20 2020-10-30 重庆大学 Multi-vehicle target tracking method integrating machine vision and millimeter wave radar
CN111967498A (en) * 2020-07-20 2020-11-20 重庆大学 Night target detection and tracking method based on millimeter wave radar and vision fusion

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7460951B2 (en) * 2005-09-26 2008-12-02 Gm Global Technology Operations, Inc. System and method of target tracking using sensor fusion
US10634778B2 (en) * 2014-10-21 2020-04-28 Texas Instruments Incorporated Camera assisted tracking of objects in a radar system
GB2590115B (en) * 2019-09-13 2023-12-06 Motional Ad Llc Extended object tracking using radar

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19746970A1 (en) * 1997-10-24 1999-04-29 Cit Alcatel Obstacle recognition for rail vehicle with automatic guidance
JP2008145295A (en) * 2006-12-11 2008-06-26 Mitsubishi Electric Corp Sensor system
CN107817488A (en) * 2017-09-28 2018-03-20 西安电子科技大学昆山创新研究院 The unmanned plane obstacle avoidance apparatus and barrier-avoiding method merged based on millimetre-wave radar with vision
CN110532896A (en) * 2019-08-06 2019-12-03 北京航空航天大学 A kind of road vehicle detection method merged based on trackside millimetre-wave radar and machine vision
CN111157984A (en) * 2020-01-08 2020-05-15 电子科技大学 Pedestrian autonomous navigation method based on millimeter wave radar and inertial measurement unit
CN111862157A (en) * 2020-07-20 2020-10-30 重庆大学 Multi-vehicle target tracking method integrating machine vision and millimeter wave radar
CN111967498A (en) * 2020-07-20 2020-11-20 重庆大学 Night target detection and tracking method based on millimeter wave radar and vision fusion

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Josip Ćesić et al. Radar and stereo vision fusion for multitarget tracking on the special Euclidean group. Robotics and Autonomous Systems. 2016, vol. 83, 338-348. *
Zhang Bingli et al. Vehicle detection based on fusion of millimeter wave radar and machine vision. Automotive Engineering. 2021, vol. 43, no. 4, 478-484. *
Li Ning et al. Research on target-level fusion of automotive ACC millimeter wave radar and vision sensors. Automobile Applied Technology. 2021, vol. 46, no. 7, 19-22. *
Gan Yaodong et al. Multi-target detection and tracking fusing millimeter wave radar and deep vision. Automotive Engineering. 2021, vol. 43, no. 7, 1022-1029. *

Also Published As

Publication number Publication date
CN113848545A (en) 2021-12-28

Similar Documents

Publication Publication Date Title
CN113848545B (en) Fusion target detection and tracking method based on vision and millimeter wave radar
CN111693972B (en) Vehicle position and speed estimation method based on binocular sequence images
CN107330925B (en) Multi-obstacle detection and tracking method based on laser radar depth image
CN109949375B (en) Mobile robot target tracking method based on depth map region of interest
CN111369541B (en) Vehicle detection method for intelligent automobile under severe weather condition
CN111260683A (en) Target detection and tracking method and device for three-dimensional point cloud data
CN103064086B (en) Vehicle tracking method based on depth information
CN110942449A (en) Vehicle detection method based on laser and vision fusion
CN102806913B (en) Novel lane line deviation detection method and device
CN111932580A (en) Road 3D vehicle tracking method and system based on Kalman filtering and Hungary algorithm
CN108645375B (en) Rapid vehicle distance measurement optimization method for vehicle-mounted binocular system
CN115032651A (en) Target detection method based on fusion of laser radar and machine vision
CN112991391A (en) Vehicle detection and tracking method based on radar signal and vision fusion
CN113850102B (en) Vehicle-mounted vision detection method and system based on millimeter wave radar assistance
CN114170274B (en) Target tracking method and device, electronic equipment and storage medium
CN111723778B (en) Vehicle distance measuring system and method based on MobileNet-SSD
CN114495064A (en) Monocular depth estimation-based vehicle surrounding obstacle early warning method
CN109541601A (en) Differentiating obstacle and its detection method based on millimeter wave
CN115308732A (en) Multi-target detection and tracking method integrating millimeter wave radar and depth vision
CN114280611A (en) Road side sensing method integrating millimeter wave radar and camera
CN110703272B (en) Surrounding target vehicle state estimation method based on vehicle-to-vehicle communication and GMPHD filtering
CN113221739B (en) Monocular vision-based vehicle distance measuring method
CN112489080A (en) Binocular vision SLAM-based vehicle positioning and vehicle 3D detection method
CN111539278A (en) Detection method and system for target vehicle
CN115471526A (en) Automatic driving target detection and tracking method based on multi-source heterogeneous information fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant