CN115390066A - Improved Hungarian algorithm target tracking matching method based on fusion of camera and millimeter wave radar - Google Patents

Improved Hungarian algorithm target tracking matching method based on fusion of camera and millimeter wave radar

Info

Publication number
CN115390066A
CN115390066A (application CN202210988217.2A)
Authority
CN
China
Prior art keywords
target
matching
track
fusion
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210988217.2A
Other languages
Chinese (zh)
Inventor
罗马思阳
王利杰
万印康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Guangmu Automobile Technology Co ltd
Original Assignee
Suzhou Guangmu Automobile Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Guangmu Automobile Technology Co ltd filed Critical Suzhou Guangmu Automobile Technology Co ltd
Priority to CN202210988217.2A priority Critical patent/CN115390066A/en
Publication of CN115390066A publication Critical patent/CN115390066A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 - Combination of radar systems with cameras
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 - Systems of measurement based on relative movement of target
    • G01S13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/66 - Radar-tracking systems; Analogous systems
    • G01S13/72 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar, by using numerical data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses an improved Hungarian algorithm target tracking matching method based on the fusion of a camera and a millimeter wave radar. The target sets sent to the fusion center by the sensors are Φ_radar,refr and Φ_camera,refc, where the subscript radar indicates that the sensor is the millimeter wave radar and the subscript refr indicates that the reference coordinate system is that of the millimeter wave radar. The target tracking matching method comprises the following steps: S1, time registration of the multi-sensor data; S2, spatial registration of the multi-sensor data; S3, matching of sensor targets with system tracks. The accuracy of the Hungarian matching algorithm is improved; decomposing and reducing the matrix dimension increases the Hungarian matching speed; the target tracking fusion matching combining Hungarian matching and Kalman filtering is stable and suitable for the fusion matching of various sensors.

Description

Improved Hungarian algorithm target tracking matching method based on fusion of camera and millimeter wave radar
Technical Field
The invention belongs to the technical field of automobile driving, and particularly relates to an improved Hungarian algorithm target tracking matching method based on fusion of a camera and a millimeter wave radar.
Background
With the continuous progress of the automobile industry and the rising economic level of China, the number of vehicles on China's roads keeps increasing. This brings many problems, such as frequent traffic accidents, worsening environmental pollution, and traffic congestion. According to research institutions, poor driving behavior and failure to strictly obey traffic regulations are the main causes of the frequent traffic accidents of various types. At present, manufacturers continuously optimize automobile structures and apply high-strength materials to improve passive safety, while Advanced Driving Assistance Systems (ADAS), iteratively upgraded through technology, play a powerful role. An ADAS can alert the driver in an emergency or proactively take necessary safety measures, which greatly reduces the probability and severity of traffic accidents.
The sensors commonly used at present are lidar, millimeter wave radar, cameras, ultrasonic radar, and the like. Millimeter wave radar and cameras are the most common ADAS sensors: they are low in cost, their technology is relatively mature, and they are easy to industrialize. The camera is inexpensive, can distinguish different objects, and is particularly strong in measuring object height and width and in recognizing pedestrians and road signs; a binocular camera mounted at the front of the vehicle can also provide positioning. However, camera detection is easily affected by harsh environments, such as rain, fog, and darkness. The millimeter wave radar measures distance and speed by emitting electromagnetic waves and is unaffected by illumination and weather, but it cannot recognize lane lines, traffic signs, and the like. Combining the camera and the radar for road-environment perception therefore complements the advantages of both, realizes a stable and reliable ADAS function, and is of great significance for improving driving safety. Multi-sensor fusion improves the accuracy and robustness of vehicle detection and tracking. By level of data abstraction, an information fusion system can be divided into three levels: data-level fusion, feature-level fusion, and decision-level fusion. Data-level fusion fuses the raw sensor observations directly and performs feature extraction and decision-making on the fused result; it requires sensors of the same type. Feature-level fusion can be divided into two categories: target state information fusion and target feature information fusion.
Target state information fusion is mainly used in the field of multi-sensor target tracking: after data registration of the sensor data is completed, data association and state estimation are performed. Decision-level fusion is high-level fusion: each sensor makes a decision based on its own data, and the fusion center then completes the fusion of the local decisions.
Fusion of multiple sensors at the target level has become an important direction. Hungarian matching is one of the common target-track matching algorithms: it is simple and consumes few resources, but in practice, to guarantee a globally maximum matching, it often forcibly matches targets that should not be matched, causing target tracks that should be correctly matched to be mismatched. Improving the matching accuracy of targets in target-level fusion is therefore of great significance, and for this purpose an improved Hungarian algorithm target tracking matching method based on the fusion of a camera and a millimeter wave radar is provided.
Disclosure of Invention
The invention aims to provide an improved Hungarian algorithm target tracking matching method based on the fusion of a camera and a millimeter wave radar, so as to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the following technical scheme: a target tracking matching method based on an improved Hungarian algorithm with fusion of a camera and a millimeter wave radar, comprising the sensors adopted by the system, wherein the target sets sent to the fusion center by the sensors adopted by the system are Φ_radar,refr and Φ_camera,refc, where the subscript radar indicates that the sensor is the millimeter wave radar and the subscript refr indicates that the reference coordinate system is that of the millimeter wave radar;
the target tracking and matching method comprises the following steps:
s1, carrying out time registration on multi-sensor data;
s2, carrying out spatial registration on the multi-sensor data;
s3, matching a sensor target with a system track;
the sensor target-system track matching comprises the following steps:
A. screening out matched targets: when a single sensor locally detects and tracks targets, it maintains a local track set and assigns each track a track ID to distinguish which target is tracked; after the target set is sent to the fusion center, the IDs are matched against the target track set maintained by the fusion center, and if the same ID exists, it is the same tracked target and is matched directly;
B. calculating a correlation matrix M: calculating the distance similarity between each target in the unmatched target set and each track in the unmatched system tracks, and filling the distance similarity to the corresponding position of the incidence matrix, wherein the distance similarity is calculated in the following way:
d = p_x(x_s - x_c)^2 + p_y(y_s - y_c)^2 + p_z(z_s - z_c)^2 + p_v(v_s - v_c)^2 + p_φ(φ_s - φ_c)^2
C. the maximum connected subgraph segmentation comprises the following steps:
step one, carrying out binarization on the incidence matrix to form a new graph G:
step two, subgraph segmentation: adopting a data-structure method, using depth-first search with backtracking or breadth-first search, the maximum connected subgraphs and the nodes they contain can be separated and recorded; a new incidence matrix is formed for each subgraph, the value at each position being taken from the initial incidence matrix M and representing the distance similarity between a target and a system track, giving the incidence matrix set Ms = {M_1, M_2, ..., M_n}, where n is the number of segmented connected subgraphs;
D. matching in a subgraph;
E. and (6) sorting and checking.
Preferably, in the new graph matrix of the new graph G in the maximum connected subgraph segmentation step, the value of each element in the new graph matrix is as follows:
Figure RE-GDA0003891243690000041
Preferably, in the new graph matrix in the S3 maximum connected subgraph segmentation, M_i,j represents the distance similarity at the corresponding position of the incidence matrix, th_sensor1 represents the threshold set when sensor 1 is matched with the system tracks, size(unassignedobjs) represents the number of unmatched targets, i.e. the number of rows of the incidence matrix, and size(unassignedtracks) represents the number of unmatched system tracks, i.e. the number of columns; a value of 1 indicates that the two nodes in the graph are connected, and 0 indicates that they are not connected.
Preferably, the time registration in S1 is: taking a newer timestamp t in the camera and the millimeter wave target set as a predicted time, namely
Figure RE-GDA0003891243690000042
then the targets whose timestamps are earlier than the prediction time are predicted forward to it using the prediction step of Kalman filtering with a constant velocity model; the prediction step formula of Kalman filtering is as follows:
the system state equation is:
x_k = A x_{k-1} + B u_{k-1} + w_{k-1}
then the prediction step estimates the state at the current time k from the posterior estimate at the previous time k-1, giving the prior estimate at time k:
x̂_k^- = A x̂_{k-1} + B u_{k-1}

P_k^- = A P_{k-1} A^T + Q
thereby realizing time synchronization through the prediction part of the Kalman filter.
Preferably, the spatial registration of S2 is: carrying out coordinate transformation on the targets sent to the fusion center by different sensors, transforming the sensor data to the reference frame of the system track of the fusion center:

[x_t, y_t, z_t, 1]^T = [[R, t], [0, 1]] · [x_s, y_s, z_s, 1]^T

where [x_s, y_s, z_s, 1]^T represents the homogeneous coordinates formed by the coordinates of the target in the sensor coordinate system, [x_t, y_t, z_t, 1]^T represents the homogeneous coordinates after transformation to the system track coordinate system, R represents the rotation matrix from the sensor coordinates to the system track coordinates, and t represents the translation vector from the sensor coordinates to the system track coordinates; both are determined by the mounting position of the sensor hardware.
Preferably, the pseudo code for filtering out the matched target by the track matching in S3 is as follows:
[pseudo code given as an image in the original publication]

The successfully matched target-track pairs, the unmatched targets and the unmatched tracks are then sorted.
Preferably, in the calculation of the correlation matrix M in the track matching of S3, x_s, y_s, z_s, v_s, φ_s are respectively the coordinates, speed and position angle of the sensor target in the system track coordinate system after the prediction step has been applied; x_c, y_c, z_c, v_c, φ_c are respectively the coordinates, speed and position angle saved in the system track after the prediction step has been applied; and p_x, p_y, p_z, p_v, p_φ are the weight coefficients of the corresponding distance similarity components;
and when the obtained distance similarity d is greater than the set threshold, d is directly set to the maximum value of the data type, marking the pair as unmatchable.
Preferably, the intra-subgraph matching of S4 applies the Hungarian matching method inside each connected subgraph, finally obtaining, within each subgraph: the target-track matching pairs Φ_match,i, the unmatched targets Φ_umobjs,i, and the unmatched tracks Φ_umtracks,i.
Preferably, the sorting and verifying of S5 comprises: sorting the matching results of each subgraph matched in S4 to obtain the target-track matching pairs, unmatched targets and unmatched tracks as follows:

Φ_match = ∪_{i=1..n} Φ_match,i

Φ_umobjs = ∪_{i=1..n} Φ_umobjs,i

Φ_umtracks = ∪_{i=1..n} Φ_umtracks,i

where ∪ represents the union and n is the number of subgraphs;
each matching pair in Φ_match is then verified against a set threshold; if its distance exceeds the threshold, the pair becomes unmatched, and the corresponding target and track are added to Φ_umobjs and Φ_umtracks.
Preferably, the matching result between the sensor target set and the system track set is the matching pair set Φ_match, the unmatched targets Φ_umobjs, and the unmatched tracks Φ_umtracks.
Compared with the prior art, the invention has the beneficial effects that:
1. the accuracy of the Hungarian matching algorithm is improved; the matrix dimension is decomposed and reduced, so that the Hungarian matching speed is increased; the target tracking fusion matching effect of the Hungarian matching and the Kalman filtering is stable, and the method is suitable for the fusion matching work of various sensors.
2. The invention applies a pruning operation, connected subgraph segmentation, to the situations where Hungarian matching is prone to mismatching, which improves the matching accuracy; on the other hand, because pruning reduces the number of targets, i.e. the dimensionality of the incidence matrix, the time complexity of the program can be reduced by an exponential factor.
Drawings
FIG. 1 is a schematic block diagram of the process of the present invention;
FIG. 2 is a schematic diagram of the shortcomings of forced matching of the Hungarian matching algorithm of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1 and fig. 2, the present invention provides a technical solution: an improved Hungarian algorithm target tracking matching method based on fusion of a camera and a millimeter wave radar, comprising the sensors adopted by the system, wherein the target sets sent to the fusion center by the sensors adopted by the system are Φ_radar,refr and Φ_camera,refc, where the subscript radar indicates that the sensor is the millimeter wave radar and the subscript refr indicates that the reference coordinate system is that of the millimeter wave radar;
the target tracking matching method comprises the following steps:
S1, time registration of the multi-sensor data: taking the newer timestamp t of the camera and millimeter wave target sets as the prediction time, namely

t = max(t_camera, t_radar)
then the targets whose timestamps are earlier than the prediction time are predicted forward to it using the prediction step of Kalman filtering with a constant velocity model; the prediction step formula of Kalman filtering is as follows:
the system state equation is:
x_k = A x_{k-1} + B u_{k-1} + w_{k-1}
then the prediction step estimates the state at the current time k from the posterior estimate at the previous time k-1, giving the prior estimate at time k:
x̂_k^- = A x̂_{k-1} + B u_{k-1}

P_k^- = A P_{k-1} A^T + Q
thereby realizing time synchronization through the prediction part of the Kalman filter;
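The constant-velocity Kalman prediction step described above can be sketched as follows; the state layout [px, py, vx, vy] and the process-noise scale q are illustrative assumptions, not values from this specification:

```python
import numpy as np

def cv_predict(x, P, dt, q=0.1):
    # Constant-velocity Kalman prediction step.  State x = [px, py, vx, vy];
    # P is its 4x4 covariance.  The layout and q are illustrative assumptions.
    A = np.array([[1.0, 0.0, dt, 0.0],
                  [0.0, 1.0, 0.0, dt],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    Q = q * np.eye(4)               # simplified process-noise covariance
    x_prior = A @ x                 # x_k^- = A x_{k-1}  (no control input)
    P_prior = A @ P @ A.T + Q       # P_k^- = A P_{k-1} A^T + Q
    return x_prior, P_prior

# Advance an older radar detection by 50 ms to the newer camera timestamp.
x0 = np.array([10.0, 2.0, 5.0, 0.0])   # 10 m ahead, moving 5 m/s forward
x1, P1 = cv_predict(x0, np.eye(4), dt=0.05)
print(x1[0])  # 10.25
```

The older target is thus propagated to the common prediction time t before any matching is attempted.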
S2, spatial registration of the multi-sensor data: because the coordinate systems referenced by the targets sent to the fusion center by different sensors are not necessarily the same, the sensor data are transformed by coordinate transformation to the reference frame of the system track of the fusion center:

[x_t, y_t, z_t, 1]^T = [[R, t], [0, 1]] · [x_s, y_s, z_s, 1]^T

where [x_s, y_s, z_s, 1]^T represents the homogeneous coordinates formed by the coordinates of the target in the sensor coordinate system, [x_t, y_t, z_t, 1]^T represents the homogeneous coordinates after transformation to the system track coordinate system, R represents the rotation matrix from the sensor coordinates to the system track coordinates, and t represents the translation vector from the sensor coordinates to the system track coordinates; both are determined by the mounting position of the sensor hardware;
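The homogeneous-coordinate transformation can be sketched as follows; the mounting pose (R, t) used in the example is made up for illustration:

```python
import numpy as np

def to_track_frame(p_sensor, R, t):
    # Build the homogeneous transform [[R, t], [0, 1]] and apply it to a
    # sensor-frame point.  R and t encode the sensor mounting pose; the
    # values used below are made up for illustration.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    p_h = np.append(p_sensor, 1.0)   # homogeneous coordinates [x, y, z, 1]
    return (T @ p_h)[:3]

# Radar mounted 1.5 m ahead of the track-frame origin, axes aligned.
R = np.eye(3)
t = np.array([1.5, 0.0, 0.0])
p_track = to_track_frame(np.array([10.0, 2.0, 0.0]), R, t)
print(p_track[0])  # 11.5
```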
s3, matching the sensor target with a system track, wherein the matching comprises the following steps:
A. screening out matched targets: when a single sensor locally detects and tracks targets, it maintains a local track set and assigns each track a track ID to distinguish which target is tracked. After the target set is sent to the fusion center, the IDs are matched against the target track set maintained by the fusion center; if the same ID exists, it is the same tracked target and is matched directly. The pseudo code is as follows:
[pseudo code given as an image in the original publication]
sorting the successfully matched target-track pairs, unmatched targets and unmatched tracks;
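Since the pseudo code is reproduced only as an image in the source, a minimal sketch of the ID-based screening of step A, assuming dict-shaped target and track records, might look like:

```python
def screen_by_id(objs, tracks):
    # Step A sketch: pair sensor targets with fusion-center tracks that
    # share a track ID; everything else stays unmatched.  The record
    # shape ({"id": ...}) is an assumption for illustration.
    remaining = {trk["id"]: trk for trk in tracks}
    matched, um_objs = [], []
    for obj in objs:
        trk = remaining.pop(obj["id"], None)
        if trk is not None:
            matched.append((obj, trk))   # same ID -> same tracked target
        else:
            um_objs.append(obj)
    um_tracks = list(remaining.values())
    return matched, um_objs, um_tracks

objs = [{"id": 1, "x": 10.0}, {"id": 2, "x": 4.0}]
tracks = [{"id": 2, "x": 4.1}, {"id": 7, "x": 30.0}]
matched, um_objs, um_tracks = screen_by_id(objs, tracks)
print(len(matched), len(um_objs), len(um_tracks))  # 1 1 1
```

Only the leftover targets and tracks proceed to the correlation-matrix stage.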
B. calculating a correlation matrix M: calculating the distance similarity between each target in the unmatched target set and each track in the unmatched system tracks, and filling the distance similarity to the corresponding position of the incidence matrix, wherein the distance similarity is calculated in the following way:
d = p_x(x_s - x_c)^2 + p_y(y_s - y_c)^2 + p_z(z_s - z_c)^2 + p_v(v_s - v_c)^2 + p_φ(φ_s - φ_c)^2
where x_s, y_s, z_s, v_s, φ_s are respectively the coordinates, speed and position angle of the sensor target in the system track coordinate system after the prediction step has been applied; x_c, y_c, z_c, v_c, φ_c are respectively the coordinates, speed and position angle saved in the system track after the prediction step has been applied; and p_x, p_y, p_z, p_v, p_φ are the weight coefficients of the corresponding distance similarity components;
when the obtained distance similarity d is greater than the set threshold, d is directly set to the maximum value of the data type, marking the pair as unmatchable;
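The correlation-matrix computation of step B can be sketched as follows; since the similarity formula itself appears only as an image in the source, a weighted sum of squared differences over (x, y, z, v, φ) is assumed here, and the weights and threshold are illustrative:

```python
UNMATCHABLE = float("inf")  # "maximum value of the data type" sentinel

def distance_similarity(obj, trk, w):
    # Weighted sum of squared differences over (x, y, z, v, phi).
    # The exact functional form is an assumption.
    return sum(w[k] * (obj[k] - trk[k]) ** 2 for k in ("x", "y", "z", "v", "phi"))

def correlation_matrix(objs, trks, w, th):
    # Fill M[i][j] with the similarity of unmatched target i and track j;
    # over-threshold entries become the unmatchable sentinel.
    M = [[0.0] * len(trks) for _ in objs]
    for i, o in enumerate(objs):
        for j, t in enumerate(trks):
            d = distance_similarity(o, t, w)
            M[i][j] = d if d <= th else UNMATCHABLE
    return M

w = dict(x=1.0, y=1.0, z=1.0, v=0.5, phi=0.5)
objs = [dict(x=0.0, y=0.0, z=0.0, v=10.0, phi=0.0)]
trks = [dict(x=1.0, y=0.0, z=0.0, v=10.0, phi=0.0),
        dict(x=50.0, y=9.0, z=0.0, v=0.0, phi=0.0)]
M = correlation_matrix(objs, trks, w, th=10.0)
print(M[0][0], M[0][1])  # 1.0 inf
```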
C. maximum connected subgraph segmentation: because the Hungarian matching algorithm forms a maximum matching wherever possible, it is prone to the defect of forced matching. As shown in FIG. 2, to guarantee the globally maximum matching, Hungarian matching easily pairs point 1 with point 2 and point 3 with point 4. However, the distance between 1 and 2 exceeds the threshold, so after this pair is discarded in the final verification stage, 2 is left unmatched while 3 and 4 remain matched. To avoid this, the maximum connected subgraph segmentation method is adopted, with the following flow:
step one, binarize the incidence matrix to form a new graph G, where the value of each element of the graph matrix is:

G_i,j = 1, if M_i,j ≤ th_sensor1; G_i,j = 0, otherwise
i = 0, 1, ..., size(unassignedobjs)-1, j = 0, 1, ..., size(unassignedtracks)-1

where M_i,j represents the distance similarity at the corresponding position of the incidence matrix, th_sensor1 represents the threshold set when sensor 1 is matched with the system tracks, size(unassignedobjs) represents the number of unmatched targets, i.e. the number of rows of the incidence matrix, and size(unassignedtracks) represents the number of unmatched system tracks, i.e. the number of columns. A value of 1 indicates that the two nodes in the graph are connected, and 0 indicates that they are not;
step two, subgraph segmentation: this process adopts a data-structure method; using depth-first search with backtracking or breadth-first search, the maximum connected subgraphs and the nodes they contain can be conveniently separated. They are recorded, and a new incidence matrix is formed for each subgraph; the value at each position is taken from the initial incidence matrix M and represents the distance similarity between a target and a system track, giving the incidence matrix set Ms = {M_1, M_2, ..., M_n}, where n is the number of segmented connected subgraphs;
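The binarization and connected-subgraph segmentation can be sketched as follows with an iterative depth-first search over the bipartite target/track graph; the index bookkeeping is an assumption for illustration:

```python
def split_subgraphs(M, th):
    # Binarize the correlation matrix (G[i][j] = 1 iff M[i][j] <= th) and
    # split the resulting bipartite graph into connected components with an
    # iterative depth-first search.  Returns (object rows, track cols) per
    # component; isolated nodes (no edge at all) are skipped.
    n_obj = len(M)
    n_trk = len(M[0]) if M else 0
    adj = {u: [] for u in range(n_obj + n_trk)}
    for i in range(n_obj):
        for j in range(n_trk):
            if M[i][j] <= th:              # nodes i and (n_obj + j) connected
                adj[i].append(n_obj + j)
                adj[n_obj + j].append(i)
    seen, comps = set(), []
    for start in range(n_obj + n_trk):
        if start in seen or not adj[start]:
            continue
        stack, comp = [start], []
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            comp.append(u)
            stack.extend(adj[u])
        rows = sorted(u for u in comp if u < n_obj)
        cols = sorted(u - n_obj for u in comp if u >= n_obj)
        comps.append((rows, cols))
    return comps

M = [[1.0, 99.0, 99.0],
     [99.0, 2.0, 3.0],
     [99.0, 99.0, 4.0]]
print(split_subgraphs(M, th=10.0))  # [([0], [0]), ([1, 2], [1, 2])]
```

Each returned component yields a much smaller incidence matrix M_i, which is what makes the subsequent per-subgraph matching cheap.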
D. matching in a subgraph: the Hungarian matching method is applied inside each connected subgraph, finally obtaining, within each subgraph: the target-track matching pairs Φ_match,i, the unmatched targets Φ_umobjs,i, and the unmatched tracks Φ_umtracks,i;
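Matching inside one subgraph can be sketched as follows; a brute-force permutation search stands in for the Hungarian algorithm (in practice a library routine such as scipy.optimize.linear_sum_assignment would be used), which is affordable precisely because the segmented subgraphs are small:

```python
from itertools import permutations

UNMATCHABLE = float("inf")

def match_subgraph(M):
    # Minimum-cost assignment inside one connected subgraph.  Brute-force
    # stand-in for the Hungarian algorithm; unmatchable (infinite) entries
    # are capped at a large finite cost for comparison, then excluded from
    # the returned pairs.
    n_obj, n_trk = len(M), len(M[0])
    if n_obj > n_trk:                        # recurse on the transpose
        pairs, uo, ut = match_subgraph([list(col) for col in zip(*M)])
        return [(j, i) for i, j in pairs], ut, uo
    best_cost, best_pairs = UNMATCHABLE, []
    for perm in permutations(range(n_trk), n_obj):
        cost = sum(min(M[i][perm[i]], 1e12) for i in range(n_obj))
        if cost < best_cost:
            best_cost = cost
            best_pairs = [(i, perm[i]) for i in range(n_obj)
                          if M[i][perm[i]] < UNMATCHABLE]
    rows = {i for i, _ in best_pairs}
    cols = {j for _, j in best_pairs}
    um_objs = [i for i in range(n_obj) if i not in rows]
    um_trks = [j for j in range(n_trk) if j not in cols]
    return best_pairs, um_objs, um_trks

pairs, um_objs, um_trks = match_subgraph([[1.0, 5.0], [2.0, 1.0]])
print(pairs)  # [(0, 0), (1, 1)]
```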
E. sorting and verifying: the matching results of each subgraph matched in step D are sorted to obtain the target-track matching pairs, unmatched targets and unmatched tracks as follows:

Φ_match = ∪_{i=1..n} Φ_match,i

Φ_umobjs = ∪_{i=1..n} Φ_umobjs,i

Φ_umtracks = ∪_{i=1..n} Φ_umtracks,i

where ∪ represents the union and n is the number of subgraphs.
Each matching pair in Φ_match is then verified against a set threshold; if its distance exceeds the threshold, the pair becomes unmatched, and the corresponding target and track are added to Φ_umobjs and Φ_umtracks.
Thus the matching result between the sensor target set and the system track set is obtained: the matching pair set Φ_match, the unmatched targets Φ_umobjs, and the unmatched tracks Φ_umtracks, which can be used for the subsequent fusion work.
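The sorting-and-verification step can be sketched as follows; for simplicity the pair indices are assumed to refer directly to the global correlation matrix, whereas a full implementation would map each subgraph's local indices back to global ones:

```python
def merge_and_verify(sub_results, M, th):
    # Step E sketch: union the per-subgraph results, then re-check every
    # matched pair against the threshold and demote over-threshold
    # (forced) pairs to unmatched.
    matches, um_objs, um_trks = [], [], []
    for pairs, uo, ut in sub_results:
        matches += pairs
        um_objs += uo
        um_trks += ut
    verified = []
    for i, j in matches:
        if M[i][j] <= th:
            verified.append((i, j))
        else:                        # forced match: break it up
            um_objs.append(i)
            um_trks.append(j)
    return verified, um_objs, um_trks

M = [[1.0, 0.0],
     [0.0, 50.0]]
sub_results = [([(0, 0), (1, 1)], [], [])]
print(merge_and_verify(sub_results, M, th=10.0))  # ([(0, 0)], [1], [1])
```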
As can be seen from the above description, the present invention has the following advantageous effects: for the target and track sets identified and tracked separately by the camera and the millimeter wave radar, a target tracking matching method based on improved Hungarian matching and Kalman filtering is designed, which improves the accuracy of the Hungarian matching algorithm; decomposing and reducing the matrix dimension increases the Hungarian matching speed; the fusion matching effect is stable, and the method is applicable to the fusion matching of various sensors.
It should be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element.
The above description is only for the purpose of illustrating the technical solutions of the present invention and not for the purpose of limiting the same, and other modifications or equivalent substitutions made by those skilled in the art to the technical solutions of the present invention should be covered within the scope of the claims of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A target tracking matching method based on an improved Hungarian algorithm with fusion of a camera and a millimeter wave radar, characterized in that: it comprises the sensors adopted by the system, the target sets sent to the fusion center by the sensors adopted by the system being Φ_radar,refr and Φ_camera,refc, where the subscript radar indicates that the sensor is the millimeter wave radar and the subscript refr indicates that the reference coordinate system is that of the millimeter wave radar;
the target tracking matching method comprises the following steps:
s1, carrying out time registration on multi-sensor data;
s2, carrying out spatial registration on the multi-sensor data;
s3, matching a sensor target with a system track;
the sensor target-system track matching comprises the following steps:
A. screening out matched targets: when a single sensor locally detects and tracks targets, it maintains a local track set and assigns each track a track ID to distinguish which target is tracked; after the target set is sent to the fusion center, the IDs are matched against the target track set maintained by the fusion center, and if the same ID exists, it is the same tracked target and is matched directly;
B. calculating a correlation matrix M: calculating the distance similarity between each target in the unmatched target set and each track in the unmatched system tracks, and filling the distance similarity to the corresponding position of the incidence matrix, wherein the distance similarity is calculated in the following way:
d = p_x(x_s - x_c)^2 + p_y(y_s - y_c)^2 + p_z(z_s - z_c)^2 + p_v(v_s - v_c)^2 + p_φ(φ_s - φ_c)^2
C. the maximum connected subgraph segmentation comprises the following steps:
step one, carrying out binarization on the incidence matrix to form a new graph G:
step two, subgraph segmentation: adopting a data-structure method, using depth-first search with backtracking or breadth-first search, the maximum connected subgraphs and the nodes they contain can be separated and recorded; a new incidence matrix is formed for each subgraph, the value at each position being taken from the initial incidence matrix M and representing the distance similarity between a target and a system track, giving the incidence matrix set Ms = {M_1, M_2, ..., M_n}, where n is the number of segmented connected subgraphs;
D. matching in a subgraph;
E. and (6) sorting and checking.
2. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 1, is characterized in that: in the new graph matrix of the new graph G in the maximum connected subgraph segmentation step, the value of each element in the new graph matrix is as follows:
G_i,j = 1, if M_i,j ≤ th_sensor1; G_i,j = 0, otherwise
i = 0, 1, 2, ..., size(unassignedobjs)-1, j = 0, 1, 2, ..., size(unassignedtracks)-1.
3. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 2, characterized in that: in the new graph matrix in the S3 maximum connected subgraph segmentation, M_i,j represents the distance similarity at the corresponding position of the incidence matrix, th_sensor1 represents the threshold set when sensor 1 is matched with the system tracks, size(unassignedobjs) represents the number of unmatched targets, i.e. the number of rows of the incidence matrix, and size(unassignedtracks) represents the number of unmatched system tracks, i.e. the number of columns; a value of 1 indicates that the two nodes in the graph are connected, and 0 indicates that they are not connected.
4. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 1, characterized in that: the temporal registration in S1 is: taking the newer timestamp t of the camera and millimeter-wave target sets as the prediction time, i.e.
t = max(t_camera, t_radar);
then the targets whose timestamp is earlier than the prediction time are predicted to that time using the prediction step of Kalman filtering with a constant-velocity model; the prediction step of Kalman filtering is as follows:
the system state equation is:
x_k = A x_{k-1} + B u_{k-1} + w_{k-1}
the prediction step then estimates the state at the current time (time k) from the posterior estimate at the previous time (time k−1), giving the prior estimate at time k:
x̂_k^- = A x̂_{k-1} + B u_{k-1}
P_k^- = A P_{k-1} A^T + Q
thereby implementing time synchronization by means of the Kalman prediction step.
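The temporal registration and prediction step above can be sketched as a minimal 1-D constant-velocity example; all numbers, matrices and function names below are illustrative assumptions, not taken from the patent:

```python
def matmul(A, B):
    """Row-by-column product of two small matrices (lists of lists)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matadd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def transpose(A):
    return [list(r) for r in zip(*A)]

def kf_predict(x, P, A, Q):
    """Kalman prediction step: x_k^- = A x_{k-1};  P_k^- = A P A^T + Q.
    (A constant-velocity model has no control input, so B u = 0.)"""
    return matmul(A, x), matadd(matmul(matmul(A, P), transpose(A)), Q)

# Temporal registration: predict the older target forward to the newer stamp.
t_cam, t_radar = 10.00, 10.04
t_pred = max(t_cam, t_radar)
dt = t_pred - t_cam                      # camera frame is ~40 ms older
A = [[1.0, dt], [0.0, 1.0]]              # 1-D constant-velocity transition
Q = [[1e-4, 0.0], [0.0, 1e-4]]           # process noise (illustrative)
x = [[5.0], [2.0]]                       # position 5 m, velocity 2 m/s
P = [[0.1, 0.0], [0.0, 0.1]]
x_pred, P_pred = kf_predict(x, P, A, Q)
print(x_pred)                            # ≈ [[5.08], [2.0]]
```

After this step both sensors' targets refer to the same instant, so the subsequent incidence matrix compares like with like.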
5. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 1, characterized in that: the spatial registration of S2 is: coordinate transformation is applied to the targets sent to the fusion center by the different sensors, transforming each sensor's data into the reference frame of the fusion center's system tracks:
[x_t, y_t, z_t, 1]^T = [[R, t], [0, 1]] · [x_s, y_s, z_s, 1]^T
where [x_s, y_s, z_s, 1]^T denotes the homogeneous coordinates formed from the target's coordinates in the sensor coordinate system, [x_t, y_t, z_t, 1]^T denotes the homogeneous coordinates after transformation to the system track coordinate frame, R denotes the rotation matrix from sensor coordinates to system track coordinates, and t denotes the translation from sensor coordinates to system track coordinates; both are determined by the mounting position of the sensor hardware.
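A minimal sketch of this homogeneous transformation; the mounting geometry (a 90° yaw and a 1.5 m offset) is a hypothetical example, since the actual R and t depend on the sensor installation:

```python
import math

def to_track_frame(p_sensor, R, t):
    """Apply [x_t, y_t, z_t, 1]^T = [[R, t], [0, 1]] [x_s, y_s, z_s, 1]^T,
    i.e. rotate the sensor-frame point by R and shift it by t."""
    return tuple(sum(R[i][j] * p_sensor[j] for j in range(3)) + t[i]
                 for i in range(3))

# Hypothetical mounting: sensor yawed 90 deg about z, 1.5 m ahead of the
# system-track origin (R and t come from sensor installation, per claim 5).
a = math.pi / 2
R = [[math.cos(a), -math.sin(a), 0.0],
     [math.sin(a),  math.cos(a), 0.0],
     [0.0,          0.0,         1.0]]
t = [1.5, 0.0, 0.0]
print(to_track_frame((1.0, 0.0, 0.0), R, t))  # ≈ (1.5, 1.0, 0.0)
```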
6. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 1, characterized in that: in step S3, the pseudo code for filtering out matched targets during track matching is as follows:
(pseudo code given as figures in the original publication; not reproduced in the text)
the successfully matched target–track pairs, the unmatched targets, and the unmatched tracks are then collected.
7. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 1, characterized in that: in the calculation of the incidence matrix M during track matching in S3, x_s, y_s, z_s, v_s, φ_s are respectively the coordinates, speed and azimuth angle of the sensor target in the system track coordinate system after the prediction step; x_c, y_c, z_c, v_c, φ_c are respectively the coordinates, speed and azimuth angle saved for the system track after the prediction step; p_x, p_y, p_z, p_v, p_φ are respectively the weight coefficients of the corresponding distance-similarity components;
when the obtained distance similarity d is larger than the set threshold, d is set directly to the maximum value of its data type, marking the pair as unmatchable.
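The distance-similarity computation of this claim might be sketched as follows. The exact combination of the weighted components is not reproduced in this excerpt, so the weighted sum of absolute differences below, the sentinel choice, and every numeric value are assumptions for illustration:

```python
BIG = float("inf")  # stands in for "maximum value of the data type"

def distance_similarity(obj, trk, w, th):
    """Weighted sum of absolute differences over (x, y, z, v, phi).
    Values above the threshold are forced to the sentinel so the pair
    can never be chosen (claim 7's 'unmatchable' mark).  The weighted-sum
    form itself is an assumption, not quoted from the patent."""
    d = sum(wi * abs(a - b) for wi, a, b in zip(w, obj, trk))
    return BIG if d > th else d

obj = (10.0, 2.0, 0.0, 5.0, 0.10)   # x, y, z, v, azimuth after prediction
trk = (10.5, 2.2, 0.0, 4.8, 0.12)   # state saved for the system track
w   = (1.0, 1.0, 1.0, 0.5, 2.0)     # p_x, p_y, p_z, p_v, p_phi (illustrative)
print(distance_similarity(obj, trk, w, th=3.0))   # ≈ 0.84
```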
8. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 1, characterized in that: the within-subgraph matching of S4 matches the interior of each connected subgraph using the Hungarian matching method, finally obtaining: the target–track matching pairs Φ_match,i within the subgraph, the unmatched targets Φ_umobjs,i within the subgraph, and the unmatched tracks Φ_umtracks,i within the subgraph.
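A sketch of the within-subgraph assignment: brute force over permutations stands in for the Hungarian algorithm (adequate for the small subgraphs the segmentation step produces; a production version would use the O(n³) Kuhn–Munkres method, e.g. SciPy's `linear_sum_assignment`). Function names and costs are illustrative:

```python
from itertools import permutations

def match_subgraph(cost):
    """Minimum-cost target-to-track assignment inside one connected
    subgraph (assumes no more targets than tracks).  Returns the matched
    pairs and the track indices left unmatched."""
    m, n = len(cost), len(cost[0])
    best_pairs, best_cost = [], float("inf")
    for perm in permutations(range(n), m):       # distinct track per target
        total = sum(cost[i][j] for i, j in enumerate(perm))
        if total < best_cost:
            best_cost, best_pairs = total, list(enumerate(perm))
    matched = {j for _, j in best_pairs}
    um_tracks = [j for j in range(n) if j not in matched]
    return best_pairs, um_tracks

# One subgraph's sub-matrix of distance similarities (illustrative).
cost = [[0.5, 2.0, 1.0],
        [1.8, 0.4, 2.5]]
pairs, um_tracks = match_subgraph(cost)
print(pairs, um_tracks)   # [(0, 0), (1, 1)] [2]
```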
9. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 1, characterized in that: the sorting and checking of S5 comprises: collecting the matching results of each subgraph matched in S4 to obtain the target–track matching pairs, the unmatched targets, and the unmatched tracks as follows:
Φ_match = Φ_match,1 ∪ Φ_match,2 ∪ ... ∪ Φ_match,n
Φ_umobjs = Φ_umobjs,1 ∪ Φ_umobjs,2 ∪ ... ∪ Φ_umobjs,n
Φ_umtracks = Φ_umtracks,1 ∪ Φ_umtracks,2 ∪ ... ∪ Φ_umtracks,n
where ∪ denotes the set union and n is the number of subgraphs;
for each matching pair in Φ_match, verification is carried out with the set threshold; if the distance exceeds the threshold, the pair is changed to unmatched, and the target and the track of that matching pair are added to Φ_umobjs and Φ_umtracks respectively.
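The collection and threshold re-check of S5 might be sketched as below; the function name, the global cost lookup, and all values are hypothetical:

```python
def collect_and_check(results, cost, th):
    """Union the per-subgraph results (phi_match = U phi_match_i, etc.)
    and re-verify each pair against the threshold; pairs that fail are
    demoted to an unmatched target plus an unmatched track."""
    match, um_objs, um_tracks = [], [], []
    for pairs_i, objs_i, tracks_i in results:
        match += pairs_i
        um_objs += objs_i
        um_tracks += tracks_i
    verified = []
    for i, j in match:
        if cost[i][j] > th:          # failed the check: demote the pair
            um_objs.append(i)
            um_tracks.append(j)
        else:
            verified.append((i, j))
    return verified, um_objs, um_tracks

# Two subgraphs' results: (pairs, unmatched targets, unmatched tracks).
results = [([(0, 0)], [], []), ([(1, 1)], [2], [2])]
cost = [[0.5, 9.0, 9.0],
        [9.0, 4.0, 9.0],
        [9.0, 9.0, 9.0]]
verified, um_objs, um_tracks = collect_and_check(results, cost, th=3.0)
print(verified, um_objs, um_tracks)   # [(0, 0)] [2, 1] [2, 1]
```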
10. The improved Hungarian algorithm target tracking matching method based on the fusion of the camera and the millimeter wave radar as claimed in claim 9, characterized in that: the pairing result between the sensor target set and the system track set comprises the matched-pair set Φ_match, the unmatched targets Φ_umobjs, and the unmatched tracks Φ_umtracks.
CN202210988217.2A 2022-08-17 2022-08-17 Improved Hungarian algorithm target tracking matching method based on fusion of camera and millimeter wave radar Pending CN115390066A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210988217.2A CN115390066A (en) 2022-08-17 2022-08-17 Improved Hungarian algorithm target tracking matching method based on fusion of camera and millimeter wave radar


Publications (1)

Publication Number Publication Date
CN115390066A true CN115390066A (en) 2022-11-25

Family

ID=84120639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210988217.2A Pending CN115390066A (en) 2022-08-17 2022-08-17 Improved Hungarian algorithm target tracking matching method based on fusion of camera and millimeter wave radar

Country Status (1)

Country Link
CN (1) CN115390066A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240004915A1 (en) * 2022-06-29 2024-01-04 Microsoft Technology Licensing, Llc Ontology customization for indexing digital content

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110675431A (en) * 2019-10-08 2020-01-10 中国人民解放军军事科学院国防科技创新研究院 Three-dimensional multi-target tracking method fusing image and laser point cloud
CN111505624A (en) * 2020-04-30 2020-08-07 中国汽车工程研究院股份有限公司 Environment sensing method based on machine vision and millimeter wave radar data fusion
CN114035187A (en) * 2021-10-26 2022-02-11 北京国家新能源汽车技术创新中心有限公司 Perception fusion method of automatic driving system
WO2022116375A1 (en) * 2020-12-01 2022-06-09 中国人民解放军海军航空大学 Method for performing tracking-before-detecting on multiple weak targets by high-resolution sensor


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Huanan et al.: "Multi-target tracking algorithm combining Hungarian assignment and improved particle filter", Telecommunication Engineering, no. 05, 28 May 2019 (2019-05-28) *


Similar Documents

Publication Publication Date Title
CN113379805B (en) Multi-information resource fusion processing method for traffic nodes
US9292750B2 (en) Method and apparatus for detecting traffic monitoring video
Kim et al. Robust lane detection based on convolutional neural network and random sample consensus
CN115372958A (en) Target detection and tracking method based on millimeter wave radar and monocular vision fusion
CN103064086A (en) Vehicle tracking method based on depth information
CN110753892A (en) Method and system for instant object tagging via cross-modality verification in autonomous vehicles
CN110869559A (en) Method and system for integrated global and distributed learning in autonomous vehicles
CN111222568A (en) Vehicle networking data fusion method and device
US11926318B2 (en) Systems and methods for detecting a vulnerable road user in an environment of a vehicle
JP2021026644A (en) Article detection apparatus, article detection method, and article-detecting computer program
Muresan et al. Multi-object tracking of 3D cuboids using aggregated features
CN116299500B (en) Laser SLAM positioning method and device integrating target detection and tracking
CN111612818A (en) Novel binocular vision multi-target tracking method and system
CN112598715A (en) Multi-sensor-based multi-target tracking method, system and computer readable medium
Lu et al. Fusion of camera-based vessel detection and ais for maritime surveillance
CN115390066A (en) Improved Hungarian algorithm target tracking matching method based on fusion of camera and millimeter wave radar
Meuter et al. 3D traffic sign tracking using a particle filter
Imad et al. Navigation system for autonomous vehicle: A survey
CN116953724A (en) Vehicle track tracking method, system and storage medium
CN115656962A (en) Method for identifying height-limited object based on millimeter wave radar
US20220405513A1 (en) Multi-Object Tracking For Autonomous Vehicles
CN115690721A (en) Method and equipment for detecting obstacles on route
CN114067552A (en) Pedestrian crossing track tracking and predicting method based on roadside laser radar
Lu et al. Target detection algorithm based on mmw radar and camera fusion
EP3770798A1 (en) Method and system for improving detection capabilities of machine learning-based driving assistance systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 201-116, Building 5, No. 8, Zhujiawan Street, Gusu District, Suzhou City, Jiangsu Province, 215000

Applicant after: Suzhou Guangmu Intelligent Technology Co.,Ltd.

Address before: Room 201-116, Building 5, No. 8, Zhujiawan Street, Gusu District, Suzhou City, Jiangsu Province, 215000

Applicant before: Suzhou Guangmu Automobile Technology Co.,Ltd.