CN110660084A - Multi-target tracking method and device - Google Patents
Multi-target tracking method and device
- Publication number
- CN110660084A CN110660084A CN201910943907.4A CN201910943907A CN110660084A CN 110660084 A CN110660084 A CN 110660084A CN 201910943907 A CN201910943907 A CN 201910943907A CN 110660084 A CN110660084 A CN 110660084A
- Authority
- CN
- China
- Prior art keywords
- target
- tracking
- matching degree
- track
- calculating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a multi-target tracking method and a multi-target tracking device. The invention belongs to the field of machine vision and multi-target tracking, and particularly relates to a method and a device for realizing multi-target tracking in a video scene by using image analysis and target motion trajectory prediction techniques. The method comprises the following steps: predicting the target track, calculating the target track matching degree, extracting tracking target features, calculating the target feature matching degree, and matching targets to obtain target IDs. The method effectively improves the accuracy of target tracking when targets are occluded, targets are too dense, the inter-frame skip is too large, or targets move fast, and reduces the target ID switching phenomenon.
Description
Technical Field
The invention belongs to the field of machine vision and multi-target tracking, and particularly relates to a method and a device for realizing multi-target tracking by using an image analysis technology and a target motion trajectory prediction technology.
Background
The main tasks of multi-target tracking are to locate multiple targets simultaneously in a given video, maintain the ID of each target, and record the track of each target. Multi-target tracking must address the following problems: determining the number of tracked targets, maintaining each target's ID, handling frequent occlusion, and handling targets with similar appearance. Concretely, a multi-target tracking method must measure the features of all targets in a frame, judge from those features whether targets in different frames are the same, determine whether any target has departed and whether any target has newly appeared, assign consistent IDs to the same targets, delete the IDs of departed targets, and add IDs for newly appeared targets.
The currently mature multi-target tracking technique is a data-association approach: the Mahalanobis distance between the feature vectors of objects in two frames is calculated, and the pair with the shortest distance is considered the same object. This method is simple and matches quickly, but its drawback is that tracking fails under conditions such as target occlusion, overly dense targets, large inter-frame skips, and fast target motion.
The invention solves the problems by using a target track prediction technology and a target characteristic weighting matching technology, and improves the accuracy of target tracking in a complex scene.
Disclosure of Invention
The invention aims to remedy the shortcomings of multi-target tracking in complex scenes. It provides a method and a device that realize multi-target tracking in complex scenes through target track prediction, target matching, target feature extraction, inter-frame target feature comparison, and a weighted matching strategy, with the goals of improving tracking accuracy when the scene contains many targets, overlapping targets, targets with similar features, or fast-moving targets, and of reducing the target ID switching phenomenon.
The invention is realized by adopting the following technical scheme:
The method introduces a target motion track prediction model that describes the target's motion track with a multi-dimensional state-space model. Using the track prediction model, the state (position) of the tracked target at the next time (time k) is predicted from its state (position) at the current time (time k-1), and the target's motion track is updated.
The method introduces a target track matching degree to describe how well the tracked target at the current time matches the target's predicted track. The target track matching degree is the distance between the tracked target's actual position at the current time and the target's predicted track.
The invention introduces a target feature extraction model, and uses the model to calculate and extract the feature of each tracked target in each frame.
The method introduces a target feature matching degree to describe how well the features of the tracked target at the current time (time k) match those at the previous time (time k-1). The target feature matching degree is the distance between the tracked target's current-time features and its previous-time features.
The invention introduces a feature association model whose output is the target association matching degree: the model takes the target track matching degree and the target feature matching degree as input parameters and computes the target association matching degree.
The method introduces a target association matching degree threshold. When the association matching degree between two targets is above the threshold, they are considered effectively matched and proceed to the next calculation model; when it is below the threshold, the two targets are considered unmatched.
The invention introduces a target departure time and a target departure time threshold. The target departure time is the time elapsed from the target's last successful match to the current time. When the target departure time exceeds the target departure time threshold, the target is considered to have departed.
A method of target tracking, comprising:
1. predicting target trajectory
Using the track prediction model, predict the track position of the tracked target at the next state (time k) from its current state (time k-1), and update the motion track.
2. Calculating the matching degree of the target track
Calculate the target track matching degree between the target's real position at the current time (time k) and the predicted track.
3. Extracting tracking target features
And extracting the characteristic information of the tracking target at the current moment.
4. Calculating the matching degree of the target characteristics
Calculate the target feature matching degree between the target's current-time features and its previous-time features.
5. Object matching and obtaining object ID
Taking the target track matching degree and the target feature matching degree as input, compute the target association matching degree with the feature association model; using the association matching degree as the cost, perform optimal target matching with the matching model to obtain the ID of each target at the current time.
In the matching process, a suitable target association matching degree threshold is set; targets whose association matching degree falls below the threshold remain unmatched. If an unmatched target belongs to the current time, it is a target newly appearing relative to the previous time and is called a newly added target; if it belongs to the previous time, it is a target that has departed by the current time and is called a departed target. A newly added target may start a new track: it is observed over the next several moments, and if it is matched continuously, a new track (new target) is considered to have been created and a new ID is assigned to it. Each track records its target departure time; when this value exceeds the target departure time threshold, the track is considered long unmatched, the tracked target is eliminated, and the track and target ID are deleted.
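As an illustration only, the track lifecycle described above (confirming a newly added target after several consecutive matches, and deleting a track once its departure time exceeds the threshold) can be sketched as follows; the names `Track`, `n_init`, and `max_age` and their default values are assumptions of this sketch, not terms fixed by the invention:

```python
class Track:
    """Minimal track record for the lifecycle policy described above.

    A new track must match in `n_init` consecutive frames before it is
    confirmed and keeps a permanent ID; a track unmatched for more than
    `max_age` frames is considered departed and deleted.
    """

    def __init__(self, track_id, n_init=3, max_age=30):
        self.track_id = track_id
        self.n_init = n_init          # consecutive hits needed to confirm
        self.max_age = max_age        # departure-time threshold (frames)
        self.hits = 1                 # consecutive successful matches
        self.time_since_update = 0    # target departure time, in frames
        self.confirmed = False

    def mark_matched(self):
        self.time_since_update = 0
        self.hits += 1
        if self.hits >= self.n_init:
            self.confirmed = True

    def mark_missed(self):
        self.hits = 0                 # a miss breaks the confirmation streak
        self.time_since_update += 1

    def is_departed(self):
        return self.time_since_update > self.max_age
```

A tracker would call `mark_matched` or `mark_missed` on each track once per frame after the matching step, creating tentative tracks for unmatched detections and dropping tracks for which `is_departed()` returns true.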
Even when targets have highly similar motion and feature information, the method matches target IDs accurately and substantially reduces the ID switching phenomenon.
An apparatus for target tracking, comprising:
1. target trajectory prediction module
Using the track prediction model, predict the track position of the tracked target at the next state (time k) from its current state (time k-1), and update the motion track.
2. Module for calculating target track matching degree
Calculate the target track matching degree between the target's real position at the current time (time k) and the predicted track.
3. Module for extracting tracking target characteristics
And extracting the characteristic information of the tracking target at the current moment.
4. Module for calculating target feature matching degree
Calculate the target feature matching degree between the target's current-time features and its previous-time features.
5. Object matching and object ID obtaining module
Taking the target track matching degree and the target feature matching degree as input, compute the target association matching degree with the feature association model; using the association matching degree as the cost, perform optimal target matching with the matching model to obtain the ID of each target at the current time.
Even when targets have highly similar motion and feature information, the device matches target IDs accurately and substantially reduces the ID switching phenomenon.
Drawings
Fig. 1 is a schematic flowchart illustrating a multi-target tracking method according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a multi-target tracking apparatus according to an embodiment of the present invention.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
This embodiment discloses a method for tracking multiple people based on the above multi-target tracking method; referring to Fig. 1, the method includes:
step S101, predicting a target motion track
An 8-dimensional state vector (x, y, r, h, x′, y′, r′, h′) describes the motion state of a person at a given moment: x, y, r, h denote the horizontal coordinate and vertical coordinate of the center of the person's torso bounding box, its aspect ratio, and its height, while x′, y′, r′, h′ are the increments of x, y, r, h and describe the person's corresponding velocity in image coordinates. The trajectory is then predicted and updated using a Kalman filter. The Kalman filter adopts a constant-velocity motion model and a linear observation model; the observed variables are (x, y, r, h).
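As a non-authoritative sketch of this prediction step, a constant-velocity Kalman prediction over the 8-dimensional state might look like the following; the process-noise magnitude and the unit time step are assumptions of the sketch, not values specified by the embodiment:

```python
import numpy as np

# Constant-velocity Kalman prediction over the 8-dimensional state
# (x, y, r, h, x', y', r', h').  F advances each position component by its
# velocity over one time step; H observes only (x, y, r, h).  The process
# noise Q and the unit time step dt are illustrative assumptions.
ndim, dt = 4, 1.0
F = np.eye(2 * ndim)
for i in range(ndim):
    F[i, ndim + i] = dt            # position += velocity * dt
H = np.eye(ndim, 2 * ndim)         # linear observation model
Q = np.eye(2 * ndim) * 1e-2        # assumed process-noise covariance

def predict(mean, cov):
    """One prediction step: state at time k-1 -> predicted state at time k."""
    return F @ mean, F @ cov @ F.T + Q

# Example: a person at (100, 50) moving 2 px right and 1 px up per frame.
mean = np.array([100.0, 50.0, 0.5, 80.0, 2.0, -1.0, 0.0, 0.0])
cov = np.eye(8)
mean, cov = predict(mean, cov)     # predicted x = 102, y = 49
```

The update step (correcting the prediction with the matched detection) follows the standard Kalman equations and is omitted here for brevity.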
Step S102, calculating the matching degree of the target track
Calculate the Mahalanobis distance between the position of the person predicted by the Kalman filter and the person's actual position at the current time to obtain the person's target track matching degree: d(1)(i, j) = (d_j − y_i)^T S_i^{-1} (d_j − y_i) represents the motion matching degree between the jth person and the ith track at the current time, where S_i is the covariance matrix of the track's predicted observation in observation space at the current time, y_i is the track's predicted observation at the current time, and d_j is the state (x, y, r, h) of the jth person at the current time.
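A minimal sketch of this computation, using the squared Mahalanobis form; the variable names mirror the symbols above, and the example numbers are invented:

```python
import numpy as np

def track_match_degree(d_j, y_i, S_i):
    """Squared Mahalanobis distance between the jth person's state (x, y, r, h)
    and the ith track's predicted observation, as described in step S102."""
    diff = d_j - y_i
    return float(diff @ np.linalg.inv(S_i) @ diff)

# Invented example values: a diagonal predicted-observation covariance S_i,
# the track's predicted observation y_i, and a detection d_j.
S_i = np.diag([4.0, 4.0, 0.01, 9.0])
y_i = np.array([102.0, 49.0, 0.5, 80.0])
d_j = np.array([104.0, 49.0, 0.5, 83.0])
dist = track_match_degree(d_j, y_i, S_i)  # (2^2)/4 + 0 + 0 + (3^2)/9 = 2.0
```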
Step S103, extracting tracking target characteristics
Extract and compute, in the HSV color space, the color histogram of each person's torso-box image in the current frame, and use it as the tracking target feature.
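A hedged sketch of such a feature extractor using NumPy; the bin counts, channel value ranges, and normalization are assumptions of this sketch (the embodiment does not fix them), and a real implementation might instead use OpenCV's `calcHist`:

```python
import numpy as np

def hsv_histogram(hsv_patch, bins=(8, 8, 8)):
    """Normalized 3-D color histogram of a torso-box image patch in HSV space.
    `hsv_patch` is an (H, W, 3) array with hue in [0, 180) and saturation and
    value in [0, 256), following OpenCV's HSV convention (an assumption here)."""
    h = hsv_patch[..., 0].ravel()
    s = hsv_patch[..., 1].ravel()
    v = hsv_patch[..., 2].ravel()
    hist, _ = np.histogramdd((h, s, v), bins=bins,
                             range=((0, 180), (0, 256), (0, 256)))
    return (hist / hist.sum()).ravel()  # normalize so the bins sum to 1

# Synthetic patch standing in for a person's torso box in the current frame.
patch = np.random.default_rng(0).uniform(0, 180, size=(32, 16, 3))
feature = hsv_histogram(patch)  # 8*8*8 = 512-dimensional feature vector
```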
Step S104, calculating the matching degree of the target characteristics
Calculate the target feature matching degree between the person's current-time features and previous-time features. The target feature matching degree is the Bhattacharyya distance between the target features of the current frame and the previous frame: d(2)(i, j) denotes the target feature matching degree between the jth person of the current frame and the ith track.
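For illustration, the Bhattacharyya distance between two normalized histograms can be computed as follows; the small epsilon guarding the logarithm is an assumption of this sketch:

```python
import numpy as np

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two normalized histograms.
    Zero for identical distributions, growing as they diverge."""
    bc = np.sum(np.sqrt(p * q))      # Bhattacharyya coefficient in [0, 1]
    return -np.log(max(bc, 1e-12))   # epsilon guards against log(0)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.5, 0.3, 0.2])
# Identical histograms -> coefficient 1 -> distance 0.
```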
Step S105, weighted matching and target ID obtaining
Match people to tracks using a track assignment strategy. The weighted average of the target track matching degree and the target feature matching degree serves as the target association matching degree used as the measure for target ID assignment, c(i, j) = λ d(1)(i, j) + (1 − λ) d(2)(i, j), and a cost matrix is generated accordingly.
Then perform matching with the Hungarian algorithm to obtain the ID of each person at the current time. During matching, a person who is not successfully matched is regarded as possibly starting a new track: the next several frames are observed, and if the person is matched continuously, a new track is considered created and a new ID is assigned to the target. In addition, each track keeps a value recording the time since its last match; when this value exceeds the threshold, the track is considered long unmatched, and the track and its target ID are deleted.
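The weighted cost matrix and optimal assignment of step S105 can be sketched as follows; the weight `lam`, the gating threshold, and the example matching degrees are invented for illustration, and `scipy.optimize.linear_sum_assignment` is used as an optimal-assignment solver that produces the same result as the Hungarian algorithm:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Blend the track matching degree d1 and the feature matching degree d2 into
# an association cost.  The weight lam, the gating threshold, and the example
# matrices (2 tracks x 2 people) are invented for illustration.
lam = 0.5
d1 = np.array([[1.0, 9.0],
               [8.0, 2.0]])        # target track matching degrees
d2 = np.array([[0.2, 0.9],
               [0.8, 0.1]])        # target feature matching degrees
cost = lam * d1 + (1 - lam) * d2   # weighted-average association cost matrix

rows, cols = linear_sum_assignment(cost)  # minimum-cost optimal assignment
matches = [(i, j) for i, j in zip(rows, cols) if cost[i, j] < 3.0]  # gate
# Result: track 0 -> person 0, track 1 -> person 1.
```

Pairs whose cost survives the gate keep their track IDs; ungated rows and columns become the unmatched tracks and new detections handled by the track-lifecycle logic described above.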
This embodiment does not limit the feature distance model; preferred options include the Mahalanobis distance, Euclidean distance, cosine distance, Chebyshev distance, normalized Euclidean distance, and Hamming distance.
This embodiment does not limit the trajectory prediction model; preferred options include the Kalman filter, particle filter, and mean-shift methods.
This embodiment does not limit the target association matching model; preferred options include the Hungarian method, Markov chain Monte Carlo, and greedy matching.
This embodiment does not limit the related calculation models; unless otherwise claimed, the specific implementation models do not limit the technical solution of this embodiment and should be understood as examples that help those skilled in the art understand the technical solution.
This embodiment also discloses a device for tracking multiple people based on the above multi-target tracking device; referring to Fig. 2, the device includes:
module S201, predicting target motion trail module
The person movement locus is predicted and updated using the method of predicting the target movement locus in step S101 of the above embodiment.
Module S202, module for calculating target track matching degree
The target track matching degree of the tracked person is calculated by using the method for calculating the target track matching degree in step S102 of the above embodiment.
Module S203, extracting tracking target characteristic module
The method for extracting the tracking target features in step S103 of the above embodiment is used to extract the target features of the tracked person.
Module S204, module for calculating target feature matching degree
The target feature matching degree of the tracked person is calculated using the method of step S104 of the above embodiment.
Module S205, weighted matching and target ID obtaining module
The tracked person's ID is obtained using the weighted matching and target ID obtaining method of step S105 of the above embodiment.
This embodiment does not limit the type of tracking target; unless otherwise claimed, the specific type of tracking target does not limit the technical solution of this embodiment and should be understood as an example that helps those skilled in the art understand the technical solution.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as methods and apparatus. The present invention has been described with reference to flowchart illustrations and structural schematic illustrations of methods and apparatus according to embodiments of the invention.
It should be understood that the above examples are only for clearly illustrating the invention and are not intended to limit the embodiments. Other variations and modifications will be apparent to those skilled in the art in light of the above description; it is neither necessary nor possible to list all embodiments exhaustively. Obvious variations or modifications derived therefrom remain within the scope of the invention.
Claims (7)
1. A multi-target tracking method is characterized in that: predicting a target track, calculating a target track matching degree, extracting tracking target features, calculating a target feature matching degree, matching the target and obtaining a target ID.
2. The multi-target tracking method according to claim 1, wherein predicting the target track comprises using a track prediction model to predict the track position of the tracking target at the next state (time k) from its current state (time k-1) and updating the motion track.
3. The multi-target tracking method according to claim 1, wherein calculating the target track matching degree comprises calculating the distance between the target's real position at the current time (time k) and the predicted track.
4. The multi-target tracking method according to claim 1, wherein the extracting of the tracking target feature is extracting feature information of the tracking target at the current time.
5. The method of claim 1, wherein the calculating the target feature matching degree is calculating a distance between a current time feature of the target and a previous time feature of the target.
6. The multi-target tracking method according to claim 1, wherein the target matching and target ID obtaining are performed by taking a target track matching degree and a target feature matching degree as input, calculating a target association matching degree by using a target feature association model, taking the target association matching degree as a cost model, and performing optimal target matching by using a matching model to obtain the ID of each target at the current moment.
7. A multi-target tracking device is characterized in that: the system comprises a target track predicting module, a target track matching degree calculating module, a tracking target feature extracting module, a target feature matching degree calculating module and a target matching and target ID obtaining module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910943907.4A CN110660084A (en) | 2019-09-30 | 2019-09-30 | Multi-target tracking method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110660084A true CN110660084A (en) | 2020-01-07 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111652150A (en) * | 2020-06-04 | 2020-09-11 | 北京环境特性研究所 | Infrared anti-interference tracking method |
CN112184769A (en) * | 2020-09-27 | 2021-01-05 | 上海高德威智能交通系统有限公司 | Tracking abnormity identification method, device and equipment |
CN112802067A (en) * | 2021-01-26 | 2021-05-14 | 深圳市普汇智联科技有限公司 | Multi-target tracking method and system based on graph network |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107292911A (en) * | 2017-05-23 | 2017-10-24 | 南京邮电大学 | A kind of multi-object tracking method merged based on multi-model with data correlation |
CN107516321A (en) * | 2017-07-04 | 2017-12-26 | 深圳大学 | A kind of video multi-target tracking and device |
CN108765459A (en) * | 2018-04-18 | 2018-11-06 | 中国人民解放军国防科技大学 | Semi-online visual multi-target tracking method based on small trajectory graph association model |
CN109785368A (en) * | 2017-11-13 | 2019-05-21 | 腾讯科技(深圳)有限公司 | A kind of method for tracking target and device |
CN109785363A (en) * | 2018-12-29 | 2019-05-21 | 中国电子科技集团公司第五十二研究所 | A kind of unmanned plane video motion Small object real-time detection and tracking |
Non-Patent Citations (1)
Title |
---|
Chen Huiyan et al., "Theory and Application of Intelligent Vehicles", Beijing Institute of Technology Press, p. 157 *
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||