CN107516303A - Multi-object tracking method and system - Google Patents

Multi-object tracking method and system

Info

Publication number
CN107516303A
CN107516303A (application CN201710778477.6A)
Authority
CN
China
Prior art keywords
frame image
target
current frame
tracking
result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710778477.6A
Other languages
Chinese (zh)
Inventor
李航 (Li Hang)
陈志超 (Chen Zhichao)
周剑 (Zhou Jian)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Tongjia Youbo Technology Co Ltd
Original Assignee
Chengdu Tongjia Youbo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Tongjia Youbo Technology Co Ltd
Priority to CN201710778477.6A
Publication of CN107516303A
Legal status: Pending

Classifications

    • G06T5/70
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/223 Analysis of motion using block-matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Abstract

An embodiment of the present invention provides a multi-object tracking method and system, relating to the technical field of image processing. The method and system filter the frame preceding the current frame to predict the position at which each moving target in that previous frame appears in the current frame, thereby obtaining a tracking result; perform moving-object detection on the current frame to obtain a detection result; match and associate the detection result with the tracking result to obtain an association result; and determine the positions of the moving targets in the current frame according to the association result. The method and system are applicable to a wide range of scenes, are computationally efficient, consume few resources, and exhibit better stability and higher accuracy when tracking multiple targets in practice.

Description

Multi-object tracking method and system
Technical field
The present invention relates to the technical field of image processing, and in particular to a multi-object tracking method and system.
Background technology
Multi-object tracking is a research hotspot in the field of computer vision. It refers to using a computer to determine, in a video sequence, the position and size of each independently moving target of interest that has certain salient visual features, together with the complete motion trajectory of each target. In recent years, with the rapid growth of computing power and the development of image analysis techniques, real-time object tracking has come to the fore; it has very important practical value in fields such as video surveillance, video compression coding, robot navigation and positioning, intelligent human-machine interaction, and virtual reality. Meanwhile, in the military domain, multi-object tracking has been used for aerial battlefield reconnaissance, terminal guidance of cruise missiles, ballistic-missile defense, maritime surveillance, battlefield surveillance, and other aspects.
Existing multi-object tracking methods suffer from high algorithmic complexity, low processing efficiency, and heavy resource consumption; in practice they exhibit poor stability and low accuracy when tracking multiple targets.
Summary of the invention
An object of the present invention is to provide a multi-object tracking method, to solve the problems of high computational complexity, low processing efficiency, and low tracking accuracy that exist in current multi-object tracking.
Another object of the present invention is to provide a multi-object tracking system, to solve the same problems of high computational complexity, low processing efficiency, and low tracking accuracy.
To achieve these goals, the technical solutions adopted by the embodiments of the present invention are as follows:
In a first aspect, an embodiment of the present invention provides a multi-object tracking method. The method includes: filtering the frame preceding the current frame to predict the position at which each moving target in that previous frame appears in the current frame, thereby obtaining a tracking result; performing moving-object detection on the current frame to obtain a detection result; matching and associating the detection result with the tracking result to obtain an association result; and determining the positions of the moving targets in the current frame according to the association result.
In a second aspect, an embodiment of the present invention further provides a multi-object tracking system. The system includes a tracking result acquisition module, a target detection module, a matching and association module, and a position determination module. The tracking result acquisition module filters the frame preceding the current frame to predict the position at which each moving target in that previous frame appears in the current frame, thereby obtaining a tracking result. The target detection module performs moving-object detection on the current frame to obtain a detection result. The matching and association module matches and associates the detection result with the tracking result to obtain an association result. The position determination module determines the positions of the moving targets in the current frame according to the association result.
Compared with the prior art, the present invention has the following advantages. The multi-object tracking method and system provided by the embodiments of the present invention filter the frame preceding the current frame to predict where each moving target in that previous frame appears in the current frame, thereby obtaining a tracking result; perform moving-object detection on the current frame to obtain a detection result; match and associate the detection result with the tracking result to obtain an association result; and determine the positions of the moving targets in the current frame according to the association result. The method and system are computationally efficient, consume few resources, and exhibit better stability and higher accuracy when tracking multiple targets in practice; they are suited both to a variety of surveillance environments and to scenes with high accuracy requirements.
To make the above objects, features, and advantages of the present invention clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly introduced below. It should be understood that the following drawings show only certain embodiments of the present invention and should therefore not be regarded as limiting its scope; from these drawings, those of ordinary skill in the art can derive other related drawings without creative effort.
Fig. 1 shows a block diagram of a user terminal to which embodiments of the present invention can be applied.
Fig. 2 shows a functional block diagram of the multi-object tracking system provided by the first embodiment of the present invention.
Fig. 3 shows a functional block diagram of the matching and association module in Fig. 2.
Fig. 4 shows a schematic flowchart of the multi-object tracking method provided by the second embodiment of the present invention.
Fig. 5 shows a detailed flowchart of step S103 in Fig. 4.
Reference numerals: 100 - user terminal; 110 - memory; 120 - storage controller; 130 - processor; 140 - peripheral interface; 150 - radio frequency unit; 160 - display unit; 170 - communication unit; 400 - multi-object tracking system; 410 - tracking result acquisition module; 420 - target detection module; 430 - matching and association module; 440 - position determination module; 431 - bipartite graph conversion module; 432 - matching module.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. The components of the embodiments, as generally described and illustrated in the drawings herein, can be arranged and designed in a variety of configurations. Therefore, the following detailed description of the embodiments provided in the drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings. In the description of the present invention, the terms "first", "second", and the like are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
Referring to Fig. 1, the multi-object tracking method and system provided by the embodiments of the present invention can be applied to the user terminal 100 shown in Fig. 1. In this embodiment, the user terminal 100 may be, but is not limited to, a personal computer (PC), a smartphone, a tablet computer, a personal digital assistant (PDA), a mobile Internet device (MID), or the like. The user terminal 100 includes a memory 110, a storage controller 120, a processor 130, a peripheral interface 140, a radio frequency unit 150, a display unit 160, and a communication unit 170.
The memory 110, storage controller 120, processor 130, peripheral interface 140, radio frequency unit 150, display unit 160, and communication unit 170 are electrically connected to one another, directly or indirectly, to enable the transmission of and interaction with data. For example, these elements may be electrically connected to one another through one or more communication buses or signal lines. In this embodiment, the multi-object tracking system 400 includes at least one software function module that can be stored in the memory 110 in the form of software or firmware, or solidified in the operating system (OS) of the user terminal 100. The processor 130 executes the executable modules stored in the memory 110, such as the software function modules and computer programs included in the multi-object tracking system 400.
The memory 110 may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or the like. The memory 110 is used to store software programs and modules; upon receiving an execution instruction, the processor 130 executes the corresponding program. Access to the memory 110 by the processor 130 and other possible components may be carried out under the control of the storage controller 120.
The processor 130 may be an integrated circuit chip with signal processing capability. It may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The peripheral interface 140 couples various input/output devices (such as the radio frequency unit 150 and the display unit 160) to the processor 130 and the memory 110. In some embodiments, the peripheral interface 140, the processor 130, and the storage controller 120 may be implemented in a single chip. In other examples, they may each be implemented by an independent chip.
The radio frequency unit 150 receives and transmits radio wave signals and performs mutual conversion between radio waves and electrical signals, thereby enabling wireless communication between the user terminal 100 and other devices (such as a web camera).
The display unit 160 provides an interactive interface so that the user of the user terminal 100 can observe video images. In this embodiment, the display unit 160 may be a liquid crystal display or a touch display; the latter may be a capacitive or resistive touch screen supporting single-point and multi-point touch operation. Supporting single-point and multi-point touch operation means that the touch display can sense touch operations occurring at one or more positions on it and hand the sensed touch operations to the processor 130 for calculation and processing.
The communication unit 170 establishes connections with other devices, thereby enabling communication between the user terminal 100 and those devices. For example, the communication unit 170 may connect to a network using the radio frequency signals transmitted by the radio frequency unit 150, and thereby establish communication connections with external devices.
It should be noted that the user terminal 100 may also be connected to other devices by wire to transmit data; this is not limited here.
It will be appreciated that the structure shown in Fig. 1 is only illustrative; the user terminal 100 may include more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1. Each component shown in Fig. 1 may be implemented in hardware, software, or a combination thereof.
First embodiment
Referring to Fig. 2, a functional block diagram of the multi-object tracking system 400 provided by the first embodiment of the present invention is shown. The multi-object tracking system 400 can be applied to the user terminal 100 described above, for tracking multiple targets in video frames to realize video surveillance, navigation, positioning, and the like. The multi-object tracking system 400 includes a tracking result acquisition module 410, a target detection module 420, a matching and association module 430, and a position determination module 440.
The tracking result acquisition module 410 filters the frame preceding the current frame to predict the position at which each moving target in that previous frame appears in the current frame, thereby obtaining a tracking result.
In this embodiment, for the current frame of an acquired video, the previous frame is filtered to predict the trajectories of the moving targets present in it, yielding the positions at which those targets appear in the current frame, i.e., the tracking result of each moving target of the previous frame in the current frame.
Specifically, the filtering algorithm used by the tracking result acquisition module 410 essentially comprises the following steps. (1) Establish the state equation y_{k+1} = f(y_k, k) + q_k and the observation equation z_{k+1} = h(y_{k+1}, k+1) + r_{k+1}, where y_k ∈ R^n is the state at time k, z_k ∈ R^m is the measured value at time k, q_k ~ N(0, Q_k) is Gaussian process noise, and r_k ~ N(0, R_k) is Gaussian measurement noise. (2) Initialize to obtain the initial state mean ŷ_0 = E[y_0] and covariance P_0 = E[(y_0 − ŷ_0)(y_0 − ŷ_0)^T]. (3) Apply the unscented transform (UT): from the state mean and covariance at time k, obtain the sigma-point matrix X_k, and propagate it through the established state equation to obtain the state variables Y_{k+1} = f(X_k, k). (4) From Y_{k+1} and the sigma-point weights, compute the predicted mean ŷ⁻_{k+1} = Σ_i w_i Y_{k+1,i} and predicted covariance P⁻_{k+1} = Σ_i w_i (Y_{k+1,i} − ŷ⁻_{k+1})(Y_{k+1,i} − ŷ⁻_{k+1})^T + Q_k; these are the mean and covariance of the predicted state. (5) Update the sigma points, i.e., regenerate the sigma-point matrix X_{k+1} from ŷ⁻_{k+1} and P⁻_{k+1}. (6) Pass the updated sigma points through the established observation equation to obtain the predicted measurements Z_{k+1} = h(X_{k+1}, k+1). (7) Compute the measurement mean ẑ_{k+1} = Σ_i W_i Z_{k+1,i}, the measurement covariance P_zz = Σ_i W_i (Z_{k+1,i} − ẑ_{k+1})(Z_{k+1,i} − ẑ_{k+1})^T + R_{k+1}, and the cross-covariance between the sigma points and the measurements, P_yz = Σ_i W_i (X_{k+1,i} − ŷ⁻_{k+1})(Z_{k+1,i} − ẑ_{k+1})^T, where w and W are weights. (8) Compute the filter gain K_{k+1} = P_yz P_zz^{−1}, the updated state mean ŷ_{k+1} = ŷ⁻_{k+1} + K_{k+1}(z_{k+1} − ẑ_{k+1}), and the updated covariance P_{k+1} = P⁻_{k+1} − K_{k+1} P_zz K_{k+1}^T. That is, for each moving target in the previous frame, the tracking result acquisition module 410 uses the above filtering algorithm to compute a predicted state via the state equation, then corrects and updates the predicted state with the observation obtained via the observation equation, thereby obtaining an accurate tracking result.
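The eight steps above correspond to one cycle of an unscented Kalman filter. The following numpy sketch illustrates a single predict-update iteration under those steps; it is not the patented implementation, and the function names, the fixed kappa scaling parameter, and all numeric values are assumptions for illustration.

```python
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Generate the 2n+1 sigma points of the unscented transform (UT),
    together with their weights w (step (3) of the algorithm above)."""
    n = mean.size
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    S = np.linalg.cholesky((n + kappa) * cov)   # matrix square root of (n+kappa)*P
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    return pts, w

def ukf_step(mean, cov, z, f, h, Q, R):
    """One predict-update cycle, following steps (3)-(8): propagate sigma
    points through f, regenerate them, predict the measurement through h,
    then correct the state with the filter gain K."""
    def weighted_cov(A, a, B, b, w):
        # weighted sum of outer products (A_k - a)(B_k - b)^T
        return (w[:, None, None] * np.einsum('ki,kj->kij', A - a, B - b)).sum(axis=0)

    pts, w = sigma_points(mean, cov)                 # (3) sigma points X_k
    Y = np.array([f(p) for p in pts])                # propagate through state eq.
    m_pred = w @ Y                                   # (4) predicted mean
    P_pred = weighted_cov(Y, m_pred, Y, m_pred, w) + Q
    pts2, w2 = sigma_points(m_pred, P_pred)          # (5) updated sigma points
    Z = np.array([h(p) for p in pts2])               # (6) predicted measurements
    z_pred = w2 @ Z                                  # (7) measurement mean
    Pzz = weighted_cov(Z, z_pred, Z, z_pred, w2) + R
    Pyz = weighted_cov(pts2, m_pred, Z, z_pred, w2)  # cross-covariance
    K = Pyz @ np.linalg.inv(Pzz)                     # (8) filter gain
    return m_pred + K @ (z - z_pred), P_pred - K @ Pzz @ K.T
```

A per-target state such as (x, y, vx, vy) with a constant-velocity f and a position-only h is one plausible instantiation; the patent leaves f and h unspecified.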
The target detection module 420 performs moving-object detection on the current frame to obtain a detection result. In this embodiment, a detection algorithm based on motion analysis, machine learning, or the like may be used to perform moving-object detection on the current frame and obtain the detection result. For example, the detection result may contain a bounding box for each moving target in the current frame, reflecting the position of the detected moving target within the current frame.
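The embodiment leaves the detector open ("motion analysis, machine learning, etc."). As one concrete stand-in under that umbrella, the following frame-differencing sketch produces bounding-box detection results of the kind described above; the threshold and minimum-area values are arbitrary assumptions, and a production detector would be more sophisticated.

```python
import numpy as np

def detect_moving_targets(prev_frame, cur_frame, thresh=25, min_area=9):
    """Frame-differencing detector: threshold |cur - prev| and return the
    bounding boxes (x0, y0, x1, y1) of connected changed regions, found by
    a BFS flood fill over the 4-connected foreground mask."""
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    mask = diff > thresh
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                stack, pix = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pix.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] \
                                and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if len(pix) >= min_area:  # suppress tiny noise regions
                    ys, xs = zip(*pix)
                    boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes
```

The returned boxes play the role of the detection result that is later associated with the tracking result.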
The matching and association module 430 matches and associates the detection result with the tracking result to obtain an association result.
In this embodiment, the matching and association module 430 performs data association on the tracking result and the detection result obtained above, so that the tracking result is matched and associated with the moving targets in the detection result, thereby producing the association result.
As shown in Fig. 3, in this embodiment the matching and association module 430 may specifically include a bipartite graph conversion module 431 and a matching module 432.
The bipartite graph conversion module 431 converts the detection result and the tracking result into the two vertex sets of a bipartite graph. Specifically, the detection result and the tracking result are placed in two disjoint vertex sets X and Y; no two vertices within the same set are connected, and every edge joins two vertices of which one belongs to X and the other to Y. A maximum matching of the bipartite graph is then obtained by searching for augmenting paths. For a bipartite graph G, let M be a subset of the edge set of G; if no two edges in M share a vertex, M is called a matching, and the matching containing the largest number of edges is the maximum matching of the bipartite graph.
The matching module 432 finds a match for each vertex in the vertex sets to obtain associated targets and unassociated targets. It will be appreciated that, in the bipartite graph obtained above, the matching module 432 matches the vertices corresponding to the detection result with the vertices corresponding to the tracking result. Specifically, this may include the following steps: (1) initialize a feasible vertex labeling, i.e., assign a label to each vertex in the bipartite graph; (2) find a match for each vertex using the augmenting-path theorem, i.e., match each vertex by searching for an augmenting path with respect to M, where the matching criterion is preferably spatial distance, i.e., the nearest-neighbour rule. Concretely, starting from an unmatched vertex A, follow an alternating path of non-matching edges, matching edges, non-matching edges, and so on; if the path reaches another unmatched vertex B (one not yet matched with any vertex), this alternating path is called an augmenting path, and vertex A is then considered matched with vertex B. When all vertices in the bipartite graph are matched, a perfect matching has been obtained. It should be noted that in practical applications new moving targets inevitably appear, and for reasons such as shadows, mutual occlusion between moving targets, and targets resembling the background, some vertices in the bipartite graph may fail to be matched. The association result therefore contains both associated targets and unassociated targets (those absent from a perfect matching): in this embodiment, an associated target is a matched vertex in the bipartite graph, and an unassociated target is an unmatched vertex.
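The augmenting-path search described above is the classic Kuhn algorithm for maximum bipartite matching. The sketch below is a generic illustration, not the patented code; in this setting `adj[u]` would list, for tracking-result vertex u, the detection vertices admissible under the nearest-neighbour distance gate, which is an assumption about how the edges are built.

```python
def max_bipartite_matching(adj, ny):
    """Kuhn's augmenting-path algorithm. adj[u] lists the Y-vertices that
    X-vertex u may be matched to; ny is the number of Y-vertices.
    Returns the matching size and match_y, where match_y[v] is the
    X-vertex matched to Y-vertex v (or -1 if v stays unmatched)."""
    match_y = [-1] * ny

    def try_augment(u, seen):
        # Walk an alternating path from unmatched vertex u; succeed if we
        # reach a free Y-vertex or can re-match a partner elsewhere.
        for v in adj[u]:
            if not seen[v]:
                seen[v] = True
                if match_y[v] == -1 or try_augment(match_y[v], seen):
                    match_y[v] = u
                    return True
        return False

    matched = 0
    for u in range(len(adj)):
        if try_augment(u, [False] * ny):
            matched += 1
    return matched, match_y
```

Vertices left with no partner after all augmenting paths are exhausted are exactly the "unassociated targets" of the association result.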
The position determination module 440 determines the positions of the moving targets in the current frame according to the association result.
In this embodiment, the position determination module 440 analyzes the association result obtained above and determines the positions of the moving targets in the current frame from the associated targets and the unassociated targets in it. This effectively mitigates tracking loss caused by uncertain factors such as occlusion and overlap between moving targets or targets resembling the background, and thus greatly improves the stability and accuracy of tracking.
Specifically, the position determination module 440 takes the tracking result corresponding to an associated target as the position of the corresponding moving target in the current frame. Because the data in the tracking result are more reliable, while the data in the detection result are prone to false detections, in this embodiment the tracking result corresponding to the associated target is used as the position of the moving target in the current frame.
The position determination module 440 also predicts from the detection result corresponding to an unassociated target to establish the position of a new moving target in the current frame. For example, if an unassociated target in the association result (understood as an unmatched vertex in the bipartite graph) corresponds to a detection result, the corresponding moving target is probably a new moving target entering the monitored area; the position determination module 440 performs trajectory prediction on this new moving target to obtain its position, which is then used as the position of a moving target in the current frame.
The position determination module 440 also eliminates the tracking result corresponding to an unassociated target when the number of consecutive frames in which that target fails to associate meets a preset condition. For example, if an unassociated target in the association result (again, an unmatched vertex in the bipartite graph) corresponds to a tracking result, and no match is found for N consecutive frames (N ≥ 2, i.e., association fails N times in a row), the moving target corresponding to that tracking result has presumably left the monitored area, and the tracking result is eliminated.
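The three rules handled by the position determination module 440 (keep the tracking result for associated targets, spawn a track for an unmatched detection, and eliminate a track after N consecutive association failures) can be sketched as follows. The `Track` class, the `assoc` mapping format, and the `max_misses` value are illustrative assumptions, not structures named in the patent.

```python
class Track:
    """Minimal track record: id, last known position, consecutive miss count."""
    def __init__(self, tid, pos):
        self.tid, self.pos, self.misses = tid, pos, 0

def update_tracks(tracks, predictions, detections, assoc, max_misses=3):
    """Apply an association result. assoc maps track id -> detection index.
    An associated track takes its (more reliable) predicted tracking-result
    position; an unmatched detection starts a new track; a track that fails
    to associate for max_misses consecutive frames is eliminated."""
    next_id = max((t.tid for t in tracks), default=-1) + 1
    for t in tracks:
        if t.tid in assoc:
            t.pos, t.misses = predictions[t.tid], 0   # trust the tracking result
        else:
            t.misses += 1                             # consecutive association failure
    matched = set(assoc.values())
    for i, det in enumerate(detections):
        if i not in matched:                          # probable new target
            tracks.append(Track(next_id, det))
            next_id += 1
    return [t for t in tracks if t.misses < max_misses]
```

Calling this once per frame after the matching step yields the per-frame positions and, over time, the historical trajectory of each target.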
Second embodiment
Fig. 4 shows a schematic flowchart of the multi-object tracking method provided by the second embodiment of the present invention. It should be noted that the multi-object tracking method described in this embodiment is not limited by Fig. 4 or by the particular order described below; its basic principles and resulting technical effects are the same as those of the first embodiment, so for brevity, matters not mentioned in this embodiment can be found in the corresponding parts of the first embodiment. It should be understood that in other embodiments the order of some steps of the multi-object tracking method may be interchanged as actually needed, or some steps may be omitted or deleted. The specific flow shown in Fig. 4 is described in detail below.
Step S101: filter the frame preceding the current frame to predict the position at which each moving target in that previous frame appears in the current frame, thereby obtaining a tracking result.
It will be appreciated that step S101 may be performed by the tracking result acquisition module 410 described above.
Step S102: perform moving-object detection on the current frame to obtain a detection result.
It will be appreciated that step S102 may be performed by the target detection module 420 described above.
Step S103: match and associate the detection result with the tracking result to obtain an association result.
It will be appreciated that step S103 may be performed by the matching and association module 430 described above.
Further, as shown in Fig. 5, step S103 may specifically include the following sub-steps:
Sub-step S1031: convert the detection result and the tracking result into the two vertex sets of a bipartite graph.
It will be appreciated that this sub-step may be performed by the bipartite graph conversion module 431 described above.
Sub-step S1032: find a match for each vertex in the vertex sets to obtain associated targets and unassociated targets.
It will be appreciated that this sub-step may be performed by the matching module 432 described above.
Step S104: determine the positions of the moving targets in the current frame according to the association result.
Specifically: take the tracking result corresponding to an associated target as the position of the corresponding moving target in the current frame; predict from the detection result corresponding to an unassociated target to establish the position of a new moving target in the current frame; and when the number of consecutive frames in which an unassociated target fails to associate meets a preset condition, eliminate the tracking result corresponding to that target.
It will be appreciated that this step may be performed by the position determination module 440 described above.
In summary, the multi-object tracking method and system provided by the embodiments of the present invention filter the frame preceding the current frame to predict the position at which each moving target in that previous frame appears in the current frame, thereby obtaining a tracking result; perform moving-object detection on the current frame to obtain a detection result; and match and associate the detection result with the tracking result, namely by converting the detection result and the tracking result into the vertex sets of a bipartite graph and finding a match for each vertex in those sets, to obtain an association result containing associated targets and unassociated targets. The positions of the moving targets in the current frame are then determined according to the association result. Specifically, for the associated and unassociated targets in the association result: the tracking result corresponding to an associated target is taken as the position of the corresponding moving target in the current frame; the detection result corresponding to an unassociated target is used for prediction to establish the position of a new moving target in the current frame; and when the number of consecutive frames in which the tracking result corresponding to an unassociated target fails to associate meets a preset condition, that tracking result is eliminated. Multi-object tracking is thereby realized. Using filtering and association, the method and system are computationally efficient and consume few resources; by analyzing the spatial positions of the detected moving targets, they can accurately locate the current position of each moving target and obtain its historical motion trajectory, achieving higher accuracy than traditional methods. In addition, the multi-object tracking method and system provided by the embodiments of the present invention can be applied in general surveillance or people-counting scenes, as well as in scenes with higher accuracy requirements, and therefore have broad practicality.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relation or order between these entities or operations. Moreover, the terms "comprising", "including", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element.
The foregoing describes only the preferred embodiments of the present invention and is not intended to limit the invention; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent substitution, improvement, or the like made within the spirit and principles of the present invention shall be included within its scope of protection. It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings.

Claims (10)

1. A multi-object tracking method, characterized in that the multi-object tracking method comprises:
filtering a previous frame image of a current frame image to predict the position at which a moving target in the previous frame image appears in the current frame image, thereby obtaining a tracking result;
performing moving object detection on the current frame image to obtain a detection result;
matching and associating the detection result with the tracking result to obtain an association result; and
determining the position of the moving target in the current frame image according to the association result.
2. The multi-object tracking method according to claim 1, characterized in that the step of matching and associating the detection result with the tracking result to obtain an association result comprises:
converting the detection result and the tracking result respectively into the vertex sets of a bipartite graph; and
finding a matching for each vertex in the vertex sets to obtain associated targets and unassociated targets.
3. The multi-object tracking method according to claim 2, characterized in that the step of determining the position of the moving target in the current frame image according to the association result comprises:
taking the tracking result corresponding to an associated target as the position of the moving target in the current frame image.
4. The multi-object tracking method according to claim 2, characterized in that the step of determining the position of the moving target in the current frame image according to the association result further comprises:
performing prediction on the detection result corresponding to an unassociated target to establish the position of a new moving target in the current frame image.
5. The multi-object tracking method according to claim 2, characterized in that the step of determining the position of the moving target in the current frame image according to the association result further comprises:
when the number of frames for which an unassociated target has consecutively failed to be associated meets a preset condition, eliminating the tracking result corresponding to the unassociated target.
6. A multi-object tracking system, characterized in that the multi-object tracking system comprises:
a tracking result acquisition module, configured to filter a previous frame image of a current frame image to predict the position at which a moving target in the previous frame image appears in the current frame image, thereby obtaining a tracking result;
a target detection module, configured to perform moving object detection on the current frame image to obtain a detection result;
a matching association module, configured to match and associate the detection result with the tracking result to obtain an association result; and
a position determination module, configured to determine the position of the moving target in the current frame image according to the association result.
7. The multi-object tracking system according to claim 6, characterized in that the matching association module comprises:
a bipartite graph conversion module, configured to convert the detection result and the tracking result respectively into the vertex sets of a bipartite graph; and
a matching module, configured to find a matching for each vertex in the vertex sets to obtain associated targets and unassociated targets.
8. The multi-object tracking system according to claim 7, characterized in that the position determination module is configured to take the tracking result corresponding to an associated target as the position of the moving target in the current frame image.
9. The multi-object tracking system according to claim 7, characterized in that the position determination module is further configured to perform prediction on the detection result corresponding to an unassociated target to establish the position of a new moving target in the current frame image.
10. The multi-object tracking system according to claim 7, characterized in that the position determination module is further configured to, when the number of frames for which an unassociated target has consecutively failed to be associated meets a preset condition, eliminate the tracking result corresponding to the unassociated target.
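The bipartite matching of claims 2 and 7 can be sketched as follows: tracking results and detection results form the two vertex sets, edge weights are distances between predicted and detected target positions, and an optimal matching partitions the vertices into associated and unassociated targets. This is a hedged illustration, not the patent's implementation: brute force over permutations is used for clarity (a real system would use the Hungarian algorithm), the cost function and `max_cost` gate are assumptions, and the sketch assumes there are no more tracks than detections.

```python
from itertools import permutations

def associate(track_pts, det_pts, max_cost=50.0):
    """Optimally match tracking results (predicted positions) to detections."""
    def cost(t, d):
        # Edge weight: Euclidean distance between a predicted target
        # position and a detected target position, both given as (x, y).
        return ((t[0] - d[0]) ** 2 + (t[1] - d[1]) ** 2) ** 0.5

    n_t, n_d = len(track_pts), len(det_pts)
    # Brute force over all injective assignments of tracks to detections
    # (assumes n_t <= n_d for brevity; swap the two roles otherwise).
    best_key, best_pairs = (0, float("inf")), []
    for perm in permutations(range(n_d), min(n_t, n_d)):
        pairs = [(i, j) for i, j in zip(range(n_t), perm)
                 if cost(track_pts[i], det_pts[j]) <= max_cost]
        total = sum(cost(track_pts[i], det_pts[j]) for i, j in pairs)
        key = (-len(pairs), total)   # prefer more matches, then lower cost
        if key < best_key:
            best_key, best_pairs = key, pairs
    matched_t = {i for i, _ in best_pairs}
    matched_d = {j for _, j in best_pairs}
    return (best_pairs,                                     # associated pairs
            [i for i in range(n_t) if i not in matched_t],  # unassociated tracks
            [j for j in range(n_d) if j not in matched_d])  # unassociated detections
```

In the terms of claims 3 to 5, each associated pair keeps its tracking result as the target's position in the current frame, each unassociated detection establishes a new moving target, and a tracking result that remains unassociated for enough consecutive frames is eliminated.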
CN201710778477.6A 2017-09-01 2017-09-01 Multi-object tracking method and system Pending CN107516303A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710778477.6A CN107516303A (en) 2017-09-01 2017-09-01 Multi-object tracking method and system


Publications (1)

Publication Number Publication Date
CN107516303A true CN107516303A (en) 2017-12-26

Family

ID=60723773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710778477.6A Pending CN107516303A (en) 2017-09-01 2017-09-01 Multi-object tracking method and system

Country Status (1)

Country Link
CN (1) CN107516303A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101783020A (en) * 2010-03-04 2010-07-21 湖南大学 Video multi-target fast tracking method based on joint probability data association
CN102156863A (en) * 2011-05-16 2011-08-17 天津大学 Cross-camera tracking method for multiple moving targets
US20110216939A1 (en) * 2010-03-03 2011-09-08 Gwangju Institute Of Science And Technology Apparatus and method for tracking target
CN103500456A (en) * 2013-10-22 2014-01-08 北京大学 Object tracking method and equipment based on dynamic Bayes model network
CN104217428A (en) * 2014-08-22 2014-12-17 南京邮电大学 Video monitoring multi-target tracking method for fusion feature matching and data association
EP2840528A2 (en) * 2013-08-20 2015-02-25 Ricoh Company, Ltd. Method and apparatus for tracking object


Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108446622A (en) * 2018-03-14 2018-08-24 海信集团有限公司 Detecting and tracking method and device, the terminal of target object
CN108986138A (en) * 2018-05-24 2018-12-11 北京飞搜科技有限公司 Method for tracking target and equipment
CN109087335A (en) * 2018-07-16 2018-12-25 腾讯科技(深圳)有限公司 A kind of face tracking method, device and storage medium
CN111127508A (en) * 2018-10-31 2020-05-08 杭州海康威视数字技术股份有限公司 Target tracking method and device based on video
CN111127508B (en) * 2018-10-31 2023-05-02 杭州海康威视数字技术股份有限公司 Target tracking method and device based on video
CN109753911A (en) * 2018-12-28 2019-05-14 深圳先进技术研究院 The method and relevant apparatus of the more object tracking abilities of promotion of virtual reality system
CN109753911B (en) * 2018-12-28 2022-11-11 深圳先进技术研究院 Method and related device for improving multi-object tracking capability of virtual reality system
CN109903310A (en) * 2019-01-23 2019-06-18 平安科技(深圳)有限公司 Method for tracking target, device, computer installation and computer storage medium
WO2020151167A1 (en) * 2019-01-23 2020-07-30 平安科技(深圳)有限公司 Target tracking method and device, computer device and readable storage medium
CN109934849A (en) * 2019-03-08 2019-06-25 西北工业大学 Online multi-object tracking method based on track metric learning
CN110569785A (en) * 2019-09-05 2019-12-13 杭州立宸科技有限公司 Face recognition method based on fusion tracking technology
CN110555867A (en) * 2019-09-05 2019-12-10 杭州立宸科技有限公司 Multi-target object tracking method fusing object capturing and identifying technology
CN110555867B (en) * 2019-09-05 2023-07-07 杭州智爱时刻科技有限公司 Multi-target object tracking method integrating object capturing and identifying technology
CN110569785B (en) * 2019-09-05 2023-07-11 杭州智爱时刻科技有限公司 Face recognition method integrating tracking technology
CN110706256A (en) * 2019-09-27 2020-01-17 杭州博雅鸿图视频技术有限公司 Detection tracking algorithm optimization method based on multi-core heterogeneous platform
CN111079525A (en) * 2019-11-05 2020-04-28 阿里巴巴集团控股有限公司 Image processing method, apparatus, system and storage medium
CN111079525B (en) * 2019-11-05 2023-05-30 阿里巴巴集团控股有限公司 Image processing method, device, system and storage medium
CN112639872A (en) * 2020-04-24 2021-04-09 华为技术有限公司 Method and device for difficult mining in target detection
CN112639872B (en) * 2020-04-24 2022-02-11 华为技术有限公司 Method and device for difficult mining in target detection
WO2021232652A1 (en) * 2020-05-22 2021-11-25 北京百度网讯科技有限公司 Target tracking method and apparatus, electronic device, and computer-readable storage medium
JP7375192B2 (en) 2020-05-22 2023-11-07 ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド Target tracking methods, devices, electronic devices, computer readable storage media and computer program products
CN111899275A (en) * 2020-08-12 2020-11-06 中国科学院长春光学精密机械与物理研究所 Target detection tracking method, device and storage medium

Similar Documents

Publication Publication Date Title
CN107516303A (en) Multi-object tracking method and system
Lei et al. Design of a new low-cost unmanned aerial vehicle and vision-based concrete crack inspection method
CN110986969B (en) Map fusion method and device, equipment and storage medium
CN110598559B (en) Method and device for detecting motion direction, computer equipment and storage medium
Leon et al. Video hand gestures recognition using depth camera and lightweight cnn
CN110909701B (en) Pedestrian feature extraction method, device, equipment and medium
US20220128683A1 (en) Method, system and computer program product for intelligent tracking
CN106713862A (en) Tracking monitoring method and apparatus
CN114595124B (en) Time sequence abnormity detection model evaluation method, related device and storage medium
Jing et al. Sports image detection based on FPGA hardware system and particle swarm algorithm
Varshney et al. Human activity recognition by combining external features with accelerometer sensor data using deep learning network model
Islam et al. Applied human action recognition network based on SNSP features
CN110008815A (en) The generation method and device of recognition of face Fusion Model
Cai et al. Multitask learning method for detecting the visual focus of attention of construction workers
CN112861808A (en) Dynamic gesture recognition method and device, computer equipment and readable storage medium
Zhao et al. Trine: Cloud-edge-device cooperated real-time video analysis for household applications
CN116168040A (en) Component direction detection method and device, electronic equipment and readable storage medium
Oztel et al. A hybrid LBP-DCNN based feature extraction method in YOLO: An application for masked face and social distance detection
CN115376203A (en) Data processing method and device
Gupta et al. Markerless tracking and gesture recognition using polar correlation of camera optical flow
CN115278014A (en) Target tracking method, system, computer equipment and readable medium
CN114527456A (en) UWB-based motion trajectory identification method and electronic equipment
Lawal Study on strawberry fruit detection using lightweight algorithm
Zhu et al. Motion-sensor fusion-based gesture recognition and its VLSI architecture design for mobile devices
Sun et al. Values of intelligent alarm system under photoelectric sensor networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171226