CN101877130A - Moving target tracking method based on particle filter under complex scene - Google Patents

Moving target tracking method based on particle filter under complex scene

Info

Publication number
CN101877130A
CN101877130A · CN2009100827973A · CN200910082797A
Authority
CN
China
Prior art keywords
particle
posterior probability
frame
target
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2009100827973A
Other languages
Chinese (zh)
Inventor
张文生
丁欢
张水发
常晓夫
王虎
冯园园
杨名宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN2009100827973A priority Critical patent/CN101877130A/en
Publication of CN101877130A publication Critical patent/CN101877130A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a moving target tracking method based on a particle filter for complex scenes. The particle filter framework is applied to video tracking, and motion information is used at the system observation stage to compute the particle posterior probabilities. The method first subtracts two adjacent frames to obtain a frame-difference image, then combines it with an edge image and converts the result into the posterior probability of each particle through an evaluation function, and finally produces the video tracking result within the particle filter framework. A structured, spatially weighted RGB color distribution is used at the same time as an auxiliary cue, so that the case of a temporarily stationary object, for which no motion information is available, can also be handled. By embedding motion information into the particle filter for video tracking, the proposed method improves both tracking accuracy and tracking robustness.

Description

Moving target tracking method based on particle filter in complex scenes
Technical field
The invention belongs to the fields of digital image content understanding, artificial intelligence, and video surveillance, and specifically relates to a video tracking method based on the particle filter framework.
Background technology
Video object tracking cuts a video stream into a sequence of video images, frame by frame, and finds the target of interest by analyzing the image features in this sequence. Many kinds of video tracking methods are in common use, such as feature-based methods, motion-based methods, and matching-based methods.
Because the purpose of video object tracking is to "find" the target of interest in a continuous image sequence, and because the target can be described by state information such as its shape, size, and position, the tracking problem is equivalent to estimating the target state, and this estimation can be carried out with estimation theory. The state is estimated from observations of the relevant visual information in the images, such as the gray-level information, motion information, and contour information of the target. Since the tracking is purposeful, the target is usually assumed to come with some prior knowledge, such as a gray-level template or a color distribution. From the prior knowledge of the target and the observations of it, a Bayesian probability model can be constructed, the posterior probability of the target state can be solved, and the tracking result is obtained by comparing posterior probabilities.
The particle filter is a practical algorithm for solving the Bayesian probability, also known as conditional probability density propagation, bootstrap filtering, or sequential importance sampling. Its basic idea is to use a set of weighted particles (samples) to approximate the posterior probability density of the system required by the problem. As a nonlinear filtering algorithm based on Bayesian estimation, it has inherent advantages for nonlinear moving target tracking problems. It can be divided into several stages: particle initialization, system state transition, system observation, posterior probability computation, and particle resampling. In the prior art, particle filter tracking mainly uses color information in the system observation and posterior probability computation stages to obtain the particle state posterior probabilities. This approach achieves good tracking results when the color distributions of the target and the background differ greatly and the target is only briefly and partially occluded, but in complex scenes, especially when the background changes strongly, the illumination changes frequently, or the target color is close to the background color, it cannot produce a good estimate.
It can thus be seen that the prior art suffers from low accuracy and poor reliability for moving target tracking in complex environments.
Summary of the invention
To remedy the deficiencies of existing methods and to improve the accuracy and robustness of moving target tracking, the object of the invention is to provide a moving target tracking method based on a particle filter for complex scenes.
To achieve this object, the invention provides a moving target tracking method based on a particle filter for complex scenes. The method jointly uses the color information and the motion information of the moving target in the video images and can handle moving target tracking when the target is occluded or temporarily disappears, when leaves rustle in the wind, and in other complex scenes. It is based on the particle filter theoretical framework and applies particle filter theory to moving target tracking in video sequences. Its steps are as follows (a schematic outline of the resulting per-frame tracking loop is sketched after the steps):
Step S1: initialize the particles under the particle filter framework, that is, use template matching to find the initial position of the moving target in the initial image, obtain its motion information, set each particle with this information as the initial system state, and thereby complete the particle initialization;
Step S2: under the particle filter framework, each particle takes the motion information of the tracked target as its system state and propagates according to a first-order autoregressive equation, that is, the system state of the next stage is the system state of the current stage plus a random perturbation, which transfers the particle system state;
Step S3: under the particle filter framework, observe the system state of each particle, use the structured weighted RGB color histogram distribution, and compute the posterior probability of each particle based on color information;
Step S4: under the particle filter framework, obtain the motion information of each particle from the frame-difference image and the edge image of that particle, then select an evaluation function to transform the motion information into a probability density distribution, and obtain the posterior probability distribution of each particle based on motion information;
Step S5: under the particle filter framework, compute the posterior probability by fusing the particle posterior probability distributions based on the color information and on the motion information of the moving target into a final joint particle posterior probability distribution; the influence of the color information and the motion information on the final particle posterior probability distribution is determined by an adjustable parameter;
Step S6: use the posterior probability of each particle, take either the system state of the particle with the maximum posterior probability or the mathematical expectation of all particle system states as the estimate of the system state, determine the motion information of the tracked target, and complete the moving target tracking;
Step S7: after the particles have undergone several system transitions, the system states of some particles are far from the tracked target; resample the particles, discard the particles whose posterior probability is too small, and return to step S2.
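The outline below is a minimal, hedged sketch of how steps S1–S7 fit together as one tracking loop per frame; the stage implementations are passed in as callables with schematic names and signatures, and the convex combination with parameter alpha in step S5 is an assumption of this sketch, since the text only states that an adjustable parameter balances the two cues.
```python
# Schematic sketch of the per-frame tracking loop formed by steps S1-S7.
# The stage implementations are supplied as callables; their concrete
# definitions are only sketched in the detailed embodiment further below.
import numpy as np

def track(frames, template, init_fn, propagate_fn, color_fn, motion_fn,
          resample_fn, alpha=0.5):
    particles = init_fn(frames[0], template)                        # S1: template matching
    prev, estimates = frames[0], []
    for frame in frames[1:]:
        particles = propagate_fn(particles)                         # S2: x_t = x_{t-1} + B*w
        p_color = color_fn(frame, particles, template)              # S3: color-based posterior
        p_motion = motion_fn(frame, prev, particles)                # S4: motion-based posterior
        weights = alpha * p_color + (1.0 - alpha) * p_motion        # S5: fusion (assumed convex mix)
        weights = weights / weights.sum()
        estimates.append((weights[:, None] * particles).sum(axis=0))  # S6: weighted estimate
        particles = resample_fn(particles, weights)                 # S7: resampling
        prev = frame
    return estimates
```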
Preferably, the step of obtaining the particle posterior probability distribution based on color information is as follows: first compute the structured weighted RGB color histogram of the tracked target template, then compute the structured weighted RGB color histogram of each particle, compare the two to obtain the color distribution distance between the particle image and the moving target template image, and convert this distance into a probability distribution by Gaussianization and normalization, which serves as the particle posterior probability distribution based on color information.
Preferably, the step of obtaining the structured weighted RGB color histogram is as follows: divide the RGB color space into a series of color bins according to the RGB color values, then determine, from the position and the color value of each pixel, the contribution of that pixel to the structured weighted RGB color histogram; by traversing every pixel in the region of interest of the image, the structured weighted RGB color histogram distribution is finally obtained.
Preferably, the step of computing the particle posterior probability distribution based on motion information is as follows: perform a pixel-level AND operation on the frame-difference image and the edge image to obtain a motion edge image, measure the similarity of each particle by the amount of motion edge information contained in its particle image, and convert the particle similarities into a probability distribution with an evaluation function, which serves as the particle posterior probability distribution based on motion information.
Preferably, the edge image reflects the edge information of the current frame, including both the edges of the moving target and the static edges, while the frame-difference image reflects the edges of the moving target and the locations where the background color changes strongly; combining the edge image with the frame-difference image yields the edge information of the moving target, which is the motion information of the tracked target.
Beneficial effects:
As can be seen from the above, the prior art uses only color information and cannot adapt to moving target tracking in complex scenes. The method provided by the embodiment of the invention, based on the particle filter theoretical framework, applies particle filter theory to moving target tracking in video sequences of complex scenes; it uses color information on the one hand and the motion information of the object on the other. It can handle moving target tracking when the target is occluded or temporarily disappears, when leaves rustle in the wind, and in other complex scenes, and it can effectively improve the accuracy and robustness of moving target tracking in complex scenes, especially when the background color changes strongly, the illumination changes frequently, or the target color is close to the background color.
Description of drawings
Fig. 1 is a flowchart of the principle of the moving target tracking method based on a particle filter in complex scenes according to an embodiment of the invention;
Fig. 2 is a flowchart of the principle of computing the particle posterior probability based on color information according to an embodiment of the invention;
Fig. 3 shows examples of frame-difference and edge images of a video sequence image and of particles;
Fig. 4 is a flowchart of the principle of computing the particle posterior probability based on motion information according to an embodiment of the invention.
Embodiment
The present invention is further described in detail below with reference to the drawings and embodiments.
The main content of the method is as follows: the moving target is represented by its motion state (position, size, etc.), and this motion state is estimated under the particle filter theoretical framework. To obtain good tracking results in complex environments, color information and motion information are used simultaneously at the system observation stage of the particle filter to compute the posterior probabilities of the particles.
As shown in Fig. 1, the moving target tracking method based on a particle filter in complex scenes uses the motion information of the moving object, such as its position and scale, as the system state of each particle. It passes through the stages of particle initialization, particle system state transition, computation of the particle posterior probability based on color information, computation of the particle posterior probability based on motion information, computation of the final particle posterior probability, system state estimation, and particle resampling, and obtains an estimate of the particle system state to realize the moving target tracking.
1. Particle initialization stage
The particle initialization stage includes setting the initial system state of the particles and determining the number of particles. In the first frame of the video sequence, the position of the moving target is found and used to initialize the system state of the particles. This embodiment uses color template matching to find the initial particle position: the RGB color template of the moving target to be tracked is first obtained from prior knowledge, the region with the greatest similarity to the template is then searched for in the initial frame and taken as the initial position of the target, and this initial position information is set on every particle as the initial system state.
The number of particles is related to the dimensionality of the particle system state. Too many particles lead to an excessive amount of computation, which cannot guarantee the real-time requirement of moving target tracking; too few particles cannot cover the system state space well, which degrades the accuracy of the tracking results. In this embodiment, the system state of a particle consists of the horizontal and vertical coordinates of the target in the image frame, and a suitable value for the number of particles is chosen by repeated experiments so that the accuracy of the moving target tracking is guaranteed while the computation speed is taken into account.
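As an illustration of this initialization, the short sketch below finds the best template match in the first frame with OpenCV and spreads the particles around it; the window parameters, the number of particles, the small jitter, and the use of normalized cross-correlation are assumptions of this sketch, not requirements of the embodiment.
```python
# Sketch: locate the target in the first frame by color template matching
# and initialize all particles at (or near) that position.
import cv2
import numpy as np

def init_particles_by_template(first_frame, template, n_particles=200, jitter=2.0):
    # Normalized cross-correlation over the color image (assumed matching score).
    score = cv2.matchTemplate(first_frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(score)          # best-matching top-left corner
    x0 = max_loc[0] + template.shape[1] / 2.0        # target center, horizontal
    y0 = max_loc[1] + template.shape[0] / 2.0        # target center, vertical
    # Each particle state is the (x, y) position; a small jitter keeps the
    # particle set from being fully degenerate at the start (an assumption).
    particles = np.tile([x0, y0], (n_particles, 1)).astype(float)
    particles += np.random.normal(scale=jitter, size=particles.shape)
    return particles
```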
2. Particle system state transition stage
The particle system state transition stage, also called particle propagation, refers to the time update of the system state of the moving target. Take solving the posterior probability of the target state at time t as an example. Because the autonomous motion trend of the moving target is usually quite apparent, particle propagation can be treated as a random motion process, that is, it obeys a first-order ARP (Auto-Regressive Process) equation: x_t = A·x_{t-1} + B·w_{t-1}, where x_t is the system state of the particle at time t, w_t is the normalized noise at time t, and A and B are constants. Clearly, when A = 1, the particle system state at time t is the state at time t-1 plus a noise term. When the propagation of the target system state involves velocity or acceleration, a higher-order ARP model should be adopted.
In this embodiment, A = 1 and w_t is a random number in (0, 1), which gives the particle system state transition equation x_t = x_{t-1} + B·w_{t-1}, that is, the system state of a particle at the current time is its system state at the previous time plus a random diffusion term, where B is called the diffusion coefficient. When B is large, the particles spread over a larger range, and more particles and a larger amount of computation are needed to guarantee particle coverage and result accuracy; when B is small, the particles spread over a smaller range, and tracking failure easily occurs when the moving target moves quickly. To obtain a reasonable value of B, this embodiment changes the value of B according to the target speed: when the target moves fast, B takes a larger value, and otherwise B takes a smaller value.
It should be noted that, in the particle system transition stage, the transfer of the particle system state is independent of the observations; it only "hypothesizes" how the moving target state propagates according to prior knowledge, without knowing whether the propagation of each particle is reasonable. This must be verified in later steps, where the posterior probability of each particle is obtained through the system observation process.
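A minimal sketch of this first-order propagation with a speed-dependent diffusion coefficient follows; the particular mapping from target speed to B (a clipped linear scaling) and the symmetrized noise are assumptions of the sketch, since the embodiment only states that B grows with the target speed and that w_t is a random number in (0, 1).
```python
# Sketch: first-order autoregressive propagation x_t = x_{t-1} + B * w_{t-1},
# with the diffusion coefficient B adapted to the current target speed.
import numpy as np

def propagate(particles, target_speed=0.0, b_min=2.0, b_max=10.0, speed_scale=1.0):
    # Larger target speed -> larger diffusion coefficient B (bounds are assumptions).
    B = np.clip(b_min + speed_scale * target_speed, b_min, b_max)
    # The embodiment draws w from (0, 1); it is symmetrized here so particles
    # can diffuse in both directions (an assumption of this sketch).
    w = np.random.uniform(-1.0, 1.0, size=np.asarray(particles).shape)
    return np.asarray(particles, dtype=float) + B * w
```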
3. Computing the particle posterior probability based on color information
Computing the particle posterior probability based on color information is, under the particle filter theoretical framework, a system observation that uses the color information of the target. After the target state has been "hypothesized" by the propagation of the previous stage, the observation obtained at the current time is used to verify it; the two stages cooperate and together amount to a "hypothesize and verify" process. The so-called observation is the video image obtained at the current time. Using the observation to verify the result of the system state transfer is in fact a similarity measurement process. Because each particle represents one possibility of the target state, the purpose of the system observation is to give large weights to the particles that are close to the actual situation and small weights to the particles that differ greatly from it. The weight of a particle serves as its posterior probability in the estimation of the final particle state. Since the currently observed video image can be described by different features, such as gray-level features, color features, contour features, and texture features, system observation results can be obtained from various kinds of feature information. This embodiment uses color information and motion information simultaneously to compute the posterior probabilities.
The color information of an image can be obtained in many ways, for example by template matching between the particle image and the moving target template, computing the difference of the color values at corresponding positions: the smaller the difference, the closer the particle image is to the moving target and the larger the posterior probability of this particle. In this embodiment, the color information of the image is computed with a structured weighted RGB color histogram.
The so-called structured weighted RGB color histogram computes the RGB distribution information of the image while adding spatial distribution information. First, the RGB color space is divided into a series of color bins according to the values of R, G, and B; then each pixel is counted into the color distribution with a weight that depends on its position relative to the image center, which yields the weighted color distribution of the image. Pixels near the image center make a larger contribution to the histogram, while the contribution of pixels towards the image border can essentially be neglected, so the structured weighted color histogram focuses more on the central region of the image. The advantage of using a color histogram is that it emphasizes the color distribution of the image without rigidly fixing the positions of the pixels, which makes it convenient to handle a moving target that is a non-rigid object.
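The following sketch illustrates one way such a structured, center-weighted RGB histogram could be computed; the bin count (8 per channel) and the Epanechnikov-style radial weight are assumptions of the sketch rather than values fixed by the embodiment.
```python
# Sketch: structured weighted RGB color histogram of an image patch.
# Pixels near the patch center get large weights, pixels near the border
# get weights close to zero.
import numpy as np

def structured_rgb_histogram(patch, bins_per_channel=8):
    h, w, _ = patch.shape
    # Radial weight: 1 at the center, falling to 0 at the patch border
    # (Epanechnikov-like profile, an assumption of this sketch).
    ys, xs = np.mgrid[0:h, 0:w]
    r2 = ((ys - h / 2.0) / (h / 2.0)) ** 2 + ((xs - w / 2.0) / (w / 2.0)) ** 2
    weights = np.clip(1.0 - r2, 0.0, None).ravel()
    # Quantize each channel into bins_per_channel bins and build a joint bin index.
    q = (patch.reshape(-1, 3).astype(np.int64) * bins_per_channel) // 256
    idx = (q[:, 0] * bins_per_channel + q[:, 1]) * bins_per_channel + q[:, 2]
    hist = np.bincount(idx, weights=weights, minlength=bins_per_channel ** 3)
    return hist / (hist.sum() + 1e-12)   # normalized weighted color distribution
```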
The steps for computing the particle posterior probability based on color information are shown in Fig. 2. First, the structured weighted RGB color histogram distribution of the moving target template and the structured weighted RGB color histogram distribution of the image represented by each particle are computed as described above. Then, for each particle image, the distance between its histogram distribution and the template distribution is computed. To convert this particle distance into the form of a posterior probability, the distance is Gaussianized and normalized, which yields the particle posterior probability based on color information.
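As an illustration, the sketch below turns per-particle histogram distances into a normalized color-based posterior; the Bhattacharyya distance and the Gaussian bandwidth sigma are assumptions of the sketch, since the embodiment only requires a distance followed by Gaussianization and normalization.
```python
# Sketch: convert histogram distances between each particle image and the
# target template into a normalized color-based posterior distribution.
import numpy as np

def bhattacharyya_distance(p, q):
    # Distance between two normalized histograms (assumed distance measure).
    return np.sqrt(max(0.0, 1.0 - np.sum(np.sqrt(p * q))))

def color_posterior(particle_hists, template_hist, sigma=0.2):
    d = np.array([bhattacharyya_distance(h, template_hist) for h in particle_hists])
    # "Gaussianization": a small distance maps to a large probability.
    p = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
    return p / (p.sum() + 1e-12)          # normalization over all particles
```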
The particle posterior probability distribution based on color information obtained by the above method gives fairly good results when the target is occluded or disappears briefly, and it can also be applied to the motion tracking of non-rigid objects. However, color information alone cannot handle complex scenes, especially when the background changes strongly, the illumination changes frequently, or the target color is close to the background color, so this embodiment also uses motion information to compute the particle posterior probability.
4. Computing the particle posterior probability based on motion information
To solve the moving target tracking problem in complex scenes, the embodiment of the invention introduces motion information as an observation at the system observation stage of the particle filter framework and uses it to compute the particle posterior probability. The motion information of the target can be obtained by several methods, for example by modeling the video background and extracting the moving foreground of the current video image as the moving target, or by building an optical flow field model of the video image and extracting the motion information of the object from it. In the embodiment of the invention, the frame-difference image and the edge image are used jointly to obtain the motion information of the moving target.
The frame-difference image is obtained by subtracting the adjacent previous video frame from the current video frame and is then binarized. Fig. 3-1 shows an example original image of the video sequence, and Fig. 3-2 shows the frame-difference image obtained from it. As can be seen, the non-zero pixels in the binarized frame-difference image describe the regions where the color information differs strongly between two adjacent frames; these regions mainly comprise the edges of the moving object, regions with large background changes, and noise. Of these three classes, only the first, the edges of the moving object, is of reference value for computing the particle posterior probability. The edges of the moving object therefore need to be separated from the frame-difference image.
The edge image shows the edges of the current video frame, including the edges of the moving object and the static edges in the image, as shown in the edge image of Fig. 3-3. Combined with the frame-difference image described above, the pixels that are non-zero in both images can be regarded as the edges of the moving object in the video image, which is exactly the motion information needed in the system observation. Fig. 3-4 shows the frame-difference and edge images of several different particles. As can be seen, the more motion information a particle contains, the closer the particle is to the target to be tracked; the particle posterior probability based on motion information can therefore be computed from how much motion information each particle contains.
The steps for computing the particle posterior probability based on motion information are shown in Fig. 4. First, the current video frame and the adjacent previous frame are subtracted and the result is binarized to obtain the frame-difference image. Then the current image is processed with digital morphology to obtain the edge image. Next, a pixel-level AND operation is applied to the frame-difference image and the edge image, which finally yields the motion edge image reflecting the motion information of the object in the image. Each particle is then processed: the number of motion edge pixels it contains is counted, and a suitable evaluation function maps this number into the interval from 0 to 1, that is, converts it into the form of a probability distribution. Finally, the probability distributions of all particles are normalized to obtain the particle posterior probability based on motion information. Computing the motion-based posterior probabilities of the particles in Fig. 3-4 by the above method, in order from left to right, gives 0.0192, 0.0032, and 0.0001, respectively. As can be seen, the above method uses the motion information in the image and objectively reflects the posterior probability of the particles.
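A compact sketch of this computation with OpenCV is given below; the binarization threshold, the morphological-gradient edge extractor, the particle window size, and the saturating evaluation function used to map edge counts into (0, 1) are assumptions of the sketch, not values prescribed by the embodiment.
```python
# Sketch: motion-based particle posterior from a frame-difference image
# ANDed with an edge image of the current frame.
import cv2
import numpy as np

def motion_posterior(curr, prev, particles, half=(16, 16), diff_thresh=25, scale=50.0):
    gray_curr = cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY)
    gray_prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    # Binarized frame-difference image between adjacent frames.
    _, diff = cv2.threshold(cv2.absdiff(gray_curr, gray_prev),
                            diff_thresh, 255, cv2.THRESH_BINARY)
    # Edge image of the current frame via a morphological gradient (assumed operator).
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    grad = cv2.morphologyEx(gray_curr, cv2.MORPH_GRADIENT, kernel)
    _, edges = cv2.threshold(grad, diff_thresh, 255, cv2.THRESH_BINARY)
    # Pixel-level AND keeps only the moving edges.
    motion_edges = cv2.bitwise_and(diff, edges)

    h, w = motion_edges.shape
    probs = np.zeros(len(particles))
    for i, (x, y) in enumerate(particles):
        x0, x1 = int(max(0, x - half[0])), int(min(w, x + half[0]))
        y0, y1 = int(max(0, y - half[1])), int(min(h, y + half[1]))
        count = np.count_nonzero(motion_edges[y0:y1, x0:x1])
        probs[i] = 1.0 - np.exp(-count / scale)    # evaluation function into (0, 1)
    return probs / (probs.sum() + 1e-12)           # normalize over all particles
```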
A particle system observation based on motion information can solve the moving target tracking problem in complex scenes, but its precondition is that the moving target produces motion information, in other words, that the moving target keeps moving. When the moving target is stationary, the above method cannot obtain a correct result. To avoid this situation, the embodiment of the invention uses color information and motion information together and combines the advantages of both to handle the moving target tracking problem in complex environments.
5. Computing the final particle posterior probability
In the embodiment of the invention, the computations of the above two stages yield the particle posterior probability based on color information and the particle posterior probability based on motion information. The two kinds of information are now combined to obtain the final joint particle posterior probability, and the influence of the two kinds of information on the tracking result can be changed by adjusting a parameter, so as to adapt to different complex environments.
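The embodiment does not fix the exact fusion rule, only that an adjustable parameter balances the two cues; the sketch below therefore shows one plausible convex combination, where alpha is a hypothetical parameter of this sketch.
```python
# Sketch: fuse the color-based and motion-based posteriors into the final
# joint posterior; alpha controls the relative influence of the two cues.
import numpy as np

def fuse_posteriors(p_color, p_motion, alpha=0.5):
    p = alpha * np.asarray(p_color) + (1.0 - alpha) * np.asarray(p_motion)
    return p / (p.sum() + 1e-12)
```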
6. Obtaining the system state estimate
After the posterior probability of each particle has been computed, the estimate of the system state can be obtained. Two criteria are generally used.
The first is the maximum a posteriori criterion (MAP), which takes the state of the particle with the maximum posterior probability as the final system state. This criterion is very intuitive: "the most similar is the most probable".
The other criterion is the weighted criterion, in which each particle determines its share of the state estimate according to its own posterior probability: "the most similar takes the largest share". This criterion fits the particle filter framework better; it amounts to taking the mathematical expectation of the particle system states, with the posterior probabilities of the particles serving as the probabilities in this expectation.
In the embodiment of the invention, the weighted criterion is used to obtain the estimate of the system state. Let the system state of particle i be x_i and express its posterior probability as the weight w_i; the final system state is then x_opt = Σ_i w_i·x_i.
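In practice this weighted estimate is a one-liner; the sketch below assumes the weights are already normalized and that each particle state is a position vector, as in the embodiment.
```python
# Sketch: weighted state estimate x_opt = sum_i w_i * x_i.
import numpy as np

def estimate_state(particles, weights):
    particles = np.asarray(particles, dtype=float)   # shape (N, state_dim)
    weights = np.asarray(weights, dtype=float)       # shape (N,), normalized
    return (weights[:, None] * particles).sum(axis=0)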
7. Particle resampling
A significant drawback of the particle filter is the particle degeneracy problem. Degeneracy means that, during propagation, the weights of the particles that depart from the true target state become smaller and smaller, until finally only a few particles carry large weights, and a large amount of computation is wasted on particles with small weights. Although these small-weight particles also represent a possibility of the target state, when that possibility is too small, these particles should be discarded and attention should be focused on the particles with larger possibilities. The resampling technique can alleviate this problem to some extent: the particles whose weights are too small are discarded, and new particles are derived from the particles with larger weights.
A suitable measure of the degeneracy phenomenon is the effective sample size, defined as N_eff = N_s / (1 + Var(w_k^{*i})), where w_k^{*i} is the posterior probability density of particle i at time k and N_s is the number of particles. Because this value cannot be computed exactly, the effective sample size is obtained by approximate estimation, commonly as N̂_eff = 1 / Σ_i (w_k^i)^2 with the normalized particle weights w_k^i. A small effective sample size indicates a severe degeneracy phenomenon, and resampling is then required.
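A minimal sketch of this check together with one resampling scheme follows; systematic resampling and the threshold ratio are assumptions of the sketch, since the embodiment does not name a specific resampling algorithm or threshold.
```python
# Sketch: resample when the (approximate) effective sample size is small.
import numpy as np

def effective_sample_size(weights):
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)            # common approximation of N_eff

def systematic_resample(particles, weights):
    n = len(weights)
    positions = (np.arange(n) + np.random.uniform()) / n
    cumulative = np.cumsum(weights / np.sum(weights))
    idx = np.minimum(np.searchsorted(cumulative, positions), n - 1)
    return particles[idx].copy()            # low-weight particles are dropped,
                                            # high-weight particles are duplicated

def maybe_resample(particles, weights, threshold_ratio=0.5):
    if effective_sample_size(weights) < threshold_ratio * len(weights):
        particles = systematic_resample(np.asarray(particles), np.asarray(weights))
        weights = np.full(len(weights), 1.0 / len(weights))   # reset weights
    return particles, weights
```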
The embodiment of the invention can be used for moving target tracking in complex scenes. It has the advantages of color-based video motion tracking and can keep the tracking correct when the target is partially occluded or disappears briefly; at the same time it can also use the motion information of the target to solve the tracking problem when the video background changes violently, the illumination changes strongly, or the target color is close to the background color. Experiments prove that the method described in the embodiment of the invention achieves good moving target tracking results in complex scenes, for example when leaves rustle in the wind or when the illumination in the video scene is rather dim.
The above is only an embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that a person familiar with this technology can conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention.

Claims (5)

1. A moving target tracking method based on a particle filter in complex scenes, characterized in that the method jointly uses the color information and the motion information of the moving object in the video sequence and can handle moving target tracking when the target is occluded or temporarily disappears, when leaves rustle in the wind, and in other complex scenes; it is based on the particle filter theoretical framework and applies particle filter theory to moving target tracking in video sequences, with the following steps:
    Step S1: initialize the particles under the particle filter framework, that is, use template matching to find the initial position of the moving target in the initial image, obtain its motion information, set each particle with this information as the initial system state, and thereby complete the particle initialization;
    Step S2: under the particle filter framework, each particle takes the motion information of the tracked moving target as its system state and propagates according to a first-order autoregressive equation, that is, the system state of the next stage is the system state of the current stage plus a random perturbation, which transfers the particle system state;
    Step S3: under the particle filter framework, observe the system state of each particle, use the structured weighted RGB color histogram distribution, and compute the posterior probability of each particle based on color information;
    Step S4: under the particle filter framework, obtain the motion information of each particle from the frame-difference image and the edge image of that particle, then select an evaluation function to transform the motion information into a probability density distribution, and obtain the posterior probability distribution of each particle based on motion information;
    Step S5: under the particle filter framework, compute the posterior probability by fusing the particle posterior probability distributions based on the color information and on the motion information of the moving target into a final joint particle posterior probability distribution; the influence of the color information and the motion information on the final particle posterior probability distribution is determined by an adjustable parameter;
    Step S6: use the posterior probability of each particle, take either the system state of the particle with the maximum posterior probability or the mathematical expectation of all particle system states as the estimate of the system state, determine the motion information of the tracked target, and complete the moving target tracking;
    Step S7: after the particles have undergone several system transitions, the system states of some particles are far from the tracked target; resample the particles, discard the particles whose posterior probability is too small, and return to step S2.
  2. The moving target tracking method according to claim 1, characterized in that the step of obtaining the particle posterior probability distribution based on color information is as follows: first compute the structured weighted RGB color histogram of the tracked target template, then compute the structured weighted RGB color histogram of each particle, compare the two to obtain the color distribution distance between the particle image and the moving target template image, and convert this distance into a probability distribution by Gaussianization and normalization, which serves as the particle posterior probability distribution based on color information.
  3. The moving target tracking method according to claim 2, characterized in that the step of obtaining the structured weighted RGB color histogram is as follows: divide the RGB color space into a series of color bins according to the RGB color values, then determine, from the position and the color value of each pixel, the contribution of that pixel to the structured weighted RGB color histogram; by traversing every pixel in the region of interest of the image, the structured weighted RGB color histogram distribution is finally obtained.
  4. The moving target tracking method according to claim 1, characterized in that the step of computing the particle posterior probability distribution based on motion information is as follows: perform a pixel-level AND operation on the frame-difference image and the edge image to obtain a motion edge image, measure the similarity of each particle by the amount of motion edge information contained in its particle image, and convert the particle similarities into a probability distribution with an evaluation function, which serves as the particle posterior probability distribution based on motion information.
  5. The moving target tracking method according to claim 1, characterized in that the edge image reflects the edge information of the current frame, including the edges of the moving target and the static edges, while the frame-difference image reflects the edges of the moving target and the locations where the background color changes strongly; combining the edge image with the frame-difference image yields the edge information of the moving target, which is the motion information of the tracked target.
CN2009100827973A 2009-04-29 2009-04-29 Moving target tracking method based on particle filter under complex scene Pending CN101877130A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100827973A CN101877130A (en) 2009-04-29 2009-04-29 Moving target tracking method based on particle filter under complex scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009100827973A CN101877130A (en) 2009-04-29 2009-04-29 Moving target tracking method based on particle filter under complex scene

Publications (1)

Publication Number Publication Date
CN101877130A true CN101877130A (en) 2010-11-03

Family

ID=43019675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100827973A Pending CN101877130A (en) 2009-04-29 2009-04-29 Moving target tracking method based on particle filter under complex scene

Country Status (1)

Country Link
CN (1) CN101877130A (en)


Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102063625B (en) * 2010-12-10 2012-12-26 浙江大学 Improved particle filtering method for multi-target tracking under multiple viewing angles
CN102063625A (en) * 2010-12-10 2011-05-18 浙江大学 Improved particle filtering method for multi-target tracking under multiple viewing angles
CN102184548A (en) * 2011-04-22 2011-09-14 浙江工业大学 Video moving object tracking method based on cumulative histogram particle filtering
US9291562B2 (en) 2011-11-15 2016-03-22 Max-Planck-Gesellschaft Zur Foerderung Der Wissenschaften E.V. Method and apparatus for tracking a particle, particularly a single molecule, in a sample
CN103930769A (en) * 2011-11-15 2014-07-16 马克思-普朗克科学促进协会 Method of and apparatus for tracking a particle, particularly a single molecule, in a sample
CN103020991B (en) * 2012-12-26 2015-11-18 中国科学技术大学 The method and system of moving target perception in a kind of video scene
CN103020991A (en) * 2012-12-26 2013-04-03 中国科学技术大学 Method and system for sensing moving objects in video scene
CN104200226A (en) * 2014-09-01 2014-12-10 西安电子科技大学 Particle filtering target tracking method based on machine learning
CN104200226B (en) * 2014-09-01 2017-08-25 西安电子科技大学 Particle filter method for tracking target based on machine learning
CN106228576A (en) * 2016-07-27 2016-12-14 潘燕 For processing the system of image for target following
CN106296731A (en) * 2016-07-27 2017-01-04 潘燕 A kind of target vehicle video frequency following system under complex scene
CN106780539B (en) * 2016-11-30 2019-08-20 航天科工智能机器人有限责任公司 Robot vision tracking
CN106780539A (en) * 2016-11-30 2017-05-31 航天科工智能机器人有限责任公司 Robot vision tracking
CN108257156A (en) * 2018-01-24 2018-07-06 清华大学深圳研究生院 A kind of method of the automatic tracing target object from video
CN108921872A (en) * 2018-05-15 2018-11-30 南京理工大学 A kind of robustness visual target tracking method suitable for long-range tracking
CN108921872B (en) * 2018-05-15 2022-02-01 南京理工大学 Robust visual target tracking method suitable for long-range tracking
CN108829248A (en) * 2018-06-01 2018-11-16 中国科学院软件研究所 A kind of mobile target selecting method and system based on the correction of user's presentation model
CN108829248B (en) * 2018-06-01 2020-11-20 中国科学院软件研究所 Moving target selection method and system based on user performance model correction
CN109696908A (en) * 2019-01-18 2019-04-30 南方科技大学 Robot and flight path setting method and system thereof
CN109696908B (en) * 2019-01-18 2022-06-21 南方科技大学 Robot and flight path setting method and system thereof
CN112102356A (en) * 2019-06-18 2020-12-18 北京七鑫易维科技有限公司 Target tracking method and device, terminal equipment and storage medium
CN112102356B (en) * 2019-06-18 2024-07-02 北京七鑫易维科技有限公司 Target tracking method, device, terminal equipment and storage medium
CN110717414A (en) * 2019-09-24 2020-01-21 青岛海信网络科技股份有限公司 Target detection tracking method, device and equipment
CN110717414B (en) * 2019-09-24 2023-01-03 青岛海信网络科技股份有限公司 Target detection tracking method, device and equipment
CN113487474A (en) * 2021-07-02 2021-10-08 杭州小影创新科技股份有限公司 Content-related GPU real-time particle special effect method

Similar Documents

Publication Publication Date Title
CN101877130A (en) Moving target tracking method based on particle filter under complex scene
WO2021208275A1 (en) Traffic video background modelling method and system
US7133537B1 (en) Method and apparatus for tracking a moving object
CN101923718B (en) Optimization method of visual target tracking method based on particle filtering and optical flow vector
CN104616290A (en) Target detection algorithm in combination of statistical matrix model and adaptive threshold
CN102184551A (en) Automatic target tracking method and system by combining multi-characteristic matching and particle filtering
CN102663362B (en) Moving target detection method based on gray features
CN102142085B (en) Robust tracking method for moving flame target in forest region monitoring video
EP2013817A2 (en) Video segmentation using statistical pixel modeling
CN101957997A (en) Regional average value kernel density estimation-based moving target detecting method in dynamic scene
CN103262119A (en) Method and system for segmenting an image
Zhao et al. Deep fully convolutional regression networks for single image haze removal
US20210088441A1 (en) Deposit detection device and deposit detection method
CN109712247B (en) Live-action training system based on mixed reality technology
CN105513080B (en) A kind of infrared image target Salience estimation
CN103258332A (en) Moving object detection method resisting illumination variation
CN101923719A (en) Particle filter and light stream vector-based video target tracking method
Meng et al. CORNet: Context-based ordinal regression network for monocular depth estimation
CN116071283B (en) Three-dimensional point cloud image fusion method based on computer vision
CN103049340A (en) Image super-resolution reconstruction method of visual vocabularies and based on texture context constraint
CN108615241A (en) A kind of quick estimation method of human posture based on light stream
CN113763427A (en) Multi-target tracking method based on coarse-fine shielding processing
CN110363197B (en) Video region of interest extraction method based on improved visual background extraction model
Wu et al. Overview of video-based vehicle detection technologies
CN104142730B (en) A kind of method that gesture tracking result is mapped to mouse event

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20101103