CN103632382A - Compressive sensing-based real-time multi-scale target tracking method - Google Patents

Compressive sensing-based real-time multi-scale target tracking method

Info

Publication number
CN103632382A
CN103632382A (Application CN201310700915.9A)
Authority
CN
China
Prior art keywords
target
particle
scale
sample
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310700915.9A
Other languages
Chinese (zh)
Other versions
CN103632382B (en)
Inventor
孙继平
贾倪
伍云霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology Beijing CUMTB
Original Assignee
China University of Mining and Technology Beijing CUMTB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology Beijing CUMTB filed Critical China University of Mining and Technology Beijing CUMTB
Priority to CN201310700915.9A priority Critical patent/CN103632382B/en
Publication of CN103632382A publication Critical patent/CN103632382A/en
Application granted granted Critical
Publication of CN103632382B publication Critical patent/CN103632382B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a compressive sensing-based real-time multi-scale target tracking method. Each sample is modeled by extracting normalized rectangle features from the sampled image region; these features are robust to changes in target scale during tracking. Because the normalized rectangle features are very high-dimensional, compressive sensing is used to compress them: the feature vector is compressed without changing the extraction scale, and the integral image greatly reduces the computational complexity, so the requirement of real-time tracking can be met. The compressed feature vector of each sample is classified by a Naive Bayes classifier to determine the most probable target position; the classifier response is used to estimate the particle weights and to resample the particles, which prevents degeneration of the particle set, and a second-order model that takes the target's velocity into account is used to estimate and predict the particle states. The method tracks targets in video images in real time with high accuracy and low computational complexity, and the tracking frame changes with the target scale in real time, so the requirements of practical tracking applications can be met.

Description

Real-time multi-scale target tracking method based on compressive sensing
Technical field
The present invention relates to a real-time multi-scale target tracking method based on compressive sensing, and belongs to the technical field of computer vision.
Background art
Moving-target tracking in video images is one of the most important problems in computer vision and is widely applied in object surveillance, motion detection and recognition, and medical imaging. The tracking task is to estimate the state of a target in subsequent video frames given its known state in the leading frames, and it is usually described as a dynamic state estimation problem. Depending on the application, the target state usually consists of kinematic quantities such as position coordinates and target scale. Although researchers at home and abroad have proposed many solutions to the video target tracking problem over years of study, reliable real-time video target tracking still faces many challenges, because many factors affect tracking: changes in target pose, appearance changes caused by illumination variation, target occlusion, target scale changes, deformation of non-rigid targets, motion blur, fast target motion, target rotation, background clutter, and so on.
Generally speaking, video target tracking methods can be divided into two classes: statistical tracking methods and deterministic tracking methods. Deterministic methods obtain the target position in the current frame by maximizing a similarity measure between the target model and candidate samples. The most common similarity measure between the target model and an image sample is the sum of squared differences (SSD); although simple, it is often not robust, and various improvements, such as mean-shift and other optimization algorithms, have been proposed to find the best candidate sample. These improvements greatly increase robustness, but the iterative search consumes a large amount of computing resources and real-time operation cannot be guaranteed. In addition, deterministic methods model the target in several ways, e.g., point-based models, contour-based models and kernel-density models. A point-based model may, for instance, describe the target with scale-invariant feature points (SIFT), match these scale-invariant key points between the original target and the target region in the current frame, remove mismatched feature points with RANSAC, and finally map the original target region to the candidate region in the current frame by an affine or perspective transform. Contour-based models usually require offline modeling of the target contour, training the contour features on a large number of samples, and then adaptively approaching the moving target in the online tracking and detection phase to obtain the target position. The biggest problem of deterministic methods is their heavy computational load, which hinders real-time application. Moreover, deterministic methods depend strongly on the choice of features when building the target model; to be robust to occlusion, scale variation, illumination change and similar factors, the selected features must be designed under strict requirements, which inevitably increases the computation considerably.
Statistical tracking methods have received more and more attention in recent years. Although deterministic methods continue to produce new results, they have never fundamentally solved the real-time problem. Statistical methods describe the dynamic evolution of the tracked target jointly in a state space and a measurement space, and the state estimate is obtained by finding the peak of the posterior probability of the state under the corresponding measurement conditions. Under a linear Gaussian model, Kalman filtering continuously updates the mean and variance of the corresponding distribution and yields the optimal estimate. For tracking problems under nonlinear or non-Gaussian conditions, the posterior distribution of the state cannot be obtained in closed form, and many algorithms, such as the particle filter and the extended Kalman filter (EKF), have been proposed to approximate it. The particle filter is the most typical statistical scheme: sampled particles are repeatedly propagated and predicted, the features of the samples are measured to update the sample weights, and the samples are used to approximate the posterior probability density over the state space. Because the particle filter propagates and updates particles recursively, its computational load in the prediction and update stages is low. Over the years many particle-filter-based trackers have appeared; they usually observe samples with contour or color features, e.g., particle filters based on color histograms, on scale-invariant features, or on cascaded features. Although a color-histogram observation model is robust to noise, the reliability of the system drops sharply when illumination changes are strong or the background interferes heavily with the target. In addition, the cost of computing a color histogram grows with the target size, and the same problem affects methods based on contours and scale-invariant SIFT feature points. This limits the number of particles that can be sampled, and when the number of particles is small, the accuracy of the approximation to the posterior probability density inevitably decreases.
Whether deterministic or statistical, the choice of the target appearance model is a crucial part of the tracking task and directly affects the real-time performance, robustness and adaptability of the tracking algorithm. In recent years researchers at home and abroad have studied appearance models extensively. In general they fall into two classes: generative models and discriminative models. A generative model first learns the appearance features of the target, then uses the learned appearance features to search the associated image regions and locates the target in subsequent frames according to a minimum-error criterion. To cope with the many interfering factors during tracking, building a robust and efficient appearance model is very difficult and very demanding, and it likewise raises the computational complexity significantly. Typical examples include appearance models based on sparse representation, appearance models based on orthogonal matching pursuit, and incremental learning algorithms. The main problems of these existing generative appearance models are that learning the appearance features requires many training samples, and that, to reduce the complexity of online learning, the target appearance is assumed to remain unchanged throughout tracking. Moreover, generative models cannot make full use of the background information near the target, although this background information often improves the tracking. Discriminative models treat tracking as a binary classification problem whose main idea is to separate the target from the background. Typical discriminative trackers include tracking with a support vector machine classifier, online boosting tracking, semi-supervised online boosting tracking, multiple-instance learning (MIL) tracking, and compressive tracking (CT).
The compressive tracking algorithm has attracted wide attention because of its high real-time performance and reliability, but it has several problems that limit its practicality. First, compressive tracking detects and recognizes samples with a tracking frame of fixed size; the frame does not change with the target scale, so the algorithm cannot adapt to the effect of multi-scale target changes on the tracking. Many practical applications require multi-scale tracking. As a discriminative method, compressive tracking continuously classifies target and background during tracking, and the classifier parameters must be updated continuously; because the tracking scale is fixed, the algorithm may still lock onto the target region under certain conditions, but when the target scale changes it actually treats a combination of target and background (when the target becomes smaller than the initial scale) or only a part of the target (when it becomes larger than the initial scale) as the new target. If the target scale changes abruptly, the classifier has no time to learn the changed target features and the probability of losing the target increases greatly. Second, current discriminative trackers exploit the temporal correlation of the target position when collecting samples and sample within a fixed radius, ignoring the velocity and acceleration of the target motion, so they adapt poorly to fast target motion. Third, in current discriminative trackers the classifier learning rate is fixed; when the target is occluded for a long time, the classifier inevitably mistakes the occluder for the target, causing the target to be lost.
Summary of the invention
To overcome the deficiencies of existing video tracking methods, the present invention proposes a real-time multi-scale single-target tracking method based on compressive sensing; the method runs in real time, adapts to changes in target scale, and produces highly robust tracking results.
The real-time multi-scale target tracking method of the present invention is realized by the following technical scheme, which comprises a system initialization phase and a real-time video tracking phase, with the following concrete steps:
System initialization phase:
1. Read the target initial position parameter R_state = [x, y, w, h], where (x, y) is the coordinate of the upper-left corner of the target's initial rectangle and w and h are its width and height;
2. Read the first frame F_0 of the video sequence and convert it to a gray-level image, denoted I_0;
3. Compute the integral image of the first frame;
4. Build the initial random measurement matrix R_0;
5. Taking the center of the initial target position as reference, collect positive and negative samples with the same width and height as the initial target;
6. Extract the scale-invariant compressed feature vectors of all collected positive and negative samples and update the Naive Bayes classifier parameters;
7. Compute the scale-invariant compressed feature vector v_0 of the sample inside the initial target rectangle and initialize the particle distribution.
Real-time video tracking phase:
1. Read the t-th frame and convert it to a gray-level image, denoted I_t;
2. Compute the integral image of the current frame;
3. Estimate and predict the particle states with a second-order model;
4. Compute the scale-invariant compressed feature vectors of all particles;
5. Classify all particles with the Naive Bayes classifier and obtain the classifier response of every particle;
6. Take the particle with the maximum classifier response as the target position and its scale as the current estimate of the target size, then compute the particle weights from the classifier responses;
7. Resample all particles according to their weights; after resampling, the number of samples around high-weight particles increases and particles with too low a weight are discarded, which avoids the degeneration caused by low-weight particle samples;
8. Taking the currently determined target center as reference, collect positive and negative samples with the same width and height as the determined target frame;
9. Extract the scale-invariant compressed feature vectors of the collected positive and negative samples and update the classifier parameters;
10. If the video has not ended, return to step 1 and read the next frame.
Let z ∈ R^{w×h} denote a sample image of width w and height h. The present invention uses a set of normalized rectangle features of the sample image as the original high-dimensional representation of the sample; these high-dimensional features are obtained by convolving the sample image z with a bank of normalized rectangular filters {h_{1,1}, …, h_{w,h}}, where h_{i,j} is a rectangle filter of width i and height j normalized by its area (h_{i,j}(x, y) = 1/(ij) for 1 ≤ x ≤ i, 1 ≤ y ≤ j, and 0 elsewhere), i and j being the width and height of the filter. The filtered images produced by all rectangular filters are reshaped into column vectors, and concatenating these column vectors yields the original high-dimensional feature vector that represents the sample. For a sample image of width w and height h, the dimension of this original normalized rectangle feature vector is approximately (wh)^2.
Obviously, computing the original high-dimensional feature vectors of all sample images would consume a large amount of computing resources and cannot meet real-time requirements; the compressive sensing principle is therefore used below to obtain a compressed low-dimensional feature vector for each sample image.
Let x ∈ R^n denote the original high-dimensional feature vector of a sample as defined above, and let R ∈ R^{m×n} (m << n) denote a random measurement matrix that maps high-dimensional features to low-dimensional ones. The compressed low-dimensional feature vector is then v = Rx, where v ∈ R^m. The present invention uses a sparse random measurement matrix of the following form to compress the original high-dimensional features:
r_ij = w_i × { +1 with probability 1/(2s); 0 with probability 1 − 1/s; −1 with probability 1/(2s) }
where s = n/4 and the row weight w_i is the reciprocal of the square root of the number of non-zero entries in row i. It can be seen that each row of the matrix contains at most four non-zero entries.
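For illustration only (not part of the patent text), the following Python sketch draws a matrix of this form and applies the projection v = Rx. The i.i.d. sampling gives about four non-zero entries per row on average rather than the hard bound used in the construction below, and the toy dimensions and function name are assumptions of the example.

```python
import numpy as np

def sparse_measurement_matrix(m, n, rng):
    """Draw an m-by-n sparse random measurement matrix: each entry is non-zero
    with probability 1/s (s = n/4), signs are +/-1 with equal probability, and
    row i is weighted by w_i = 1/sqrt(number of non-zero entries in row i)."""
    s = n / 4.0
    signs = rng.choice([1.0, -1.0], size=(m, n))       # equiprobable signs
    mask = rng.random((m, n)) < 1.0 / s                 # about 4 non-zeros per row
    R = signs * mask
    nz = np.maximum(mask.sum(axis=1), 1)                # guard against empty rows
    return R / np.sqrt(nz)[:, None]                     # apply the row weights w_i

rng = np.random.default_rng(0)
n, m = 10_000, 150                  # toy high- and low-dimensional sizes
R = sparse_measurement_matrix(m, n, rng)
x = rng.random(n)                   # stand-in for the high-dimensional feature vector
v = R @ x                           # compressed feature vector, v = R x
print(v.shape)                      # -> (150,)
```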
The concrete steps for building the initial random measurement matrix are as follows:
(1) Let w and h be the width and height of the target's initial rectangle and let m be the number of low-dimensional features. Set the upper and lower bounds Num_max and Num_min of the number of non-zero entries per row of the initial random measurement matrix; following the sparse matrix above, Num_max and Num_min are set to 4 and 2, respectively.
(2) Build each row of the initial random measurement matrix in a loop:
a. Use a uniform random number generator to draw a random integer in the interval [Num_min, Num_max] as the number nz_i of non-zero entries in row i of the initial random measurement matrix;
b. Randomly generate nz_i rectangular regions such that px(i,t) ~ U(1, w−3), py(i,t) ~ U(1, h−3), pw(i,t) ~ U(1, w−px−2), ph(i,t) ~ U(1, h−py−2), where px(i,t), py(i,t), pw(i,t), ph(i,t) are the upper-left x coordinate, upper-left y coordinate, width and height of the t-th generated rectangle, and U(a, b) denotes the uniform distribution over the integers in the interval [a, b], used to draw random integers in that interval; then generate the measurement matrix value associated with each rectangle, p_value(i,t) = w_i · sign_t, where w_i = 1/sqrt(nz_i) is the weight of the current row of the random measurement matrix and sign_t, the sign of the t-th non-zero entry, is drawn at random with equal probability by the random number generator.
Finally, obtain and store the values of all non-zero entries of the initial random measurement matrix, the upper-left x coordinate, upper-left y coordinate, width and height of the rectangle associated with each non-zero entry, and the set {nz_i | i = 1, 2, …, m} of the number of non-zero entries per row.
The concrete steps for computing the scale-invariant compressed feature vector of a sample are as follows:
Let w and h be the width and height of the target's initial rectangle, and define the scale of the initial target size as 1. Let iH be the integral image of the current video frame, let w_s and h_s be the width and height of the sampled image, let (x, y) be the coordinates of its upper-left corner, and let s = w_s / w = h_s / h be the scale of the sampled image.
(1) According to the current sample image scale s, adapt the non-zero entries of the stored initial random measurement matrix R_0 to obtain the random measurement matrix R_s at scale s, as follows:
Keep the values of all non-zero entries of the initial measurement matrix unchanged, and scale the rectangle parameters px(i,t), py(i,t), pw(i,t), ph(i,t) of each non-zero entry by a factor of s, rounding to the nearest integer:
px(i,t) ← round(s · px(i,t)),  py(i,t) ← round(s · py(i,t)),  pw(i,t) ← round(s · pw(i,t)),  ph(i,t) ← round(s · ph(i,t))
where i = 1, …, m, t = 1, …, nz_i, and m is the dimension of the compressed feature vector;
(2) The i-th compressed feature value is computed as
v_i = Σ_{t=1}^{nz_i} p_value(i,t) · P_sum(i,t) / ( pw(i,t) · ph(i,t) )
where P_sum(i,t) is the sum of the pixel values inside the rectangle associated with the t-th non-zero entry of row i of the adapted random measurement matrix, which can be computed with the integral image as
P_sum(i,t) = iH(maxI, maxJ) − iH(maxI, minJ) − iH(minI, maxJ) + iH(minI, minJ)
where (minI, minJ) and (maxI, maxJ) are the integral-image coordinates of the upper-left and lower-right corners of that rectangle, obtained by offsetting the scaled rectangle parameters by the sample position (x, y), and iH(u, v) denotes the value of the integral image at point (u, v);
(3) The scale-invariant compressed feature vector of the sampled image is finally v = {v_i | i = 1, 2, …, m}.
The beneficial effects of the invention are as follows: describing the target model with normalized rectangle features as the original high-dimensional features makes the representation highly adaptive to changes in target scale, so the target position and target scale changes are tracked accurately and the tracking accuracy improves; using the compressive sensing principle, only a small number of the original high-dimensional features, selected at random, are needed to model the target effectively, and the normalized rectangle features can be computed quickly with the integral image, so the method has low computational complexity and can track the target in real time.
Brief description of the drawings
To illustrate the embodiments and the technical scheme of the present invention more clearly, the drawings required in the description of the technical scheme are briefly introduced below.
Fig. 1 is the flow chart of the real-time multi-scale target tracking method based on compressive sensing;
Fig. 2 is a schematic diagram of the initial random measurement matrix;
Fig. 3 is a schematic diagram of the extraction of the scale-invariant compressed feature vector of a sample according to the present invention;
Fig. 4 is a schematic diagram of the positive- and negative-sample collection regions;
Fig. 5 is a schematic diagram of the particle state estimation and prediction;
In the figures: 1, video image region; 2, target region; 3, collection region of positive-sample center points; 4, collection region of negative-sample center points.
Detailed description of the embodiments
The specific embodiments of the invention are described in detail below with reference to the accompanying drawings; first the basic procedure of the real-time multi-scale single-target tracking method based on compressive sensing is described. With reference to Fig. 1, the concrete steps are as follows; the whole process is divided into a system initialization phase and a real-time video tracking phase:
System initialization phase:
1. Read the target initial position parameter R_state = [x, y, w, h], where (x, y) is the coordinate of the upper-left corner of the target's initial rectangle and w and h are its width and height;
2. Read the first frame F_0 = {F_R, F_G, F_B} of the video sequence and convert it to a gray-level image, denoted I_0;
The color image is converted to a gray-level image by
I_0(x, y) = 0.299 F_R(x, y) + 0.587 F_G(x, y) + 0.114 F_B(x, y)
where I_0(x, y) is the gray value of the gray-level image I_0 at point (x, y); gray values lie in [0, 255], with 0 representing black and 255 representing white, and F_R, F_G, F_B are the R, G and B components of the original image.
3. Compute the integral image of the first frame; the integral image is given by
iH(x, y) = Σ_{u=1}^{x} Σ_{v=1}^{y} I_0(u, v)
4. Build the initial random measurement matrix R_0;
5. Taking the center of the initial target position as reference, collect positive and negative samples with the same width and height as the initial target.
With reference to Fig. 4, the collection radius of the positive-sample centers is set to the interval (0, inrad) and that of the negative-sample centers to the interval (inrad + 4, outrad); in this embodiment inrad and outrad are set to 4 and 30, respectively, and remain constant. Because the target scale changes, the positive and negative sampling radii are made proportional to the scale, inrad = 4s and outrad = 30s, where s is the scale at which the positive and negative samples currently need to be collected. Note that the choice of inrad and outrad depends on the video resolution and the target size; the values set in this embodiment give fairly good tracking for targets of different sizes at common video resolutions. Within the sampling intervals above, 45 positive samples and 50 negative samples are collected by uniform random sampling (this sampling and the classifier update of step 6 are sketched in code after this list);
6. Extract the scale-invariant compressed feature vectors {v(n) | n = 1, 2, …, n_pos} of all collected positive samples and {w(t) | t = 1, 2, …, n_neg} of all collected negative samples, and update the classifier parameters with the following formulas:
μ_i^1 ← λ μ_i^1 + (1 − λ) μ^1
σ_i^1 ← sqrt( λ (σ_i^1)^2 + (1 − λ) (σ^1)^2 + λ (1 − λ) (μ_i^1 − μ^1)^2 )
μ_i^0 ← λ μ_i^0 + (1 − λ) μ^0
σ_i^0 ← sqrt( λ (σ_i^0)^2 + (1 − λ) (σ^0)^2 + λ (1 − λ) (μ_i^0 − μ^0)^2 )
where λ is the classifier learning-rate parameter; the smaller λ is, the faster the classifier parameters are updated. In this embodiment λ = 0.9, n_pos = 45, n_neg = 50, and i = 1, 2, …, m, where m is the dimension of the scale-invariant compressed feature vector. μ_i^1, σ_i^1, μ_i^0, σ_i^0 are the Naive Bayes classifier parameters associated with the i-th dimension of the compressed feature vector, and
μ^1 = (1/n_pos) Σ_{k=1}^{n_pos} v_i(k),  σ^1 = sqrt( (1/n_pos) Σ_{k=1}^{n_pos} (v_i(k) − μ^1)^2 ),
μ^0 = (1/n_neg) Σ_{k=1}^{n_neg} w_i(k),  σ^0 = sqrt( (1/n_neg) Σ_{k=1}^{n_neg} (w_i(k) − μ^0)^2 ),
where v_i(k) and w_i(k) are the i-th feature values of the compressed feature vectors of the k-th positive and k-th negative sample, respectively. Under the initial conditions all parameters μ_i^1 and μ_i^0 are 0 and all parameters σ_i^1 and σ_i^0 are 1;
7. Compute the scale-invariant compressed feature vector v_0 of the sample inside the initial target rectangle and initialize the particle distribution:
In this embodiment the number of particles pn is set to 200. Each particle carries the following parameters: the initial state (x_0, y_0, s_0), the state (x_t, y_t, s_t) at the current time t, the state (x_{t−1}, y_{t−1}, s_{t−1}) at time t−1 and the state (x_{t−2}, y_{t−2}, s_{t−2}) at time t−2, where x_t, y_t are the coordinates of the center of the particle's sample region at time t and s_t is the scale of that region at time t; each particle also carries a parameter v, the scale-invariant compressed feature vector of the particle region at the current time, and a parameter w, the particle weight;
During particle initialization, the initial state, the current state, the t−1 state and the t−2 state of every particle are all set to (x + floor(w/2), y + floor(h/2), 1), every particle weight is initialized to 0, and the scale-invariant compressed feature vector of every particle is initialized to v_0.
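The initialization computations above (gray-level conversion, integral image, ring-shaped positive/negative sampling around the target center, and the Naive Bayes parameter update) can be sketched in Python as follows; the function names, the stand-in random features and the use of the previous μ in the σ update are assumptions made for this example, not text from the patent.

```python
import numpy as np

def to_gray(frame_rgb):
    """I = 0.299 R + 0.587 G + 0.114 B, as in the embodiment."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def integral_image(gray):
    """iH(x, y) = sum of I(u, v) over u <= x, v <= y."""
    return gray.cumsum(axis=0).cumsum(axis=1)

def sample_centers(cx, cy, r_min, r_max, count, rng):
    """Uniformly draw 'count' center points whose distance from (cx, cy)
    lies in [r_min, r_max); r_min = 0 gives the positive-sample region."""
    pts = []
    while len(pts) < count:
        dx, dy = rng.uniform(-r_max, r_max, size=2)
        if r_min <= np.hypot(dx, dy) < r_max:
            pts.append((cx + dx, cy + dy))
    return np.array(pts)

def update_gaussian_params(mu_i, sigma_i, feats, lam=0.9):
    """mu_i <- lam*mu_i + (1-lam)*mu,
    sigma_i <- sqrt(lam*sigma_i^2 + (1-lam)*sigma^2 + lam*(1-lam)*(mu_i - mu)^2),
    where mu, sigma are the per-dimension mean/std of the new samples."""
    mu, sigma = feats.mean(axis=0), feats.std(axis=0)
    sigma_i = np.sqrt(lam * sigma_i**2 + (1 - lam) * sigma**2
                      + lam * (1 - lam) * (mu_i - mu)**2)
    mu_i = lam * mu_i + (1 - lam) * mu
    return mu_i, sigma_i

# --- toy usage with stand-in data ---
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(240, 320, 3)).astype(float)
iH = integral_image(to_gray(frame))

s = 1.0                                                            # current target scale
pos_centers = sample_centers(160, 120, 0.0, 4 * s, 45, rng)        # 45 positive samples
neg_centers = sample_centers(160, 120, 4 * s + 4, 30 * s, 50, rng) # 50 negative samples

m = 150
mu1, sig1 = np.zeros(m), np.ones(m)                   # initial classifier parameters
pos_feats = rng.normal(size=(45, m))                  # stand-in compressed features
mu1, sig1 = update_gaussian_params(mu1, sig1, pos_feats, lam=0.9)
```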
Real-time video tracking phase:
1. Read the t-th frame and convert it to a gray-level image, denoted I_t;
2. Compute the integral image of the current frame;
3. Estimate and predict the particle states with a second-order model;
For each particle, the state is estimated from its states at times t−1 and t−2 with the second-order model (steps 3 and 5–7 are sketched in code after this list):
x_t(i) = 2 x_{t−1}(i) − x_{t−2}(i) + wx_t
y_t(i) = 2 y_{t−1}(i) − y_{t−2}(i) + wy_t
s_t(i) = 2 s_{t−1}(i) − s_{t−2}(i) + ws_t
where i = 1, 2, …, pn, and wx_t, wy_t, ws_t are zero-mean Gaussian noise terms on the three state components with standard deviations std_x, std_y, std_s; in this embodiment std_x = 5, std_y = 2.5 and std_s = 0.06. To prevent particle samples from leaving the image, out-of-range estimates are clipped to the image; the estimated and clipped state (x_t(i), y_t(i), s_t(i)) then replaces the particle's state at time t, and the particle's previous states at times t and t−1 replace its states at times t−1 and t−2, respectively. With reference to Fig. 5, predicting the particle states with the second-order model takes the actual velocity of the target motion into account: Fig. 5 shows intuitively that the new particle positions computed with the model tend toward the direction of target motion instead of being confined near the previous sample point; likewise the estimate of the particle scale uses the particle scales of the two preceding time steps and extrapolates the new particle scale from their trend;
4. Compute the scale-invariant compressed feature vectors of all particles;
5. Classify all particles with the Naive Bayes classifier and obtain the classifier responses {H(v_t) | t = 1, 2, …, pn}:
H(v_t) = log( Π_{i=1}^{m} p(v_t^i | y = 1) p(y = 1) / Π_{i=1}^{m} p(v_t^i | y = 0) p(y = 0) ) = Σ_{i=1}^{m} log( p(v_t^i | y = 1) / p(v_t^i | y = 0) )
where m is the dimension of the low-dimensional compressed feature vector, v_t^i is the i-th feature value of the compressed feature vector v_t of the t-th particle, y = 1 indicates that a sample belongs to the target, y = 0 indicates that it belongs to the background, and p(y = 1) = p(y = 0) is assumed. Because low-dimensional random projections of high-dimensional random vectors are almost always Gaussian distributed, every dimension of the low-dimensional compressed feature vector is assumed to follow a normal distribution, p(v_t^i | y = 1) ~ N(μ_i^1, σ_i^1) and p(v_t^i | y = 0) ~ N(μ_i^0, σ_i^0), where the classifier parameters μ_i^1, σ_i^1, μ_i^0, σ_i^0 are obtained from the positive and negative samples collected at each time step;
6. Take the particle with the maximum classifier response as the target position and its scale as the current estimate of the target size, then compute the particle weights from the classifier responses:
w_i = p(z_t | x_t*(i)) / Σ_{j=1}^{pn} p(z_t | x_t*(j))
where p(z_t | x_t*(i)) is the probability of observing the target z_t at time t given that the i-th particle is in state x_t*(i), and
p(z_t | x_t*(i)) ∝ exp(H(v_i)) = Π_{j=1}^{m} p(v_i^j | y = 1) / p(v_i^j | y = 0);
7. Resample all particles so that P{newp = p(i)} = w_i, i.e. so that the probability that a particle newp in the resampled set is a copy of the i-th original particle p(i) equals its weight w_i. Let {p(i) | i = 1, 2, …, pn} be the particle set before resampling; the concrete procedure is as follows:
(1) Sort all pn particles by weight in descending order, obtaining the new particle set {p'(i) | i = 1, 2, …, pn} with corresponding sorted weights w'_i;
(2) Repeat the following steps until all pn resampled particles have been obtained: a. read the weight w'_i of the i-th sorted particle; b. compute the number n_i of resampled particles to be derived from the i-th sorted particle, in proportion to its weight; c. copy the particle to obtain the n_i new resampled particles.
After resampling, the number of samples around high-weight particles increases and particles with too low a weight are discarded, which avoids the degeneration caused by low-weight particle samples.
8. Taking the currently determined target center as reference, collect positive and negative samples with the same width and height as the determined target frame;
9. Extract the scale-invariant compressed feature vectors of the collected positive and negative samples and update the classifier parameters;
10. If the video has not ended, return to step 1 and read the next frame.
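The per-frame steps above (second-order prediction, classifier response, weight computation and resampling) can be put together in a short Python sketch; the particle array layout, the stand-in features and the use of simple multinomial resampling in place of the sort-and-copy scheme of step 7 are assumptions made for this example.

```python
import numpy as np

def predict_particles(states_t1, states_t2, rng, stds=(5.0, 2.5, 0.06)):
    """Second-order model: state_t = 2*state_{t-1} - state_{t-2} + noise,
    applied to the (x, y, s) components of every particle."""
    noise = rng.normal(0.0, stds, size=states_t1.shape)
    return 2.0 * states_t1 - states_t2 + noise

def classifier_response(V, mu1, sig1, mu0, sig0, eps=1e-12):
    """H(v) = sum_i log( N(v_i; mu1_i, sig1_i) / N(v_i; mu0_i, sig0_i) )."""
    def log_gauss(v, mu, sig):
        return -0.5 * ((v - mu) / (sig + eps)) ** 2 - np.log(sig + eps)
    return (log_gauss(V, mu1, sig1) - log_gauss(V, mu0, sig0)).sum(axis=1)

def resample(states, weights, rng):
    """Draw pn particles with replacement, P(pick particle i) = w_i."""
    idx = rng.choice(len(weights), size=len(weights), p=weights)
    return states[idx]

# --- one toy tracking iteration with stand-in data ---
rng = np.random.default_rng(0)
pn, m = 200, 150
states_t1 = rng.normal([160.0, 120.0, 1.0], [5.0, 5.0, 0.05], size=(pn, 3))  # t-1
states_t2 = states_t1.copy()                                                 # t-2
mu1, sig1 = np.zeros(m), np.ones(m)
mu0, sig0 = np.zeros(m), np.ones(m)

states_t = predict_particles(states_t1, states_t2, rng)        # step 3
V = rng.normal(size=(pn, m))        # stand-in for the particles' compressed features
H = classifier_response(V, mu1, sig1, mu0, sig0)               # step 5
best_state = states_t[int(np.argmax(H))]                       # step 6: target state
weights = np.exp(H - H.max())
weights /= weights.sum()            # w_i proportional to exp(H(v_i)), normalized
states_t = resample(states_t, weights, rng)                    # step 7
print(best_state)                   # estimated target (x, y, scale)
```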
With reference to Fig. 2, the concrete steps for building the initial random measurement matrix are as follows:
(1) Let w and h be the width and height of the target's initial rectangle and let m be the number of low-dimensional features (m = 150 in this embodiment). Set the upper and lower bounds Num_max and Num_min of the number of non-zero entries per row of the initial random measurement matrix (Num_max = 4 and Num_min = 2 in this embodiment).
(2) Build each row of the initial random measurement matrix in a loop:
a. Use a uniform random number generator to draw a random integer in the interval [Num_min, Num_max] as the number nz_i of non-zero entries in the current row i of the matrix;
b. Randomly generate nz_i rectangular regions such that px(i,t) ~ U(1, w−3), py(i,t) ~ U(1, h−3), pw(i,t) ~ U(1, w−px−2), ph(i,t) ~ U(1, h−py−2), where px(i,t), py(i,t), pw(i,t), ph(i,t) are the upper-left x coordinate, upper-left y coordinate, width and height of the t-th generated rectangle and U(a, b) denotes the uniform distribution over the integers in [a, b]; then generate the measurement matrix value associated with each rectangle, p_value(i,t) = w_i · sign_t, where w_i = 1/sqrt(nz_i) is the weight of the current row of the random measurement matrix and sign_t, the sign of the t-th non-zero entry, is drawn at random with equal probability.
Finally, obtain and store the values of all non-zero entries of the initial random measurement matrix, the upper-left x coordinate, upper-left y coordinate, width and height of the rectangle associated with each non-zero entry, and the set {nz_i | i = 1, 2, …, m} of the number of non-zero entries per row.
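A minimal Python sketch of this construction with the stated parameters (m = 150, between 2 and 4 non-zero entries per row); the list-of-dictionaries layout used to store the rectangles and values is an assumption of this example.

```python
import numpy as np

def build_initial_matrix(w, h, m=150, num_min=2, num_max=4, rng=None):
    """For each of the m rows, draw nz_i rectangles inside the w-by-h target box
    and one signed value per rectangle, p_value = w_i * sign, w_i = 1/sqrt(nz_i)."""
    if rng is None:
        rng = np.random.default_rng()
    rows = []
    for _ in range(m):
        nz = int(rng.integers(num_min, num_max + 1))   # non-zero entries in this row
        w_i = 1.0 / np.sqrt(nz)                        # row weight
        rects, values = [], []
        for _ in range(nz):
            px = int(rng.integers(1, w - 2))           # upper-left x in [1, w-3]
            py = int(rng.integers(1, h - 2))           # upper-left y in [1, h-3]
            pw = int(rng.integers(1, w - px - 1))      # width  in [1, w-px-2]
            ph = int(rng.integers(1, h - py - 1))      # height in [1, h-py-2]
            rects.append((px, py, pw, ph))
            values.append(w_i * rng.choice([1.0, -1.0]))   # p_value(i, t)
        rows.append({"rects": rects, "values": values})
    return rows

R0 = build_initial_matrix(w=32, h=64)
print(len(R0), len(R0[0]["rects"]))    # 150 rows, 2-4 rectangles in the first row
```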
The concrete steps for computing the scale-invariant compressed feature vector of a sampled image are described below with reference to Fig. 3:
Let w and h be the width and height of the target's initial rectangle and define the scale of the initial target size as 1. Let iH be the integral image of the current video frame, let w_s and h_s be the width and height of the sampled image, let (x, y) be the coordinates of its upper-left corner, and let s = w_s / w = h_s / h be the scale of the sampled image. To simplify the description, this embodiment assumes that the target scale changes identically in the width and height directions, so separate scale parameters for width and height are not set; an embodiment in which the width and height scales vary independently can be obtained without inventive effort.
(1) According to the current sample image scale s, adapt the non-zero entries of the stored initial random measurement matrix R_0 to obtain the random measurement matrix R_s at scale s:
Keep the values of all non-zero entries of the initial measurement matrix unchanged, and scale the rectangle parameters px(i,t), py(i,t), pw(i,t), ph(i,t) of each non-zero entry by a factor of s, rounding to the nearest integer:
px(i,t) ← round(s · px(i,t)),  py(i,t) ← round(s · py(i,t)),  pw(i,t) ← round(s · pw(i,t)),  ph(i,t) ← round(s · ph(i,t))
where i = 1, …, m, t = 1, …, nz_i, and m is the dimension of the compressed feature vector;
(2) The i-th compressed feature value is computed as
v_i = Σ_{t=1}^{nz_i} p_value(i,t) · P_sum(i,t) / ( pw(i,t) · ph(i,t) )
where P_sum(i,t) is the sum of the pixel values inside the rectangle associated with the t-th non-zero entry of row i of the adapted random measurement matrix, which can be computed with the integral image as
P_sum(i,t) = iH(maxI, maxJ) − iH(maxI, minJ) − iH(minI, maxJ) + iH(minI, minJ)
where (minI, minJ) and (maxI, maxJ) are the integral-image coordinates of the upper-left and lower-right corners of that rectangle, obtained by offsetting the scaled rectangle parameters by the sample position (x, y).
With reference to Fig. 3, the term P_sum(i,t) / (pw(i,t) · ph(i,t)) in the formula above is the normalized rectangle feature associated with the t-th non-zero entry of row i of the adapted measurement matrix R_s, and p_value(i,t) is the value of that non-zero entry in the measurement matrix. In Fig. 3, r_ij and x_j denote the elements of the random measurement matrix R_s and the feature values of the original high-dimensional normalized rectangle feature vector, respectively; because R_s is a very sparse matrix, only its non-zero entries take part in the computation above, and there is no need to compute all the feature values of the high-dimensional feature vector.
Finally, the scale-invariant compressed feature vector of the sampled image is v = {v_i | i = 1, 2, …, m}.
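A minimal Python sketch of this feature computation, reusing the row/rectangle layout of the matrix sketch above; the 0-based coordinate convention, the max(1, ·) guard on the scaled rectangle sizes and the division of each box sum by the rectangle area (the "normalized rectangle feature" reading of the formula) are assumptions of this example.

```python
import numpy as np

def box_sum(iH_pad, x0, y0, bw, bh):
    """Sum of the bw-by-bh pixel box whose top-left pixel is (x0, y0), using an
    integral image padded with a leading row and column of zeros."""
    return (iH_pad[x0 + bw, y0 + bh] - iH_pad[x0, y0 + bh]
            - iH_pad[x0 + bw, y0] + iH_pad[x0, y0])

def compressed_features(iH_pad, R0, x, y, scale):
    """Scale-invariant compressed feature vector of the sample whose top-left
    corner is (x, y) and whose scale relative to the initial target is 'scale'."""
    v = np.zeros(len(R0))
    for i, row in enumerate(R0):
        for (px, py, pw, ph), val in zip(row["rects"], row["values"]):
            # adapt the stored rectangle to the current sample scale
            spx, spy = round(scale * px), round(scale * py)
            spw, sph = max(1, round(scale * pw)), max(1, round(scale * ph))
            s = box_sum(iH_pad, x + spx, y + spy, spw, sph)
            v[i] += val * s / (spw * sph)      # area-normalized rectangle feature
    return v

# --- toy usage: the image array is indexed as gray[x, y] to mirror iH(x, y) ---
rng = np.random.default_rng(0)
gray = rng.random((240, 320))
iH_pad = np.pad(gray.cumsum(axis=0).cumsum(axis=1), ((1, 0), (1, 0)))
R0 = [{"rects": [(2, 3, 8, 6)], "values": [1.0]}]      # one-row toy matrix
v0 = compressed_features(iH_pad, R0, x=10, y=20, scale=1.5)
print(v0)                                              # one compressed feature value
```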

Claims (5)

1. A real-time multi-scale target tracking method based on compressive sensing, characterized in that it comprises the following steps:
System initialization phase:
(1) read the target initial position parameter R_state = [x, y, w, h], where (x, y) is the coordinate of the upper-left corner of the target's initial rectangle and w and h are its width and height;
(2) read the first frame F_0 of the video sequence and convert it to a gray-level image, denoted I_0;
(3) compute the integral image of the first frame;
(4) build the initial random measurement matrix R_0;
(5) taking the center of the initial target position as reference, collect positive and negative samples with the same width and height as the initial target;
(6) extract the scale-invariant compressed feature vectors of all collected positive and negative samples and update the Naive Bayes classifier parameters;
(7) compute the scale-invariant compressed feature vector v_0 of the sample inside the initial target rectangle and initialize the particle distribution;
Real-time video tracking phase:
(1) read the t-th frame and convert it to a gray-level image, denoted I_t;
(2) compute the integral image of the current frame;
(3) estimate and predict the particle states with a second-order model;
(4) compute the scale-invariant compressed feature vectors of all particles;
(5) classify all particles with the Naive Bayes classifier and obtain the classifier response of every particle;
(6) take the particle with the maximum classifier response as the target position and its scale as the current estimate of the target size, then compute the particle weights from the classifier responses;
(7) resample all particles according to their weights, so that after resampling the number of samples around high-weight particles increases and particles with too low a weight are discarded, avoiding the degeneration caused by low-weight particle samples;
(8) taking the currently determined target center as reference, collect positive and negative samples with the same width and height as the determined target frame;
(9) extract the scale-invariant compressed feature vectors of the collected positive and negative samples and update the classifier parameters;
(10) if the video has not ended, return to step (1) and read the next frame.
2. The real-time multi-scale target tracking method based on compressive sensing according to claim 1, characterized in that the extraction of the scale-invariant compressed feature vector of a sample image comprises the following steps:
Let w and h be the width and height of the target's initial rectangle and define the scale of the initial target size as 1; let w_s and h_s be the width and height of the sample image, let (x, y) be the coordinates of its upper-left corner, and let s = w_s / w = h_s / h be the scale of the sample image; the sample image is collected from a frame of the video sequence and iH is the integral image of that frame;
(1) according to the sample image scale s, adjust the non-zero entries of the initial random measurement matrix R_0 to obtain the random measurement matrix R_s at scale s:
keep the values of all non-zero entries of R_0 unchanged, and scale the rectangle parameters px(i,t), py(i,t), pw(i,t), ph(i,t) of each non-zero entry by a factor of s, rounding to the nearest integer:
px(i,t) ← round(s · px(i,t)),  py(i,t) ← round(s · py(i,t)),  pw(i,t) ← round(s · pw(i,t)),  ph(i,t) ← round(s · ph(i,t))
where i = 1, …, m, t = 1, …, nz_i, m is the dimension of the compressed feature vector and nz_i is the number of non-zero entries in row i of the initial random measurement matrix;
(2) compute the i-th feature value of the scale-invariant compressed feature vector as
v_i = Σ_{t=1}^{nz_i} p_value(i,t) · P_sum(i,t) / ( pw(i,t) · ph(i,t) )
where P_sum(i,t) is the sum of the pixel values inside the rectangle associated with the t-th non-zero entry of row i of R_s, which can be computed with the integral image as
P_sum(i,t) = iH(maxI, maxJ) − iH(maxI, minJ) − iH(minI, maxJ) + iH(minI, minJ)
where (minI, minJ) and (maxI, maxJ) are the integral-image coordinates of the upper-left and lower-right corners of that rectangle, and P_sum(i,t) / (pw(i,t) · ph(i,t)) is the feature value, in the original high-dimensional normalized rectangle feature vector, that corresponds to the t-th non-zero entry of row i of R_s;
(3) the scale-invariant compressed feature vector of the sample image is v = {v_i | i = 1, 2, …, m}.
3. The real-time multi-scale target tracking method based on compressive sensing according to claim 2, characterized in that the original high-dimensional normalized rectangle feature vector of the sample image has the following properties:
Let z ∈ R^{w×h} denote a sample image of width w and height h; the original high-dimensional normalized rectangle feature vector can be described as the convolution of the sample image z with a bank of normalized rectangular filters {h_{1,1}, …, h_{w,h}}, where the normalized rectangular filter is
h_{i,j}(x, y) = 1/(ij) for 1 ≤ x ≤ i, 1 ≤ y ≤ j, and 0 otherwise,
and i and j are the width and height of the normalized rectangular filter; the filtered images produced by all rectangular filters are reshaped into column vectors, and concatenating these column vectors forms the original high-dimensional feature vector of the sample image.
4. The real-time multi-scale target tracking method based on compressive sensing according to claim 1, characterized in that the particle states are estimated and predicted with the second-order model
x_t(i) = 2 x_{t−1}(i) − x_{t−2}(i) + wx_t
y_t(i) = 2 y_{t−1}(i) − y_{t−2}(i) + wy_t
s_t(i) = 2 s_{t−1}(i) − s_{t−2}(i) + ws_t
where i = 1, 2, …, pn, pn is the total number of particles, and wx_t, wy_t, ws_t are zero-mean Gaussian noise terms on the three state components of the particles at time t with standard deviations std_x, std_y, std_s; to prevent particle samples from leaving the image, out-of-range estimates are clipped, the estimated and clipped state (x_t(i), y_t(i), s_t(i)) replaces the particle's state at time t, and the particle's previous states at times t and t−1 replace its states at times t−1 and t−2, respectively; estimating and predicting the particle states with the second-order model takes the velocity of the target motion into account.
5. The real-time multi-scale target tracking method based on compressive sensing according to claim 1, characterized in that the particle weights are computed as
w_i = p(z_t | x_t*(i)) / Σ_{j=1}^{pn} p(z_t | x_t*(j))
where i = 1, 2, …, pn, pn is the total number of particles, p(z_t | x_t*(i)) is the probability of observing the target at time t given that the i-th particle is in state x_t*(i), and
p(z_t | x_t*(i)) ∝ exp(H(v_i)) = Π_{j=1}^{m} p(v_i^j | y = 1) / p(v_i^j | y = 0),
where H(v_i) is the Bayes classifier response of the i-th particle.
CN201310700915.9A 2013-12-19 2013-12-19 A kind of real-time multiscale target tracking based on compressed sensing Active CN103632382B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310700915.9A CN103632382B (en) 2013-12-19 2013-12-19 A kind of real-time multiscale target tracking based on compressed sensing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310700915.9A CN103632382B (en) 2013-12-19 2013-12-19 A kind of real-time multiscale target tracking based on compressed sensing

Publications (2)

Publication Number Publication Date
CN103632382A true CN103632382A (en) 2014-03-12
CN103632382B CN103632382B (en) 2016-06-22

Family

ID=50213398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310700915.9A Active CN103632382B (en) 2013-12-19 2013-12-19 A kind of real-time multiscale target tracking based on compressed sensing

Country Status (1)

Country Link
CN (1) CN103632382B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120170659A1 (en) * 2009-09-04 2012-07-05 Stmicroelectronics Pvt. Ltd. Advance video coding with perceptual quality scalability for regions of interest
CN101739692A (en) * 2009-12-29 2010-06-16 天津市亚安科技电子有限公司 Fast correlation tracking method for real-time video target
CN103310466A (en) * 2013-06-28 2013-09-18 安科智慧城市技术(中国)有限公司 Single target tracking method and achievement device thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG LUPING et al.: "Scale-adaptive feature compressive tracking", Journal of National University of Defense Technology *
CHENG JIAN et al.: "Infrared target tracking based on particle filter", Journal of Infrared and Millimeter Waves *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903242B (en) * 2014-04-14 2016-08-31 苏州经贸职业技术学院 Adaptive targets compressed sensing fusion tracking method based on video sensor network
CN104185026B (en) * 2014-09-05 2017-07-18 西安电子科技大学 The infrared high-resolution imaging method and its device of phase code under accidental projection domain
CN104185026A (en) * 2014-09-05 2014-12-03 西安电子科技大学 Infrared high-resolution imaging method for phase encoding under random projection domain and device thereof
CN104463192A (en) * 2014-11-04 2015-03-25 中国矿业大学(北京) Dark environment video target real-time tracking method based on textural features
CN104463192B (en) * 2014-11-04 2018-01-05 中国矿业大学(北京) Dark situation video object method for real time tracking based on textural characteristics
CN104331906A (en) * 2014-11-10 2015-02-04 成都信升斯科技有限公司 Image real-time processing method
CN104392467A (en) * 2014-11-18 2015-03-04 西北工业大学 Video target tracking method based on compressive sensing
CN104680554A (en) * 2015-01-08 2015-06-03 深圳大学 SURF-based compression tracing method and system
CN104680554B (en) * 2015-01-08 2017-10-31 深圳大学 Compression tracking and system based on SURF
CN105096345A (en) * 2015-09-15 2015-11-25 电子科技大学 Target tracking method based on dynamic measurement matrix and target tracking system based on dynamic measurement matrix
CN106023257A (en) * 2016-05-26 2016-10-12 南京航空航天大学 Target tracking method based on rotor UAV platform
CN106023257B (en) * 2016-05-26 2018-10-12 南京航空航天大学 A kind of method for tracking target based on rotor wing unmanned aerial vehicle platform
WO2018058531A1 (en) * 2016-09-30 2018-04-05 富士通株式会社 Target tracking method and device, and image processing apparatus
CN106815562A (en) * 2016-12-19 2017-06-09 江苏慧眼数据科技股份有限公司 A kind of pedestrian detection tracking based on compressive features
CN106780568A (en) * 2016-12-20 2017-05-31 云南大学 A kind of video target tracking method based on the irregular piecemeal LBP of compression
CN106780568B (en) * 2016-12-20 2019-08-13 云南大学 A kind of video target tracking method based on the irregular piecemeal LBP of compression
CN106846363A (en) * 2016-12-29 2017-06-13 西安电子科技大学 A kind of scale adaptability compression tracking for improving sparse matrix
CN106897731B (en) * 2016-12-30 2020-08-21 西安天和防务技术股份有限公司 Target tracking system for monitoring homeland resources
CN106897731A (en) * 2016-12-30 2017-06-27 西安天和防务技术股份有限公司 For the Target Tracking System of land resources monitoring
CN106952284A (en) * 2017-03-28 2017-07-14 歌尔科技有限公司 A kind of feature extracting method and its device based on compression track algorithm
CN108665479A (en) * 2017-06-08 2018-10-16 西安电子科技大学 Infrared object tracking method based on compression domain Analysis On Multi-scale Features TLD
CN108734109A (en) * 2018-04-24 2018-11-02 中南民族大学 A kind of visual target tracking method and system towards image sequence
CN108734109B (en) * 2018-04-24 2020-11-17 中南民族大学 Visual target tracking method and system for image sequence
CN108717450A (en) * 2018-05-18 2018-10-30 大连民族大学 Film review emotional orientation analysis algorithm
CN108717450B (en) * 2018-05-18 2022-04-05 大连民族大学 Analysis algorithm for emotion tendentiousness of film comment
CN111127514A (en) * 2019-12-13 2020-05-08 华南智能机器人创新研究院 Target tracking method and device by robot
CN111127514B (en) * 2019-12-13 2024-03-22 华南智能机器人创新研究院 Method and device for tracking target by robot
CN111314708A (en) * 2020-02-25 2020-06-19 腾讯科技(深圳)有限公司 Image data compression method and device, storage medium and electronic equipment
CN111314708B (en) * 2020-02-25 2021-05-07 腾讯科技(深圳)有限公司 Image data compression method and device, storage medium and electronic equipment
CN112616023A (en) * 2020-12-22 2021-04-06 荆门汇易佳信息科技有限公司 Multi-camera video target tracking method in complex environment

Also Published As

Publication number Publication date
CN103632382B (en) 2016-06-22

Similar Documents

Publication Publication Date Title
CN103632382A (en) Compressive sensing-based real-time multi-scale target tracking method
CN103295242B (en) A kind of method for tracking target of multiple features combining rarefaction representation
CN107300698B (en) Radar target track starting method based on support vector machine
CN104091147B (en) A kind of near-infrared eyes positioning and eye state identification method
CN103886325B (en) Cyclic matrix video tracking method with partition
CN106780552B (en) Anti-shelter target tracking based on regional area joint tracing detection study
CN104616318A (en) Moving object tracking method in video sequence image
CN106780568B (en) A kind of video target tracking method based on the irregular piecemeal LBP of compression
CN109658128A (en) A kind of shops based on yolo and centroid tracking enters shop rate statistical method
CN115345905A (en) Target object tracking method, device, terminal and storage medium
CN109961028A (en) SAR detection method based on three-dimensional Block- matching and full condition of contact random field
CN112991394A (en) KCF target tracking method based on cubic spline interpolation and Markov chain
Lin et al. A novel robust algorithm for position and orientation detection based on cascaded deep neural network
CN106485283B (en) A kind of particle filter pedestrian target tracking based on Online Boosting
Ding et al. Visual tracking using locality-constrained linear coding and saliency map for visible light and infrared image sequences
CN109166138B (en) Target tracking method and device based on high-order cumulant and storage medium
Zakaria et al. Particle swarm optimization and support vector machine for vehicle type classification in video stream
CN112614158B (en) Sampling frame self-adaptive multi-feature fusion online target tracking method
CN104050486A (en) Polarimetric SAR image classification method based on maps and Wishart distance
Lu et al. Adaptive random-based self-organizing background subtraction for moving detection
Xing et al. Robust object tracking based on sparse representation and incremental weighted PCA
Yuan et al. Research approach of hand gesture recognition based on improved YOLOV3 network and Bayes classifier
Wibowo et al. Multi-scale color features based on correlation filter for visual tracking
Cai et al. A target tracking method based on adaptive occlusion judgment and model updating strategy
Zeng et al. Point matching estimation for moving object tracking based on Kalman filter

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant