CN102708382B - Multi-target tracking method based on variable processing windows and variable coordinate systems - Google Patents


Info

Publication number
CN102708382B
CN201210146295.4A
Authority
CN
China
Prior art keywords
target
frame image
energy density
theta
processing block
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210146295.4A
Other languages
Chinese (zh)
Other versions
CN102708382A (en)
Inventor
王睿
刘玉明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University
Priority to CN201210146295.4A
Publication of CN102708382A
Application granted
Publication of CN102708382B

Landscapes

  • Image Analysis (AREA)

Abstract

A multi-target tracking method based on variable processing windows and variable coordinate systems comprises: step one, initialization; step two, constructing space-time processing blocks for the k-th frame image; step three, computing the position energy density of the space-time processing blocks constructed in step two; step four, extracting the position parameters of all targets in the k-th frame image; step five, matching each target with a position parameter in its processing block to obtain a position observation Z(k+1) for every target; step six, computing the target velocity parameters with a gain-weighted Kalman filter algorithm, which reduces the computational load, and correcting the position observation of each target to strengthen tracking accuracy and robustness; and step seven, computing the motion scale of each target in the k-th frame image. Building on the continuous wavelet transform and the expectation-maximization (EM) fitting algorithm, the method introduces variable windows, variable projection, and Kalman motion estimation; it extracts motion parameters accurately, reduces the computational load, and achieves accurate tracking of multiple moving targets.

Description

Multi-target tracking method based on variable processing windows and variable coordinate systems
Technical field
The present invention relates to a multi-target tracking method based on variable processing windows and variable coordinate systems, belonging to the fields of machine vision and pattern recognition. Specifically, it concerns a method that applies spatio-temporal filtering based on the continuous wavelet transform to the image region inside a processing window, projects the filtering result onto a variable coordinate system to reduce its dimensionality, extracts the motion parameters of the targets from the resulting one-dimensional energy density curves, and thereby tracks multiple moving targets.
Background art
The fundamental problem of target tracking is to estimate the motion state of a target accurately and in real time from observation data. With the development of modern navigation and aerospace, target tracking has attracted great attention in many countries. Whether in maritime and air traffic control or in modern defense systems, target tracking technology plays a very important role.
Traditional moving-target tracking methods are mostly based on feature matching, such as methods based on the target region or on its contour. When the target's features are easy to obtain, feature-matching methods work reasonably well and are fairly reliable. However, when the features of the tracked target are not salient, for example when the contrast between target and background is low, the target has little texture, or the target is small, or when the target's features change randomly over time, for example when its size or attitude varies, feature-matching methods lose robustness.
Mujica et al. designed a spatio-temporal filter based on the continuous wavelet transform: several consecutive frames are first filtered with this filter, and a maximum search over the filter output then yields the target's velocity, position, and motion scale. Because it needs no color or texture features of the target, this strategy adapts well to scenes where target features are weak or time-varying, and it offers high tracking accuracy and robustness especially for small-target tracking. However, when several targets with similar motion characteristics approach one another, the method produces tracking deviations and may even track the wrong target.
To address this problem in Mujica's tracking method, L. Hong proposed an improvement. L. Hong argued that the defect arises because, when two targets are mutually close, the detected energy density is a superposition of several targets' energy densities, which shifts the energy density peak away from the true target position and makes tracking inaccurate or causes it to fail. Instead of searching for the maximum of the energy density when several targets are close, L. Hong proposed first projecting the two-dimensional position energy density to one dimension, fitting the one-dimensional position energy density with the EM algorithm, taking the means obtained by EM as the target positions, and finally matching parameters to targets by data association.
The method proposed by L. Hong mitigates, to a certain extent, the interference that arises when targets approach one another, but it still has the following defects:
1. A separate processing window is designed for each target, so the computational load grows linearly with the number of targets, which is unfavorable for real-time tracking;
2. When one coordinate of several targets is nearly equal, the value of that coordinate obtained by EM fitting has a large deviation, which easily causes data-association errors and ultimately tracking failure;
3. Extracting the speed and motion direction of a target requires first obtaining a two-dimensional velocity energy density surface by wavelet transform, which sharply increases the computational load.
For these three problems, the present invention respectively introduces variable processing windows, variable-coordinate-system projection, and Kalman filtering for extracting the target's speed and direction of motion, which greatly reduces the computational load and improves tracking accuracy, robustness, and real-time performance.
Summary of the invention
1. Object: the object of this invention is to provide a multi-target tracking method based on variable processing windows and variable coordinate systems. On the basis of the continuous wavelet transform and the EM fitting algorithm, it introduces variable windows, a variable projection scheme, and Kalman motion estimation to extract target motion parameters accurately, reduce the computational load, and track multiple moving targets accurately and robustly.
2. Technical scheme: the multi-target tracking method based on variable processing windows and variable coordinate systems of the present invention comprises the following concrete steps:
Step 1: initialization;
Set the parameter k as the frame index, set the number of image frames in each space-time processing block to T, let the frame to be tracked within the block be frame τ (2 ≤ τ ≤ T-1), and initialize, for frame τ-1, the position, speed, and scale of each target, the Kalman filter residual covariance matrix, and the means, variances, and weights of the EM algorithm.
Step 2: construct the space-time processing block for the k-th frame image;
Take frames k-(τ-1), k-(τ-1)+1, ..., k, k+1, ..., k+(T-τ) to form the video stream of the k-th frame, and estimate the position of every target in frame k from its position and speed in frame k-1:
$$(x_i^k,\, y_i^k) = (x_i^{k-1},\, y_i^{k-1}) + (v_{x_i}^{k-1},\, v_{y_i}^{k-1}), \qquad i = 1, 2, \dots, n$$
where n is the number of tracked targets.
Compute the absolute values of the coordinate differences between targets i and j:
$$dx_{ij} = |x_i - x_j|, \qquad dy_{ij} = |y_i - y_j|$$
Find the target point $m_1$ with the smallest label; all target points $(x_{j_1}, y_{j_1})$, $j_1 = 1, 2, \dots, m_1-1, m_1+1, \dots, n$, whose distances satisfy $dx_{m_1 j_1} < d_1/2$ and $dy_{m_1 j_1} < d_1/2$, where $d_1$ is the width of the processing window, form a set $C_1$ together with $m_1$, and $m_1$ is called the center of this set. After removing the points of $C_1$, find among the remaining target points the point $m_2$ with the smallest label; all target points $(x_{j_2}, y_{j_2})$, $j_2 = 1, 2, \dots, m_2-1, m_2+1, \dots, n$, satisfying $dx_{m_2 j_2} < d_1/2$ and $dy_{m_2 j_2} < d_1/2$ form a set $C_2$ with center $m_2$. By analogy, every target point finds the set it belongs to. On each frame of the video stream of the k-th frame image, a processing window of side length $d_1$ is cut out centered at each set center, and these windows form the space-time processing blocks of frame k; in this way, L space-time processing blocks (1 ≤ L ≤ n) are formed for the n targets.
Step 3: compute the position energy density of the space-time processing blocks constructed in step 2;
Substitute the motion scale $a_{k-1}$, speed $c_{k-1}$, and angle $\theta_{k-1}$ of the target in frame k-1 into the wavelet function
$$\psi_w(\vec{\xi}, t) = a^{-\frac{3}{2}}\, \psi\!\left(a^{-1} c^{-\frac{1}{3}}\, r_{-\theta}(\vec{\xi} - \vec{b}),\; a^{-1} c^{\frac{2}{3}}(t - \tau)\right), \qquad r_{-\theta} = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix},$$
and apply the continuous wavelet transform with this wavelet to all space-time processing blocks constructed in step 2 to obtain their position energy densities:
$$\varepsilon^{(1)}_{(a_{k-1},\, c_{k-1},\, \theta_{k-1},\, \tau)}(\vec{b}_k) = \frac{1}{a_{k-1}^{4}} \left| \left\langle \psi_{(a_{k-1},\, c_{k-1},\, \theta_{k-1},\, \vec{b}_k,\, \tau)} \,\middle|\, B_k \right\rangle \right|^2$$
In the formula, the symbols are as follows:
τ: position of the frame to be tracked within the space-time processing block, the same τ as in step 1
$a_{k-1}$: motion scale of the target in frame k-1
$c_{k-1}$: speed of the target in frame k-1
$\theta_{k-1}$: angle of the target in frame k-1
$\vec{b}_k$: position coordinates of a pixel in frame k
$B_k$: space-time processing block of frame k
$\psi_{(a_{k-1},\, c_{k-1},\, \theta_{k-1},\, \vec{b}_k,\, \tau)}$: the wavelet function
$\varepsilon^{(1)}(\vec{b}_k)$: position energy density at position $\vec{b}_k$ in frame k
$\langle \psi \mid B_k \rangle$: continuous wavelet transform of the space-time processing block $B_k$ with the wavelet function ψ
Step 4: extract the position parameters of all targets in frame k;
Reduce the dimensionality of the two-dimensional position energy density obtained in step 3 by projection, and extract the target position parameters with the Nelder-Mead simplex search algorithm or the EM algorithm. Two cases are distinguished according to the number of targets in each processing block:
1. If the number of targets in the image processing block is $m_i = 1$, no projection is needed: the position of the maximum of the two-dimensional energy density, found by the Nelder-Mead simplex search algorithm, is taken directly as the position parameter of this target.
2. If $m_i \ge 2$, first project the two-dimensional position energy density to one dimension. Count the targets whose line to any other target makes an angle θ with the x-axis satisfying π/8 ≤ θ ≤ 3π/8; call this number $n_1$, and let $n_2 = m_i - n_1$ be the number of targets not meeting this condition. The projection direction is chosen according to the relation between $n_1$ and $n_2$. If $n_1 \ge n_2$, project onto the x-axis and the y-axis to obtain the position energy densities in the x and y directions, fit each with the EM algorithm, and obtain the x and y coordinates of all targets in this block. If $n_1 < n_2$, first rotate the original rectangular coordinate system counterclockwise by π/4 to form a new coordinate system; project the two-dimensional position energy density onto the x- and y-axes of the new system, fit the two one-dimensional position energy densities with the EM algorithm to obtain the target coordinates (x', y') in the new system, and finally recover the coordinates in the original system by the coordinate transform
$$\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} \cos(\pi/4) & -\sin(\pi/4) \\ \sin(\pi/4) & \cos(\pi/4) \end{pmatrix} \begin{pmatrix} x' \\ y' \end{pmatrix}.$$
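The change of coordinate system in case 2 is an ordinary plane rotation; the sketch below (helper names are illustrative, not from the patent) shows the forward rotation and the back-transform that matches the matrix above.

```python
import math

def rotate(point, angle):
    """Rotate a 2-D point counter-clockwise by `angle` radians."""
    x, y = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)

def to_original(p_new):
    """Map coordinates (x', y') measured in the frame rotated
    counter-clockwise by pi/4 back to the original frame; this is
    the rotation matrix of step four applied to (x', y')."""
    return rotate(p_new, math.pi / 4)
```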
Step 5: match each target in every processing block with a position parameter to obtain the position observation Z(k+1) of each target;
Take a processing block containing two targets as an example. The measured coordinates are $x_1, x_2, y_1, y_2$, giving four possible combinations: $(x_1, y_1)$, $(x_1, y_2)$, $(x_2, y_1)$, $(x_2, y_2)$; data association assigns these four candidate points to the two targets. From the block construction method described in step two it is clear that one target may belong to several sets simultaneously and thus receive several groups of position observations; for such targets, the mean of all observations is taken.
Step 6: compute the target velocity parameters with the gain-weighted Kalman filter algorithm to reduce the computational load, and correct the position observation of each target, strengthening tracking accuracy and robustness;
Let the state vector be $X(k) = [x_k, y_k, vx_k, vy_k]^T$. First estimate the state vector of the target in frame k from the system equation; to simplify the computation, the present invention adopts a uniform rectilinear (constant-velocity) motion model, that is:
$$X(k|k-1) = \begin{pmatrix} 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} X(k-1), \qquad Q = \begin{pmatrix} 1/4 & 0 & 1/2 & 0 \\ 0 & 1/4 & 0 & 1/2 \\ 1/2 & 0 & 1 & 0 \\ 0 & 1/2 & 0 & 1 \end{pmatrix}$$
with Q the process noise covariance. This yields the position prediction X(k|k-1) of the target in frame k. Compute the distance d between this prediction and the observation obtained in step 4, and choose the gain weight ρ of the Kalman filter algorithm according to:
$$\rho = \begin{cases} 1, & d \le e \\ \rho_1, & d > e \end{cases} \qquad \text{where } 0 < \rho_1 < 1$$
The computed Kalman gain, multiplied by the gain weight, is substituted as the final gain into the basic Kalman filter equations, giving the final state vector X(k); this yields the speed of the target in frame k and corrects the position observation obtained in step 5.
Step 7: compute the motion scale of each target in frame k;
Substitute the speed $c_k$ and angle $\theta_k$ of the target in frame k into the wavelet function
$$\psi_w(\vec{\xi}, t) = a^{-\frac{3}{2}}\, \psi\!\left(a^{-1} c^{-\frac{1}{3}}\, r_{-\theta}(\vec{\xi} - \vec{b}),\; a^{-1} c^{\frac{2}{3}}(t - \tau)\right), \qquad r_{-\theta} = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix},$$
and apply the continuous wavelet transform with this wavelet to the space-time processing blocks constructed in step 2 to obtain the one-dimensional scale energy density:
$$\varepsilon^{(2)}_{(c_k,\, \theta_k,\, \tau)}(a_k) = \frac{1}{a_k^4} \sum_{\vec{b}_k \in \Phi} \left| \left\langle \psi_{(a_k,\, c_k,\, \theta_k,\, \vec{b}_k,\, \tau)} \,\middle|\, B_k \right\rangle \right|^2$$
In the formula, the symbols are as follows:
τ: position of the frame to be tracked within the space-time processing block, the same τ as in step 1
$a_k$: motion scale of the target in frame k
$c_k$: speed of the target in frame k
$\theta_k$: angle of the target in frame k
$\vec{b}_k$: position coordinates of a pixel in frame k
$B_k$: space-time processing block of frame k
$\psi_{(a_k,\, c_k,\, \theta_k,\, \vec{b}_k,\, \tau)}$: the wavelet function
$\varepsilon^{(2)}(a_k)$: scale energy density in frame k
$\langle \psi \mid B_k \rangle$: continuous wavelet transform of the space-time processing block $B_k$ with the wavelet function ψ
For a processing block containing only one target, the position of the maximum of the scale energy density, found by the Nelder-Mead simplex search algorithm, is taken as the motion scale of that target; for a processing block containing several targets, the means obtained by EM fitting are taken as the target scales, and data association then matches the scale observations to the targets.
At this point the position, motion parameters, and scale parameters of every target in frame k have been obtained; tracking then continues on the targets in the subsequent frames until it finishes.
3. Advantages: the multi-target tracking method based on variable processing windows and variable coordinate systems of the present invention has the following advantages:
1) Compared with traditional target tracking methods based on feature matching, this method extracts target motion parameters directly; it tracks weak targets and targets with inconspicuous features well, and adapts well to changes in target size and attitude.
2) The variable processing windows reduce the growth of the computational load as the number of targets increases.
3) The variable-projection dimensionality reduction improves measurement accuracy when one coordinate of several targets in the same processing window is nearly identical.
4) Compared with extracting the target's speed and motion direction with a wavelet-transform spatio-temporal filter, computing them with the Kalman filter algorithm greatly reduces the computational load and improves the real-time performance of target tracking.
5) The gain-weighted Kalman filter algorithm adjusts the gain weight according to the error between prediction and observation, strengthening tracking accuracy and robustness.
Brief description of the drawings
Fig. 1: overall flowchart of the multi-target tracking method of the present invention.
Fig. 2: tracking results for a simulated image sequence in which two circular targets, both of radius 5 pixels, move along curves that cross each other during the motion; the dotted lines are the actual trajectories of the two targets, and the solid lines are the trajectories obtained by this tracking method. 100 frames are simulated in total.
Fig. 3: position energy density of the two targets in the 50th frame of the simulation experiment of Fig. 2; the horizontal coordinates are the two-dimensional position coordinates and the vertical coordinate is the position energy density value.
Fig. 4: x-direction energy density obtained by dimensionality-reduction projection of the two-dimensional energy density of Fig. 3, and the EM fitting result. The solid line is the projected x-direction energy density curve, the dotted lines are the two single Gaussian curves obtained by EM fitting, and the starred curve is their superposition, i.e., the fitted curve.
Fig. 5: y-direction energy density obtained by dimensionality-reduction projection of the two-dimensional energy density of Fig. 3, and the EM fitting result. The solid line is the projected y-direction energy density curve, the dotted lines are the two single Gaussian curves obtained by EM fitting, and the starred curve is their superposition, i.e., the fitted curve.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings and an embodiment.
Fig. 1 is the overall flowchart of the image sequence processing of the present invention. As shown in Fig. 1, the implementation of the present invention comprises the following steps:
Step 1: set the parameter k as the frame index, set the number of image frames in each space-time processing block to T, let the frame to be tracked within the block be frame τ (2 ≤ τ ≤ T-1), and initialize, for frame τ-1, the position, speed, and scale of each target, the Kalman filter residual covariance matrix, and the means, variances, and weights of the EM algorithm.
Step 2: construct the space-time processing blocks for frame k, as follows:
Step 2-1: take frames k-(τ-1), k-(τ-1)+1, ..., k, k+1, ..., k+(T-τ) to form the video stream of frame k. If there are not enough preceding frames, the missing part is filled by copying the first frame; if there are not enough following frames, the missing part is filled by copying the last frame.
Step 2-2: estimate the positions of all targets in frame k from their positions and speeds in frame k-1:
$$(x_i^k,\, y_i^k) = (x_i^{k-1},\, y_i^{k-1}) + (v_{x_i}^{k-1},\, v_{y_i}^{k-1}), \qquad i = 1, 2, \dots, n$$
where n is the number of tracked targets.
Step 2-3: compute the absolute values of the coordinate differences between targets i and j:
$$dx_{ij} = |x_i - x_j|, \qquad dy_{ij} = |y_i - y_j|$$
Step 2-4: find the target $m_1$ with the smallest label; all target points $(x_{j_1}, y_{j_1})$, $j_1 = 1, 2, \dots, m_1-1, m_1+1, \dots, n$, satisfying $dx_{m_1 j_1} < d_1/2$ and $dy_{m_1 j_1} < d_1/2$ form a set $C_1$, with the position of $m_1$ as the center of this set;
Step 2-5: if every target has been assigned to a set, go to step 2-7; otherwise go to step 2-6.
Step 2-6: among the targets not yet assigned to a set, find the target $m_2$ with the smallest label; all target points $(x_{j_2}, y_{j_2})$, $j_2 = 1, 2, \dots, m_2-1, m_2+1, \dots, n$, satisfying $dx_{m_2 j_2} < d_1/2$ and $dy_{m_2 j_2} < d_1/2$ form a set $C_2$, with the position of $m_2$ as the center of this set; go to step 2-5.
Step 2-7: on each frame of the video stream of frame k, cut out a processing window of side length $2d_1$ centered at each set center; these windows form the space-time processing blocks of all targets in frame k.
Step 3: substitute the motion scale $a_{k-1}$, speed $c_{k-1}$, and angle $\theta_{k-1}$ of the target in frame k-1 into the wavelet function
$$\psi_w(\vec{\xi}, t) = a^{-\frac{3}{2}}\, \psi\!\left(a^{-1} c^{-\frac{1}{3}}\, r_{-\theta}(\vec{\xi} - \vec{b}),\; a^{-1} c^{\frac{2}{3}}(t - \tau)\right), \qquad r_{-\theta} = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix},$$
and apply the continuous wavelet transform with this wavelet to all space-time processing blocks constructed in step 2 to obtain their position energy densities:
$$\varepsilon^{(1)}_{(a_{k-1},\, c_{k-1},\, \theta_{k-1},\, \tau)}(\vec{b}_k) = \frac{1}{a_{k-1}^{4}} \left| \left\langle \psi_{(a_{k-1},\, c_{k-1},\, \theta_{k-1},\, \vec{b}_k,\, \tau)} \,\middle|\, B_k \right\rangle \right|^2$$
Step 4: extract the position parameters of all targets in frame k, as follows:
Step 4-1: compute the number of targets $m_i$ in the i-th processing block; if $m_i = 1$, go to step 4-2; if $m_i \ge 2$, go to step 4-3.
Step 4-2: find the position (x, y) of the maximum of the two-dimensional energy density with the Nelder-Mead simplex search algorithm; the coordinates (x, y) are the position observation of this target. Go to step 5.
Step 4-3: project the two-dimensional energy density to one dimension in the variable coordinate system, and compute the target position observations with the EM algorithm.
Step 4-3-1: predict the positions (x(k|k-1), y(k|k-1)) of all targets in this processing block, and compute the angle between the x-axis and the line joining the predicted target positions:
$$x(k|k-1) = x(k-1) + v_x(k-1)$$
$$y(k|k-1) = y(k-1) + v_y(k-1)$$
$$\theta = \arcsin\!\left(\frac{\left|y_1(k|k-1) - y_2(k|k-1)\right|}{\sqrt{\left(y_1(k|k-1) - y_2(k|k-1)\right)^2 + \left(x_1(k|k-1) - x_2(k|k-1)\right)^2}}\right)$$
Count the targets whose line to any other target makes an angle θ with the x-axis satisfying π/8 ≤ θ ≤ 3π/8; call this number $n_1$, and let $n_2 = m_i - n_1$ be the number of targets not meeting this condition. If $n_1 \ge n_2$, go to step 4-3-2; otherwise go to step 4-3-3.
Step 4-3-2: project the two-dimensional energy density in the original coordinate system, and fit the resulting x- and y-direction position energy densities with the EM algorithm to obtain the target position observations, as follows:
Marginalize the two-dimensional energy density in the original coordinate system:
$$\varepsilon_{x,(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(x) = \int_y \varepsilon_{(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(x, y)\, dy$$
$$\varepsilon_{y,(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(y) = \int_x \varepsilon_{(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(x, y)\, dx$$
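On a discrete pixel grid the two marginal integrals reduce to sums over rows and columns; a minimal sketch (the function name and the list-of-lists grid layout are assumptions):

```python
def project_energy(eps):
    """Collapse a 2-D position-energy grid eps[y][x] to the two 1-D
    densities of step 4-3-2: the x-direction density sums over y and
    the y-direction density sums over x."""
    ny, nx = len(eps), len(eps[0])
    eps_x = [sum(eps[y][x] for y in range(ny)) for x in range(nx)]
    eps_y = [sum(eps[y][x] for x in range(nx)) for y in range(ny)]
    return eps_x, eps_y
```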
Fit the x- and y-direction position energy densities with the EM algorithm to obtain the target observations. E step: compute the mathematical expectation of the complete data set:
$$Q(\Theta, \Theta^{(t)}) = E\!\left[l(\theta) \mid \Theta^{(t)}\right] = \sum_{x \in L} \sum_{j=1}^{M} P^{(t)}(j \mid x)\, h(x) \left[\ln P_j - \tfrac{1}{2}\ln(2\pi) - \ln \sigma_j - \frac{(x - u_j)^2}{2\sigma_j^2}\right]$$
M step: find the parameters Θ that maximize $Q(\Theta, \Theta^{(t)})$:
$$u_j = \frac{\sum_{x \in L} h(x)\, P^{(t)}(j \mid x)\, x}{\sum_{x \in L} h(x)\, P^{(t)}(j \mid x)}, \qquad \sigma_j^2 = \frac{\sum_{x \in L} h(x)\, P^{(t)}(j \mid x)\, (x - u_j)^2}{\sum_{x \in L} h(x)\, P^{(t)}(j \mid x)}, \qquad P_j = \frac{\sum_{x \in L} h(x)\, P^{(t)}(j \mid x)}{\sum_{x \in L} h(x)}$$
where
$$P^{(t)}(j \mid x) = \frac{P^{(t)}(x \mid j)\, P_j^{(t)}}{\sum_{j=1}^{M} P^{(t)}(x \mid j)\, P_j^{(t)}}$$
The E step and M step alternate until the iteration stopping condition is met; the $u_j$ obtained in the last iteration are the measured position observations Z(k). Go to step 5.
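The alternating E and M steps can be sketched as a generic weighted Gaussian-mixture EM over a sampled one-dimensional density h(x); a fixed iteration count stands in for the unspecified stopping condition, and the function and variable names are illustrative, not from the patent.

```python
import math

def em_fit(xs, h, means, sigmas, weights, iters=50):
    """EM fit of an M-component 1-D Gaussian mixture to a sampled
    density h(x) (the projected position energy density), following
    the E and M steps of step 4-3-2. Initial means, sigmas, and
    weights are assumed to come from the previous frame."""
    M = len(means)
    for _ in range(iters):
        # E step: responsibilities P(j | x) for each sample x
        resp = []
        for x in xs:
            p = [weights[j] / (sigmas[j] * math.sqrt(2 * math.pi))
                 * math.exp(-(x - means[j]) ** 2 / (2 * sigmas[j] ** 2))
                 for j in range(M)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M step: weighted parameter updates with h(x) as sample weight
        for j in range(M):
            wsum = sum(h[i] * resp[i][j] for i in range(len(xs)))
            means[j] = sum(h[i] * resp[i][j] * xs[i]
                           for i in range(len(xs))) / wsum
            sigmas[j] = math.sqrt(
                sum(h[i] * resp[i][j] * (xs[i] - means[j]) ** 2
                    for i in range(len(xs))) / wsum)
            weights[j] = wsum / sum(h)
    return means, sigmas, weights
```

The returned means play the role of the $u_j$, i.e. the one-dimensional target coordinates.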
Step 4-3-3: first rotate the original coordinate system counterclockwise by π/4, then marginalize the two-dimensional energy density in the new coordinate system:
$$\varepsilon'_{x,(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(x') = \int_{y'} \varepsilon_{(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(x', y')\, dy'$$
$$\varepsilon'_{y,(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(y') = \int_{x'} \varepsilon_{(a_{k-1},\, c_{k-1},\, \theta_{k-1})}(x', y')\, dx'$$
Fit the x'- and y'-direction position energy densities with the EM formulas of step 4-3-2 to obtain the position observation Z'(k) in the new coordinate system; then compute the position observation Z(k) in the coordinate system before the rotation:
$$Z(k) = \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} \cos(\pi/4) & -\sin(\pi/4) \\ \sin(\pi/4) & \cos(\pi/4) \end{pmatrix} \begin{pmatrix} x' \\ y' \end{pmatrix}.$$
Go to step 5.
Step 5: match targets and observations with the global nearest neighbor (GNN) data association method, as follows.
Step 5-1: suppose the i-th processing window contains $m_i$ targets, which yields $m_i \times m_i$ candidate observations. Compute the distance $d_{uv}$ ($u = 1, 2, \dots, m_i$; $v = 1, 2, \dots, m_i^2$) between the v-th observation and the prediction $(x_u(k|k-1), y_u(k|k-1))$ of the u-th target.
Step 5-2: set a threshold G; all $d_{uv}$ satisfying $d_{uv} < G$ form the set $C_u$ of observations that may match target u. For each set $C_u$, find its minimum $d_{u v_u}$. If $v_u \ne v_l$ for all $l \ne u$, match the $v_u$-th observation to target u; otherwise execute step 5-3 to rematch targets u and l.
Step 5-3: find the second-smallest element $d_{u v'_u}$ of the set $C_u$ and the second-smallest element $d_{l v'_l}$ of the set $C_l$. If $d_{u v'_u} + d_{l v_l} \le d_{u v_u} + d_{l v'_l}$, assign the $v'_u$-th observation to target u and the $v_l$-th observation to target l; otherwise, assign the $v_u$-th observation to target u and the $v'_l$-th observation to target l.
Step 5-4: if the extraction and association matching of observations have been completed for all processing blocks, execute step 6; otherwise go to step 4-1.
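A simplified sketch of the gating and assignment: observations inside the gate are committed greedily by increasing distance. The patent instead resolves conflicts through second-smallest distances as in step 5-3; the greedy variant below, and all names in it, are assumptions chosen for brevity.

```python
def gnn_match(dist, gate):
    """Greedy global-nearest-neighbour association: repeatedly commit
    the smallest remaining target-observation distance inside the
    gate. dist[u][v] is the distance from target u to observation v;
    the result maps each matched target to one observation."""
    pairs = sorted((dist[u][v], u, v)
                   for u in range(len(dist))
                   for v in range(len(dist[u]))
                   if dist[u][v] < gate)
    used_t, used_o, match = set(), set(), {}
    for d, u, v in pairs:
        if u not in used_t and v not in used_o:
            match[u] = v
            used_t.add(u)
            used_o.add(v)
    return match
```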
Step 6: estimate the speed and angle of each target in frame k with the gain-weighted Kalman filter algorithm, and correct the position observations, as follows:
Step 6-1: let the state vector be $X(k) = [x_k, y_k, vx_k, vy_k]^T$, and first estimate the target state in frame k from the system equation:
$$X(k|k-1) = \begin{pmatrix} 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} X(k-1)$$
Step 6-2: compute the distance between the prediction and the observation:
$$d = \sqrt{\left(Z_x(k) - X_x(k|k-1)\right)^2 + \left(Z_y(k) - X_y(k|k-1)\right)^2}$$
Compute the gain weight ρ of the Kalman filter:
$$\rho = \begin{cases} 1, & d \le e \\ \rho_1, & d > e \end{cases} \qquad \text{where } 0 < \rho_1 < 1.$$
Step 6-3: compute the final measured value of the state vector according to the following equations.
Error covariance prediction equation: $P(k|k-1) = A P(k-1) A^T + Q$, where
$$A = \begin{pmatrix} 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}, \qquad Q = \begin{pmatrix} 1/4 & 0 & 1/2 & 0 \\ 0 & 1/4 & 0 & 1/2 \\ 1/2 & 0 & 1 & 0 \\ 0 & 1/2 & 0 & 1 \end{pmatrix}$$
Kalman gain equation: $K = P(k|k-1) H^T \left(H P(k|k-1) H^T + R\right)^{-1}$, where
$$H = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}$$
State correction equation: $X(k) = X(k|k-1) + \rho K \left(Z(k) - H X(k|k-1)\right)$
Error covariance update equation: $P(k) = (I - \rho K H)\, P(k|k-1)$
This yields the position and speed of all targets in the k-th frame image.
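A minimal sketch of one gain-weighted Kalman update following Steps 6-1 to 6-3. The measurement noise R = I and the values of the threshold e and weight ρ₁ are illustrative assumptions; the patent only requires 0 < ρ₁ < 1.

```python
import numpy as np

# Constant-velocity model of Step 6: state X = [x, y, vx, vy]'.
A = np.array([[1., 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]])
Q = np.array([[.25, 0, .5, 0], [0, .25, 0, .5], [.5, 0, 1, 0], [0, .5, 0, 1]])
H = np.array([[1., 0, 0, 0], [0, 1, 0, 0]])
R = np.eye(2)  # measurement noise covariance (assumed)

def kalman_step(X, P, Z, e=3.0, rho1=0.5):
    """One gain-weighted Kalman update (e and rho1 are example values)."""
    Xp = A @ X                            # state prediction X(k|k-1)
    Pp = A @ P @ A.T + Q                  # error covariance prediction
    d = np.linalg.norm(Z - H @ Xp)        # prediction-to-observation distance
    rho = 1.0 if d <= e else rho1         # weight of the Kalman gain
    K = Pp @ H.T @ np.linalg.inv(H @ Pp @ H.T + R)
    Xn = Xp + rho * K @ (Z - H @ Xp)      # state revision
    Pn = (np.eye(4) - rho * K @ H) @ Pp   # error covariance update
    return Xn, Pn
```

When the observation is far from the prediction (d > e), the down-weighted gain pulls the state only part of the way toward the observation, which is the robustness mechanism the step describes.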
Step 7: Calculate the motion scale of all targets in the k-th frame image.
Step 7-1: Substitute the speed c_k and angle θ_k of the target in the k-th frame image into the wavelet function

$$\psi_w(\vec{\xi},t)=a^{-\frac{3}{2}}\psi\!\left(a^{-1}c^{-\frac{1}{3}}r_{-\theta}(\vec{\xi}-\vec{b}),\,a^{-1}c^{\frac{2}{3}}(t-\tau)\right),\quad r_{-\theta}=\begin{bmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{bmatrix},\ \vec{b}=(b_x,b_y),$$

and apply the continuous wavelet transform with this wavelet function to the space-time processing block constructed in Step 2 to obtain the one-dimensional scale energy density:

$$\varepsilon^{(2)}_{(c_k,\theta_k,\tau)}(a_k)=\frac{1}{a_k^4}\sum_{\vec{b}_k\in\Phi}\left|\left\langle\psi_{(a_k,c_k,\theta_k,\vec{b}_k,\tau)}\,\middle|\,B_k\right\rangle\right|^2$$
If the i-th image block contains only one target, go to Step 7-2; otherwise go to Step 7-3.
Step 7-2: Use the Nelder-Mead simplex search algorithm to find the position of the maximum of the scale energy density, which is taken as the motion scale value of this target. Go to Step 7-5.
Step 7-3: Fit the scale energy density with the EM algorithm; the resulting means serve as the measured scale values of the targets.
Step 7-4: Use the global nearest-neighbour data association (GNN) algorithm to assign the m_i scale measurements to the m_i targets.
Step 7-5: If tracking has not finished, increase k by 1 and go to Step 2.
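The single-target case of Step 7-2 amounts to maximising a one-dimensional function with the Nelder-Mead simplex. A sketch using SciPy's Nelder-Mead implementation, with a synthetic unimodal curve standing in for the scale energy density ε⁽²⁾(a_k) — in the method itself the density comes from the continuous wavelet transform of the block.

```python
import numpy as np
from scipy.optimize import minimize

def scale_energy_density(a):
    """Stand-in for the 1-D scale energy density eps^(2)(a_k).
    Here: a synthetic curve peaking at the true scale a = 2."""
    return np.exp(-(a - 2.0) ** 2)

# Step 7-2: Nelder-Mead simplex search for the density maximum
# (minimise the negated density)
res = minimize(lambda a: -scale_energy_density(a[0]), x0=[1.0],
               method='Nelder-Mead', options={'xatol': 1e-6})
a_hat = res.x[0]   # estimated motion scale of the target
```

SciPy is used here for brevity; the patent does not prescribe a particular Nelder-Mead implementation.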
Embodiment
The embodiment uses a simulated image sequence in which two circular targets, each with a radius of 5 pixels, move along curved trajectories and cross each other during the motion; 100 frames are simulated in total.
The motion target tracking results from frame 45 to frame 55 are as shown in the table below:

Frame number          45      46      47      48      49      50      51      52      53      54      55
Target 1 actual x1    246.25  246.38  246.55  246.75  247.00  247.24  247.53  247.85  248.20  248.59  249.00
Target 1 tracked x1   247.00  246.76  246.84  246.76  246.71  247.03  247.21  247.36  247.46  247.98  248.35
Target 1 actual y1    214.00  217.49  220.99  224.48  228.00  231.47  235.00  238.44  241.92  245.40  248.88
Target 1 tracked y1   211.33  215.03  218.88  222.34  226.03  229.52  232.73  236.13  239.76  243.13  246.56
Target 2 actual x2    267.38  267.30  267.20  267.06  266.89  266.69  266.46  266.20  265.91  265.59  265.24
Target 2 tracked x2   266.20  266.19  266.47  266.32  266.14  266.28  266.01  265.74  265.51  265.61  265.22
Target 2 actual y2    195.58  199.08  202.57  206.07  209.57  213.06  216.55  220.04  223.53  227.02  230.50
Target 2 tracked y2   194.25  197.58  201.14  204.60  208.25  211.68  215.00  218.35  221.83  225.09  228.68
Fig. 2 shows the actual and tracked trajectories of this embodiment; the dotted lines are the actual motion trajectories of the two targets and the solid lines are the trajectories obtained by this tracking method.
Fig. 3 shows the position energy density of the two targets in the 50th frame; the horizontal axes are the two-dimensional position coordinates and the vertical axis is the position energy density value.
Fig. 4 shows the x-direction energy density obtained by the dimensionality-reducing projection of the two-dimensional energy density of Fig. 3, together with the EM fitting result. The solid line is the projected x-direction energy density curve, the dotted lines are the two single Gaussian curves obtained by the EM fit, and the starred curve is the superposition of the two single Gaussian curves, i.e. the fitted curve.
Fig. 5 shows the y-direction energy density obtained by the dimensionality-reducing projection of the two-dimensional energy density of Fig. 3, together with the EM fitting result. The solid line is the projected y-direction energy density curve, the dotted lines are the two single Gaussian curves obtained by the EM fit, and the starred curve is the superposition of the two single Gaussian curves, i.e. the fitted curve.
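The EM fit shown in Figs. 4 and 5 models the projected energy density as a mixture of two Gaussians. A self-contained sketch of such a fit, using samples drawn from two Gaussians as a stand-in for the projected density curve; the initialisation and iteration count are illustrative choices, not part of the patent.

```python
import numpy as np

def em_two_gaussians(x, iters=200):
    """EM fit of a two-component 1-D Gaussian mixture (the Fig. 4/5 fit).
    x: samples whose histogram plays the role of the projected
    energy-density curve. Returns the sorted component means."""
    mu = np.array([x.min(), x.max()], dtype=float)  # spread-out init
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        p = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
            / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return np.sort(mu)

# Two clusters near the x coordinates of the two targets around frame 50
samples = np.concatenate([np.random.default_rng(1).normal(246, 2, 500),
                          np.random.default_rng(2).normal(266, 2, 500)])
means = em_two_gaussians(samples)   # recovered component means
```

The recovered means play the role of the two target coordinates extracted from the fitted curve.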
The above is only a preferred embodiment of the present invention and is not intended to limit the protection scope of the present invention.

Claims (1)

1. A multi-target tracking method based on variable processing windows and variable coordinate systems, characterized in that the method comprises the following concrete steps:
Step 1: initialization processing;
Set the parameter k as the frame-number index. The image-block construction method is as follows: set the number of image frames in each space-time processing block to T and let τ denote the frame to be tracked within the block; in the (τ-1)-th frame image, initialize the position, speed and scale of each target, the Kalman filter residual covariance matrix, and the means, variances and weights of the EM algorithm; here 2 ≤ τ ≤ T-1;
Step 2: construct the space-time processing blocks for the k-th frame image;
Take the k-(τ-1), k-(τ-1)+1, ..., k, k+1, ..., k+(T-τ) frame images to form the video stream of the k-th frame image. From the position and speed of each target in the (k-1)-th frame image, estimate the positions of all targets in the k-th frame image:
$$(x_1^k,y_1^k)=(x_1^{k-1},y_1^{k-1})+(vx_1^{k-1},vy_1^{k-1})$$
$$(x_2^k,y_2^k)=(x_2^{k-1},y_2^{k-1})+(vx_2^{k-1},vy_2^{k-1})$$
$$\cdots$$
$$(x_n^k,y_n^k)=(x_n^{k-1},y_n^{k-1})+(vx_n^{k-1},vy_n^{k-1})$$
where n is the number of tracked targets.
Calculate the absolute values of the differences of the horizontal and vertical coordinates between target i and target j:

$$dx_{ij}=|x_i-x_j|$$
$$dy_{ij}=|y_i-y_j|$$
Set the width of the processing window to d_1. Find the target point m_1 with the smallest label; all target points (x_{j_1}, y_{j_1}) whose distances to m_1 satisfy dx_{m_1 j_1} < d_1/2 and dy_{m_1 j_1} < d_1/2, with j_1 = 1, 2, ..., m_1-1, m_1+1, ..., n, form a set C_1, and the target point m_1 is called the centre point of this set. After rejecting m_1, find the target point m_2 with the smallest label among the remaining target points; all target points (x_{j_2}, y_{j_2}) whose distances to m_2 satisfy dx_{m_2 j_2} < d_1/2 and dy_{m_2 j_2} < d_1/2, with j_2 = 1, 2, ..., m_2-1, m_2+1, ..., n, form a set C_2, with m_2 as its centre point. By analogy, every target point finds the set it belongs to. On each frame image in the video stream of the k-th frame image, intercept a processing window of side length d_1 centred on each set's centre position; these windows form the space-time processing blocks of the k-th frame image, so that n targets yield L space-time processing blocks, where 1 ≤ L ≤ n;
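The set construction of Step 2 can be sketched as follows. This is a non-limiting illustration in which only set membership is computed (the window interception over the video stream is omitted); overlapping sets are permitted, consistent with Step 5's note that a target may fall into several sets simultaneously.

```python
import numpy as np

def build_sets(points, d1):
    """Sketch of the Step-2 set construction: take the smallest-label
    point not yet assigned to any set as a centre, and collect every
    target point lying within d1/2 of it in both x and y."""
    points = np.asarray(points, dtype=float)
    uncovered = set(range(len(points)))
    sets = []
    while uncovered:
        m = min(uncovered)                       # centre of the new set
        near = np.all(np.abs(points - points[m]) < d1 / 2, axis=1)
        members = np.flatnonzero(near).tolist()  # the set C centred on m
        sets.append((m, members))
        uncovered -= set(members)
    return sets
```

Each returned pair (centre, members) then defines one processing window of side length d1 on every frame of the video stream.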
Step 3: compute the position energy density of the space-time processing blocks constructed in Step 2;
Substitute the motion scale a_{k-1}, speed c_{k-1} and angle θ_{k-1} of the target in the (k-1)-th frame image into the wavelet function

$$\psi_w(\vec{\xi},t)=a^{-\frac{3}{2}}\psi\!\left(a^{-1}c^{-\frac{1}{3}}r_{-\theta}(\vec{\xi}-\vec{b}),\,a^{-1}c^{\frac{2}{3}}(t-\tau)\right),\quad r_{-\theta}=\begin{bmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{bmatrix},\ \vec{b}=(b_x,b_y),$$

and apply the continuous wavelet transform with this wavelet function to all space-time processing blocks constructed in Step 2 to obtain the position energy density of all space-time processing blocks:

$$\varepsilon^{(1)}_{(a_{k-1},c_{k-1},\theta_{k-1},\tau)}(\vec{b}_k)=\frac{1}{\left(a_{k-1}^{\tau-1}\right)^4}\left|\left\langle\psi_{(a_{k-1},c_{k-1},\theta_{k-1},\tau)}\,\middle|\,B_k\right\rangle\right|^2$$
In the formula, the symbols are as follows:
τ: the position within the space-time processing block of the frame image to be tracked, the same as τ in Step 1;
a_{k-1}: the motion scale of the target in the (k-1)-th frame image;
c_{k-1}: the speed of the target in the (k-1)-th frame image;
θ_{k-1}: the angle of the target in the (k-1)-th frame image;
$\vec{b}_k=(b_x,b_y)$: the position coordinates of a pixel;
B_k: the space-time processing block of the k-th frame image;
$\psi_{(a_{k-1},c_{k-1},\theta_{k-1},\tau)}$: the wavelet function;
$\varepsilon^{(1)}(\vec{b}_k)$: the position energy density at position $\vec{b}_k$ in the k-th frame image;
$\langle\psi|B_k\rangle$: the continuous wavelet transform of the space-time processing block B_k with the wavelet function;
Step 4: extract the location parameters of all targets in the k-th frame image;
Perform a dimensionality-reducing projection of the two-dimensional position energy density obtained in Step 3 and extract the location parameters of the targets with the Nelder-Mead simplex search algorithm or the EM algorithm. According to the number of targets in each processing block there are two cases:
① if the target number in the image processing block is m_i = 1, no projection is needed; directly use the Nelder-Mead simplex search algorithm to find the maximum position of the two-dimensional energy density as the location parameter of this target;
② if the target number in the image processing block is m_i ≥ 2, first project the two-dimensional position energy density to one dimension to obtain a one-dimensional energy density. Then count the number n_1 of targets for which the angle θ between the x axis and the line joining the target to any other target satisfies π/8 ≤ θ ≤ 3π/8, and the number n_2 = m_i - n_1 of targets that do not satisfy this condition. The projection direction is chosen according to the relative magnitudes of n_1 and n_2, again in two cases: when n_1 ≥ n_2, project onto the x axis and the y axis to obtain the x-direction and y-direction position energy densities, then fit the x-direction and y-direction position energy densities with the EM algorithm to obtain the x and y coordinate values of all targets in this image block; when n_1 < n_2, first rotate the original rectangular coordinate system counterclockwise by π/4 to form a new coordinate system, project the two-dimensional position energy density onto the x and y axes of the new coordinate system, fit the resulting x-direction and y-direction position energy densities under the new coordinate system with the EM algorithm to obtain the coordinates (x′, y′) of the targets in the new coordinate system, and finally obtain the coordinates of (x′, y′) under the original coordinate system by the coordinate transform
$$\begin{bmatrix}x\\y\end{bmatrix}=\begin{bmatrix}\cos(\pi/4)&-\sin(\pi/4)\\\sin(\pi/4)&\cos(\pi/4)\end{bmatrix}\begin{bmatrix}x'\\y'\end{bmatrix};$$
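The coordinate transform back from the rotated system can be checked with a short sketch (the function name is an illustrative choice):

```python
import numpy as np

def to_original_frame(xp, yp):
    """Map (x', y') in the pi/4-rotated system back to (x, y) in the
    original system, per the coordinate-transform matrix of Step 4."""
    c, s = np.cos(np.pi / 4), np.sin(np.pi / 4)
    R = np.array([[c, -s], [s, c]])
    return R @ np.array([xp, yp])
```

For example, a point lying on the rotated x' axis maps back onto the original system's diagonal, as expected for a π/4 rotation.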
Step 5: match the targets in each processing block with the location parameters to obtain the position observation Z(k+1) of each target;
The measured horizontal and vertical coordinate parameters x_1, x_2, y_1, y_2 give 4 possible combinations: (x_1, y_1), (x_1, y_2), (x_2, y_1), (x_2, y_2); these 4 location points are assigned to the 2 targets by the method of data association. From the image-block construction method of Step 1 it follows that a target may belong to several sets simultaneously, in which case several groups of position observations are obtained for it; for such targets, the mean of all its observations is taken;
Step 6: calculate the target speed parameters with the gain-weighted Kalman filtering algorithm to reduce the computation load, and correct the position observation of each target to strengthen the accuracy and robustness of the tracking;
Suppose the state vector is X(k) = [x_k, y_k, vx_k, vy_k]′. First estimate the state vector of the target in the k-th frame image from the system equation; to simplify the computation, a uniform rectilinear motion model is adopted, i.e.:

$$X(k)=\begin{bmatrix}1&0&1&0\\0&1&0&1\\0&0&1&0\\0&0&0&1\end{bmatrix}X(k-1)+W(k-1),$$

where W(k-1) is zero-mean process noise with covariance

$$Q=\begin{bmatrix}1/4&0&1/2&0\\0&1/4&0&1/2\\1/2&0&1&0\\0&1/2&0&1\end{bmatrix}.$$
This gives the position prediction X(k|k-1) of the target in the k-th frame image. Calculate the distance between the prediction and the observation,

$$d=\sqrt{(Z_x(k)-X_x(k|k-1))^2+(Z_y(k)-X_y(k|k-1))^2},$$

and determine the weight ρ of the gain in the Kalman filtering algorithm according to

$$\rho=\begin{cases}1,&d\le e\\ \rho_1,&d>e\end{cases}$$

where 0 < ρ_1 < 1;
Multiply the calculated Kalman gain by the gain weight and substitute it as the final gain into the fundamental Kalman filtering equations to obtain the final state vector value X(k); this yields the speed of the target in the k-th frame image and corrects the position observation obtained in Step 5;
Step 7: calculate the motion scale of each target in the k-th frame image;
Substitute the speed c_k and angle θ_k of the target in the k-th frame image into the wavelet function

$$\psi_w(\vec{\xi},t)=a^{-\frac{3}{2}}\psi\!\left(a^{-1}c^{-\frac{1}{3}}r_{-\theta}(\vec{\xi}-\vec{b}),\,a^{-1}c^{\frac{2}{3}}(t-\tau)\right),\quad r_{-\theta}=\begin{bmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{bmatrix},\ \vec{b}=(b_x,b_y),$$

and apply the continuous wavelet transform with this wavelet function to the space-time processing block constructed in Step 2 to obtain the one-dimensional scale energy density:

$$\varepsilon^{(2)}_{(c_k,\theta_k,\tau)}(a_k)=\frac{1}{a_k^4}\sum_{\vec{b}_k\in\Phi}\left|\left\langle\psi_{(a_k,c_k,\theta_k,\vec{b}_k,\tau)}\,\middle|\,B_k\right\rangle\right|^2$$
In the formula, the symbols are as follows:
τ: the position within the space-time processing block of the frame image to be tracked, the same as τ in Step 1;
a_k: the motion scale of the target in the k-th frame image;
c_k: the speed of the target in the k-th frame image;
θ_k: the angle of the target in the k-th frame image;
$\vec{b}_k=(b_x,b_y)$: the position coordinates of a pixel;
B_k: the space-time processing block of the k-th frame image;
$\psi_{(a_k,c_k,\theta_k,\vec{b}_k,\tau)}$: the wavelet function;
$\varepsilon^{(2)}(a_k)$: the scale energy density in the k-th frame image;
$\langle\psi|B_k\rangle$: the continuous wavelet transform of the space-time processing block B_k with the wavelet function;
For a processing block containing only one target, use the Nelder-Mead simplex search algorithm to find the maximum position of the scale energy density as the motion scale value of this target; for a processing block containing several targets, take the means obtained by the EM fitting algorithm as the scale values of the targets, and then match the scale measurements to the targets by the method of data association;
At this point the positions, kinematic parameters and scale parameters of the targets in the k-th frame image have all been obtained; tracking then continues on the targets in the subsequent frames until the tracking finishes.
CN201210146295.4A 2012-05-11 2012-05-11 Multi-target tracking method based on variable processing windows and variable coordinate systems Expired - Fee Related CN102708382B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210146295.4A CN102708382B (en) 2012-05-11 2012-05-11 Multi-target tracking method based on variable processing windows and variable coordinate systems

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210146295.4A CN102708382B (en) 2012-05-11 2012-05-11 Multi-target tracking method based on variable processing windows and variable coordinate systems

Publications (2)

Publication Number Publication Date
CN102708382A CN102708382A (en) 2012-10-03
CN102708382B true CN102708382B (en) 2014-01-29

Family

ID=46901125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210146295.4A Expired - Fee Related CN102708382B (en) 2012-05-11 2012-05-11 Multi-target tracking method based on variable processing windows and variable coordinate systems

Country Status (1)

Country Link
CN (1) CN102708382B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104077783A (en) * 2013-03-25 2014-10-01 日电(中国)有限公司 Method and device for tracking object in video
CN103955947A (en) * 2014-03-21 2014-07-30 南京邮电大学 Multi-target association tracking method based on continuous maximum energy and apparent model
CN106846402B (en) * 2017-01-04 2019-07-12 北京环境特性研究所 The method that scattering center in a kind of pair of multiframe ISAR image is associated
CN107680375B (en) * 2017-09-29 2020-07-17 深圳市易成自动驾驶技术有限公司 Vehicle load calculation method and device and storage medium
CN108280846B (en) * 2018-01-16 2020-12-29 中国科学院福建物质结构研究所 Target tracking correction method and device based on geometric figure matching
CN110688873A (en) * 2018-07-04 2020-01-14 上海智臻智能网络科技股份有限公司 Multi-target tracking method and face recognition method
CN111080674B (en) * 2019-12-18 2023-11-14 上海无线电设备研究所 Multi-target ISAR key point extraction method based on Gaussian mixture model
CN113111890B (en) * 2021-04-08 2022-09-27 哈尔滨工程大学 Remote water surface infrared target rapid tracking method based on water antenna
CN113077492A (en) * 2021-04-26 2021-07-06 北京华捷艾米科技有限公司 Position tracking method, device, equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002074368A (en) * 2000-08-25 2002-03-15 Matsushita Electric Ind Co Ltd Moving object recognizing and tracking device
CN101916368A (en) * 2010-08-20 2010-12-15 中国科学院软件研究所 Multiwindow-based target tracking method
CN102156993A (en) * 2011-04-15 2011-08-17 北京航空航天大学 Continuous wavelet transform object tracking method based on space-time processing block

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002074368A (en) * 2000-08-25 2002-03-15 Matsushita Electric Ind Co Ltd Moving object recognizing and tracking device
CN101916368A (en) * 2010-08-20 2010-12-15 中国科学院软件研究所 Multiwindow-based target tracking method
CN102156993A (en) * 2011-04-15 2011-08-17 北京航空航天大学 Continuous wavelet transform object tracking method based on space-time processing block

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Wang Rui et al. "A spatio-temporal filtering method for motion estimation". 2011 6th International Conference on Computer Science & Education. 2011, full text. *
Ye Yalin et al. "Research on feature point tracking algorithm based on Gabor transform". Application Research of Computers. 31 Aug. 2009, Vol. 26, No. 8, full text. *

Also Published As

Publication number Publication date
CN102708382A (en) 2012-10-03

Similar Documents

Publication Publication Date Title
CN102708382B (en) Multi-target tracking method based on variable processing windows and variable coordinate systems
CN110335337B (en) Method for generating visual odometer of antagonistic network based on end-to-end semi-supervision
Krull et al. 6-dof model based tracking via object coordinate regression
CN101369346B (en) Tracing method for video movement objective self-adapting window
Wang et al. A region based stereo matching algorithm using cooperative optimization
Deguchi et al. Object tracking by the mean-shift of regional color distribution combined with the particle-filter algorithms
CN104899590A (en) Visual target tracking method and system for unmanned aerial vehicle
CN104200488A (en) Multi-target tracking method based on graph representation and matching
CN113822278B (en) License plate recognition method for unlimited scene
CN104200485A (en) Video-monitoring-oriented human body tracking method
CN105551015A (en) Scattered-point cloud image registering method
CN104700398A (en) Point cloud scene object extracting method
CN102722697A (en) Unmanned aerial vehicle autonomous navigation landing visual target tracking method
CN103927745A (en) Tracking and matching parallel computing method for wearable device
CN108305278B (en) Image matching correlation improvement method in ORB-SLAM algorithm
CN104281148A (en) Mobile robot autonomous navigation method based on binocular stereoscopic vision
CN105096341B (en) Mobile robot position and orientation estimation method based on trifocal tensor and key frame strategy
CN101221238A (en) Dynamic deviation estimation method based on gauss average value mobile registration
CN101739687A (en) Covariance matrix-based fast maneuvering target tracking method
Ji et al. RGB-D SLAM using vanishing point and door plate information in corridor environment
CN102663351A (en) Face characteristic point automation calibration method based on conditional appearance model
CN102982556B (en) Based on the video target tracking method of particle filter algorithm in manifold
CN107886541A (en) Monocular movement object pose method for real-time measurement based on back projection method
Dahal et al. Extended object tracking in curvilinear road coordinates for autonomous driving
Kottath et al. Mutual information based feature selection for stereo visual odometry

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140129

Termination date: 20150511

EXPY Termination of patent right or utility model