KR102026651B1 - A method and system for robust object tracking using particle filter framework - Google Patents

A method and system for robust object tracking using particle filter framework Download PDF

Info

Publication number
KR102026651B1
Authority
KR
South Korea
Prior art keywords
drift value
likelihood function
object likelihood
state
updated
Prior art date
Application number
KR1020147029541A
Other languages
Korean (ko)
Other versions
KR20150006424A (en)
Inventor
Gopalakrishnan Viswanath
Kannan Hariprasad
Anand Balasubramanian
Prati Moogi
Sudha Velusamy
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of KR20150006424A publication Critical patent/KR20150006424A/en
Application granted granted Critical
Publication of KR102026651B1 publication Critical patent/KR102026651B1/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/20 — Analysis of motion
    • G06T7/277 — Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10016 — Video; Image sequence

Abstract

A method and system are disclosed that provide a robust object tracking mechanism using a combination of deterministic and probabilistic approaches in a particle filter framework. The method includes calculating the object likelihood for successive frames using the state parameters of the object being tracked within a short time interval. The method performs trend analysis of the tracking mechanism based on the calculated object likelihood and the state drift value of the object likelihood function in the current frame. To modify the tracking mechanism, the variance of the sampling function is updated based on the state drift value and the object likelihood function.

Description

A method and system for robust object tracking using particle filter framework

Computer vision systems, and particularly particle filter frameworks, are concerned with establishing and modifying object tracking mechanisms. This application claims priority from Indian patent application No. 1637/CHE/2012, filed April 25, 2012, the disclosure of which is incorporated herein by reference.

Image tracking is one of the important areas of research in computer vision. Applications of computer vision that require accurate tracking of objects include video surveillance, augmented reality, human-computer interaction, video analytics, and the like.

In conventional methods, object tracking is generally based on either a deterministic or a stochastic approach. The deterministic approach is computationally efficient, but sensitive to image characteristics such as background clutter, reflections, and occlusion. If the target object is not tracked correctly, the tracking mechanism cannot recover from the failure. The stochastic approach is used for time-series estimation and prediction of the probability distribution of the target object's state. The most commonly used method in this approach is the particle filter framework, which effectively handles nonlinearity in state-space models and accommodates non-Gaussian input and output noise components.

In current scenarios, existing tracking algorithms fail to accurately represent the shape of an object as it undergoes shape changes. Due to this limitation, the robustness and accuracy of the tracking algorithm cannot be corrected or improved while the object is being tracked.

In one conventional method, a subspace-based object shape model known as incremental visual tracking (IVT) uses a particle filter framework for robust tracking based on the condensation algorithm. This method applies a stochastic approach to tracking the movement of an object. The algorithm estimates the state posterior density at time t_n by generating particles from the state transition density P(X_n | X_{n-1}). It provides robust tracking by propagating a set of particles to the next step, t_{n+1}, and creating a set of new particles around the propagated particles. However, because detailed analysis of the deterministic aspects of object movement is ignored in the particle filter framework, the accuracy of the tracked object's shape cannot be adjusted or improved.
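The propagate-and-sample step of the condensation algorithm described above can be sketched as follows. This is a minimal illustration, not the patented method itself; the Gaussian transition model, the state layout, and all numeric values are assumptions made for the example.

```python
import numpy as np

def propagate(particles, drift, sigma, rng):
    """One condensation-style prediction step: draw new particles from the
    state transition density P(X_n | X_{n-1}) by applying a deterministic
    drift and adding Gaussian diffusion noise per state dimension."""
    noise = rng.normal(0.0, sigma, size=particles.shape)
    return particles + drift + noise

rng = np.random.default_rng(0)
# 100 particles over the six affine state parameters
# (X shift, Y shift, scale, aspect ratio, rotation, asymmetry).
particles = np.zeros((100, 6))
drift = np.array([1.0, 0.5, 0.0, 0.0, 0.0, 0.0])      # assumed frame-to-frame motion
sigma = np.array([2.0, 2.0, 0.01, 0.01, 0.02, 0.02])  # assumed diffusion per dimension
particles = propagate(particles, drift, sigma, rng)
```

Each call advances the particle set from t_n to t_{n+1}; in a full tracker a likelihood-weighted resampling step would follow.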

In another conventional method, the accuracy of the object tracking mechanism is measured and corrected while tracking specific parts of the human body, including the hand and fingertips. However, this tracking mechanism can only detect the translations and curvatures of objects, and the robustness of the tracking mechanism must be adjusted as needed.

In light of the above discussion, existing methods fail to provide a combination of deterministic and probabilistic approaches that would yield a more accurate, generalized, and iteratively corrected object tracking mechanism in computer vision systems.

The main purpose of one embodiment is to provide a robust object tracking mechanism through a combination of the deterministic and stochastic approaches within the particle filter framework.

Another objective is to perform trend analysis of particle filter performance at every frame and to initiate corrective action by redesigning or updating state drift values as needed.

Another object is to provide a way to iteratively modify the tracker's performance.

According to one aspect of the invention, a method for providing object tracking in a video includes formulating an object likelihood function, determined for at least one state parameter within a short time interval, for a set of consecutive frames in the video. The method also includes determining the degree of concavity of the object likelihood function when the function is concave, and updating the state drift value. The method further includes updating the variance of the sampling function based on the updated state drift value and the principal curvature direction of the object likelihood function.

Accordingly, a computer program device for providing object tracking in a video is provided. The device includes an integrated circuit comprising at least one processor, and at least one memory having computer program code within the circuit. The at least one memory and the computer program code are configured, with the at least one processor, to cause the device to formulate an object likelihood function, determined for at least one state parameter within a short time interval, for a set of consecutive frames in the video; to determine the degree of concavity of the object likelihood function when the function is concave; and to update the state drift value and the variance of the sampling function based on the updated state drift value and the principal curvature direction of the object likelihood function.

These and other aspects of the embodiments will be better understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following detailed description, while indicating preferred embodiments and numerous specific details, is given by way of illustration and not of limitation. Various changes and modifications may be made within the scope of the embodiments without departing from the disclosed technical spirit, and the embodiments include all such modifications.

In the particle filter framework, a combination of deterministic and probabilistic approaches can make the object tracking mechanism robust. Trends in particle filter performance can also be analyzed at every frame, and the behavior of the tracker can be modified iteratively.

BRIEF DESCRIPTION OF THE DRAWINGS Embodiments of the invention are described with reference to the accompanying drawings, in which like reference characters designate corresponding parts throughout the various figures. The embodiments will be better understood from the following description with reference to the drawings.
FIG. 1 illustrates object likelihood estimation for successive frames, according to one embodiment.
FIG. 2 is a flowchart illustrating improvement of the tracking process and of the accuracy of an object tracking mechanism, according to an exemplary embodiment.
FIG. 3 illustrates a computing environment for implementing a method for robust object tracking using a particle filter framework, according to one embodiment.

The embodiments of the present invention, and their various features and advantageous details, are explained more fully with reference to the non-limiting examples illustrated in the drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as not to unnecessarily obscure the embodiments. The examples herein are intended merely to aid those skilled in the art in understanding how to practice the embodiments.

The embodiments combine a deterministic model with the probabilistic framework of particle filters to achieve a method and system for providing robust object tracking. Based on the tracking mechanism's variables, the particle filter iteratively applies appropriate corrective action to improve the accuracy of the tracking mechanism.

The object likelihood for successive frames is determined based on a multivariate function of the state parameters, which vary smoothly within short time intervals.

Sudden occlusions of a tracked object are handled by measuring the rate of change of the object likelihood across successive frames of the video and mapping that change to a logistic regression function.
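As a sketch of how such a mapping might look, the drop in likelihood between consecutive frames can be squashed through a logistic function. The slope `k` and the `threshold` below are illustrative assumptions, not values from this disclosure.

```python
import math

def occlusion_score(o_prev, o_curr, k=10.0, threshold=0.5):
    """Map the drop in object likelihood between two consecutive frames
    to (0, 1) with a logistic function; scores near 1 suggest a sudden
    occlusion. k (slope) and threshold are illustrative tuning constants."""
    drop = o_prev - o_curr  # positive when the likelihood falls
    return 1.0 / (1.0 + math.exp(-k * (drop - threshold)))

gradual = occlusion_score(0.90, 0.85)  # small drop: low occlusion score
sudden = occlusion_score(0.90, 0.10)   # sharp drop: high occlusion score
```

A tracker could compare the score against a cutoff to decide whether to skip trend analysis for the current frame.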

Object likelihood is a scalar value representing the change in the object captured by the current state estimate.

The state parameter model is a posterior distribution of object states. The object is tracked using six affine parameters. The six state parameters include, but are not limited to, X-axis translation, Y-axis translation, scale, aspect ratio, rotation angle, and asymmetry angle.

The term multivariate function defines a function for evaluating object likelihood in successive frames using state parameters.

Referring now to FIGS. 1 through 3, in which like reference numerals denote like elements throughout, preferred embodiments are shown.

FIG. 1 illustrates object likelihood estimation for successive frames, according to one embodiment. The object likelihood is considered a time-invariant function of the state parameters over a short period of time. The state parameters include, but are not limited to, X-axis translation, Y-axis translation, scale, aspect ratio, rotation angle, and asymmetry angle.

The particle filter framework iteratively estimates the most likely object state by tracking the object using the state parameters. Referring to FIG. 1, three consecutive frames f_n-2, f_n-1, and f_n are considered according to an embodiment. For each frame, the particle filter framework provides a scalar measurement of the object likelihood O_i using the multivariate function f(X) of the state parameters. The object likelihoods for frames f_n-2, f_n-1, and f_n are represented by (O_1, X_1), (O_2, X_2), and (O_3, X_3), respectively, where X_1, X_2, and X_3 represent the state parameters of the object. As indicated by the dotted line in FIG. 1, the object likelihood of frame f_n+1 drifts from the pattern of motion formulated and calculated by the particle filter framework. Trend analysis of the tracking algorithm is performed using the second derivative of the function f(X). The state drift value and the second derivative are calculated by the particle filter framework to modify the tracking mechanism for frame f_n+1.
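In one dimension, the trend analysis over three consecutive likelihood measurements reduces to a finite-difference estimate of the second derivative. The sketch below assumes unit frame spacing and is only a scalar illustration of the multivariate analysis; the sample likelihood values are invented.

```python
def likelihood_trend(o1, o2, o3):
    """Central finite-difference estimate of the second derivative of the
    object likelihood over three consecutive frames (unit spacing).
    Negative: concave behaviour (growth flattening or falling);
    positive: convex behaviour (growth accelerating)."""
    return o3 - 2.0 * o2 + o1

concave = likelihood_trend(0.6, 0.8, 0.7)  # growth stalls, then falls
convex = likelihood_trend(0.5, 0.6, 0.9)   # growth accelerates
```

In the full method the same idea is carried out jointly over all six state dimensions via the Hessian matrix.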

FIG. 2 is a flowchart illustrating improvement of the tracking process and of the accuracy of an object tracking mechanism, according to an exemplary embodiment. The deterministic approach is combined with the probabilistic approach of the particle filter framework to calculate the object likelihood for each frame. The object is tracked using its six state parameters in each frame. Referring to flowchart 200, the object state parameters are initialized to calculate the object likelihood using the particle filter framework (201). The particle filter framework predicts the most likely object state of the current frame based on the object likelihood state parameters of the previous frame (202). The object likelihood is then obtained based on the current state and observation (203). It is determined whether there is a sudden occlusion of the tracked object (204). A predicted occlusion is mapped to the logistic regression function. When the object likelihood calculation does not predict an occlusion, trend analysis is performed, and the object likelihood is analyzed using a Hessian matrix in the current frame.

The following equation defines the Hessian matrix H(f), which encodes the second partial derivatives of the function f(X) with respect to the state variables:

H(f)_ij = ∂²f / (∂x_i ∂x_j), for i, j = 1, …, 6

From the matrix above, the partial derivatives of the object likelihood with respect to the state parameters of the current frame are calculated over short time intervals. The Hessian matrix of the object likelihood function is analyzed to estimate the trend based on its eigenvalues (205). The direction of the object likelihood function is also determined from the Hessian matrix. The maximum eigenvalue of the Hessian matrix determines the principal curvature value of the object likelihood in the current frame. Based on the direction of the principal curvature, it is determined whether the object likelihood function exhibits concave behavior (negative semi-definite) or convex behavior (positive semi-definite) (206). Concave behavior indicates a decreasing trend of the object likelihood function; convex behavior indicates an increasing trend. If the likelihood function is determined to be concave, the state drift value for the concave behavior is obtained based on the principal curvature of the object likelihood function in the current frame (207). The principal curvature direction of the object likelihood is determined using the eigenvector corresponding to the maximum eigenvalue of the Hessian matrix in the current frame. The state drift value drifts in the direction opposite to the direction of the principal curvature. For example, consider the direction of principal curvature E = [e_1, e_2, e_3, e_4, e_5, e_6] and a state drift value vector X = [x_1, x_2, x_3, x_4, x_5, x_6]. The directions of the two vectors along the different dimensions of the state parameters are used to update the state drift value for the current frame. If the components e_1 and x_1 point in the same direction, i.e., in the direction of the principal curvature, the sign of the corresponding state drift component is inverted, and the state drift value x_1 is scaled by the absolute value of e_1, which represents the strength of the principal curvature along that dimension. For state drift components that do not share the direction of the principal curvature, the method scales them by the value of the corresponding eigenvector component without inverting their sign.
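The per-dimension drift update can be sketched as follows. The 2-D Hessian and drift values are toy numbers (the disclosure uses six state dimensions), and the sign/scale rule simply follows the description above: components aligned with the principal curvature are sign-inverted, and every component is scaled by the magnitude of the corresponding eigenvector entry.

```python
import numpy as np

def update_drift(hessian, drift):
    """Update the state drift using the eigenvector of the Hessian's
    maximum eigenvalue (the principal curvature direction): drift
    components pointing the same way as the curvature are sign-flipped,
    and all components are scaled by |e_i|, the curvature strength
    along the corresponding dimension."""
    eigvals, eigvecs = np.linalg.eigh(hessian)
    principal = eigvecs[:, np.argmax(eigvals)]  # principal curvature direction
    concave = eigvals.max() < 0                 # negative definite -> concave trend
    same_sign = np.sign(principal) == np.sign(drift)
    flipped = np.where(same_sign, -drift, drift)
    return flipped * np.abs(principal), concave

# Toy 2-D example with a negative-definite (concave) Hessian.
H = np.array([[-2.0, 0.5],
              [0.5, -1.0]])
drift = np.array([0.4, -0.3])
new_drift, concave = update_drift(H, drift)
```

Because the eigenvector is unit length, the scaling never increases the magnitude of a drift component; it dampens each one in proportion to the local curvature strength.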

In addition, the sampling function variance is constructed based on the principal curvature of the object likelihood in the current frame and on the previously estimated state drift value (208). If the direction of the state drift value is the same as the direction of the eigenvector, sampling is performed at a low density because system reliability is low. If the direction of the state drift value is opposite to the direction of the eigenvector, sampling is performed at a high density because system reliability is high. Based on the state drift value, new particles for the current frame are created using the new state drift value and the appropriate sampling function variance (209). The object likelihood state is then obtained based on the current frame (210). After modifying the state drift value and the sampling function variance, the particle filter framework is repeated at least once for improved tracking performance.
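Steps 208 and 209 can be sketched as below. The specific variance scaling (double when the drift is aligned with the principal curvature, halve when it opposes it) is an illustrative assumption; the disclosure specifies only lower-density sampling for low-confidence dimensions and higher-density sampling for high-confidence ones.

```python
import numpy as np

def sample_new_particles(mean_state, drift, principal, base_var, n, rng):
    """Draw n particles around mean_state + drift. Per dimension, if the
    drift points along the principal curvature, confidence is low and the
    variance is widened (sparser sampling); if it opposes the curvature,
    confidence is high and the variance is tightened (denser sampling)."""
    aligned = np.sign(drift) == np.sign(principal)
    var = np.where(aligned, base_var * 2.0, base_var * 0.5)  # assumed factors
    return rng.normal(mean_state + drift, np.sqrt(var), size=(n, mean_state.size))

rng = np.random.default_rng(1)
mean_state = np.zeros(6)
drift = np.array([0.5, -0.2, 0.0, 0.0, 0.1, 0.0])      # assumed updated drift
principal = np.array([0.6, 0.3, 0.1, -0.1, -0.2, 0.05]) # assumed curvature direction
particles = sample_new_particles(mean_state, drift, principal,
                                 np.full(6, 0.04), 200, rng)
```

The resulting particle cloud is wider along dimensions where the tracker's confidence is low, which matches the low-density sampling described above.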

FIG. 3 illustrates a computing environment for implementing a method for robust object tracking using a particle filter framework, according to one embodiment. Referring to FIG. 3, the computing environment 301 includes at least one processing unit 304 having a control unit 302 and an arithmetic logic unit (ALU) 303, a memory 305, a storage unit 306, a plurality of network devices 308, and a plurality of input/output devices 307. The processing unit 304 is responsible for processing the instructions of the algorithm; it receives commands from the control unit, and the logical and arithmetic operations involved in executing the instructions are computed with the aid of the ALU 303.

The overall computing environment 301 may be composed of multiple homogeneous and/or heterogeneous cores, multiple CPUs of different kinds, special-purpose media, and other accelerators. The processing unit 304 is responsible for processing the instructions of the algorithm. The plurality of processing units 304 may be located on a single chip or across multiple chips.

The algorithm, comprising the instructions and code required for its implementation, is stored in the memory 305, the storage 306, or both. At execution time, the instructions are fetched from the memory 305 or the storage 306 and executed by the processing unit 304.

In the case of a hardware implementation, various network devices 308 or external input/output devices 307 may be connected to the computing environment through the network unit and the I/O device unit to support the implementation.

The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the components. The components shown in FIG. 3 include blocks that may be at least one hardware device or a combination of hardware devices and software modules.

The foregoing description of the specific embodiments so fully reveals the general nature of the embodiments that others can, by applying current knowledge, readily modify and/or adapt them for various applications without departing from the generic concept; therefore, such adaptations and modifications should be understood to fall within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Thus, those skilled in the art will appreciate that the embodiments can be practiced in other specific forms without departing from the disclosed technical spirit or essential features.

301: computing environment 302: control unit
303: ALU 304: processing unit
305: memory 306: storage device
307: input and output device 308: network device

Claims (17)

A method for providing object tracking in a video, the method comprising:
Formulating an object likelihood function, determined for at least one state parameter within a short time interval, for a set of consecutive frames in the video;
If the object likelihood function is concave, determining a degree of concavity of the object likelihood function and updating a state drift value; and
Updating the variance of a sampling function based on the updated state drift value and the principal curvature direction of the object likelihood function.
The method of claim 1,
Further comprising tracking the object based on at least one state parameter including at least one of X-axis movement, Y-axis movement, scale, aspect ratio, rotation angle, and asymmetry angle.
The method of claim 1,
Further comprising calculating partial derivatives of the object likelihood function in a Hessian matrix according to at least one state parameter value for a plurality of consecutive frames.
The method of claim 1,
Further comprising calculating the state drift value based on an eigenvector corresponding to the maximum eigenvalue representing the principal curvature of the object likelihood function.
The method of claim 1,
Further comprising generating at least one new particle for the current frame of the video using the updated state drift value and the updated variance.
The method of claim 1,
Further comprising performing sampling at a low density when the updated state drift value is in the same direction as the principal curvature.
The method of claim 1,
Further comprising performing sampling at a high density when the updated state drift value is opposite to the direction of the principal curvature.
The method of claim 5,
Further comprising repeating particle filtering at least once using the updated state drift value and the updated variance.
delete
An integrated circuit comprising at least one processor;
At least one memory having computer program code within the circuit;
The at least one memory and the computer program code being configured, with the at least one processor, to cause a computer program device to:
Formulate an object likelihood function, determined for at least one state parameter within a short time interval, for a set of consecutive frames in a video;
If the object likelihood function is concave, determine the degree of concavity of the object likelihood function; and
Update a state drift value and update the variance of a sampling function based on the updated state drift value and the principal curvature direction of the object likelihood function, thereby providing object tracking in the video.
The computer program device of claim 10,
Further tracking the object based on at least one state parameter including at least one of X-axis movement, Y-axis movement, scale, aspect ratio, rotation angle, and asymmetry angle.
The computer program device of claim 10,
Further calculating partial derivatives of the object likelihood function in a Hessian matrix according to at least one state parameter value for a plurality of consecutive frames.
The computer program device of claim 10,
Further calculating the state drift value based on an eigenvector corresponding to the maximum eigenvalue representing the principal curvature of the object likelihood function.
The computer program device of claim 10,
Further generating at least one new particle for the current frame of the video using the updated state drift value and the updated variance.
The computer program device of claim 10,
Performing sampling at a low density when the updated state drift value is in the same direction as the principal curvature.
The computer program device of claim 11,
Performing sampling at a high density when the updated state drift value is opposite to the direction of the principal curvature.
The computer program device of claim 14,
Repeating particle filtering at least once using the updated state drift value and the updated variance.
KR1020147029541A 2012-04-25 2013-04-25 A method and system for robust object tracking using particle filter framework KR102026651B1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IN1637CH2012 2012-04-25
IN1637/CHE/2012 2012-04-25
IN1637/CHE/2012 2013-04-09
PCT/KR2013/003587 WO2013162313A1 (en) 2012-04-25 2013-04-25 A method and system for robust object tracking using particle filter framework

Publications (2)

Publication Number Publication Date
KR20150006424A KR20150006424A (en) 2015-01-16
KR102026651B1 true KR102026651B1 (en) 2019-09-30

Family

ID=49483527

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020147029541A KR102026651B1 (en) 2012-04-25 2013-04-25 A method and system for robust object tracking using particle filter framework

Country Status (2)

Country Link
KR (1) KR102026651B1 (en)
WO (1) WO2013162313A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046717B (en) * 2015-05-25 2019-03-19 浙江师范大学 A kind of video object method for tracing object of robustness
CN106408590B (en) * 2016-10-21 2019-03-08 西安电子科技大学 Particle filter method for tracking target based on regression analysis
CN107292918B (en) * 2016-10-31 2020-06-19 清华大学深圳研究生院 Tracking method and device based on video online learning
CN111159924B (en) * 2020-04-02 2020-07-28 上海彩虹鱼海洋科技股份有限公司 Method and apparatus for predicting drift trajectory

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060291696A1 (en) 2005-06-27 2006-12-28 Jie Shao Subspace projection based non-rigid object tracking with particle filters
US20080063236A1 (en) 2006-06-09 2008-03-13 Sony Computer Entertainment Inc. Object Tracker for Visually Tracking Object Motion
US20080166045A1 (en) 2005-03-17 2008-07-10 Li-Qun Xu Method of Tracking Objects in a Video Sequence
US20110058708A1 (en) 2008-03-14 2011-03-10 Sony Computer Entertainment Inc. Object tracking apparatus and object tracking method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6674877B1 (en) * 2000-02-03 2004-01-06 Microsoft Corporation System and method for visually tracking occluded objects in real time

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080166045A1 (en) 2005-03-17 2008-07-10 Li-Qun Xu Method of Tracking Objects in a Video Sequence
US20060291696A1 (en) 2005-06-27 2006-12-28 Jie Shao Subspace projection based non-rigid object tracking with particle filters
US20080063236A1 (en) 2006-06-09 2008-03-13 Sony Computer Entertainment Inc. Object Tracker for Visually Tracking Object Motion
US20110058708A1 (en) 2008-03-14 2011-03-10 Sony Computer Entertainment Inc. Object tracking apparatus and object tracking method

Also Published As

Publication number Publication date
KR20150006424A (en) 2015-01-16
WO2013162313A1 (en) 2013-10-31

Similar Documents

Publication Publication Date Title
US11200696B2 (en) Method and apparatus for training 6D pose estimation network based on deep learning iterative matching
Chen et al. Recovering the missing components in a large noisy low-rank matrix: Application to SFM
EP2813082B1 (en) Head pose tracking using a depth camera
US10410362B2 (en) Method, device, and non-transitory computer readable storage medium for image processing
Kwon et al. A geometric particle filter for template-based visual tracking
US10037624B2 (en) Calibrating object shape
US9361706B2 (en) Real-time optical flow sensor design and its application to obstacle detection
US11514607B2 (en) 3-dimensional reconstruction method, 3-dimensional reconstruction device, and storage medium
US11244506B2 (en) Tracking rigged polygon-mesh models of articulated objects
KR102026651B1 (en) A method and system for robust object tracking using particle filter framework
Opitz Latent Gaussian modeling and INLA: A review with focus on space-time applications
Grigorian et al. Dynamically adaptive and reliable approximate computing using light-weight error analysis
Yoon et al. Interacting multiview tracker
Turnbull et al. Computing distances between NURBS-defined convex objects
Stumpp et al. Harms: A hardware acceleration architecture for real-time event-based optical flow
Grigorian et al. Improving coverage and reliability in approximate computing using application-specific, light-weight checks
US10304258B2 (en) Human feedback in 3D model fitting
Kim et al. Spatio-Temporal Auxiliary Particle Filtering With ℓ1-Norm-Based Appearance Model Learning for Robust Visual Tracking
CN109166138B (en) Target tracking method and device based on high-order cumulant and storage medium
Li et al. A shape tracking algorithm for visual servoing
Zhou et al. DecoupledPoseNet: Cascade decoupled pose learning for unsupervised camera ego-motion estimation
Köstler et al. Multigrid solution of the optical flow system using a combined diffusion‐and curvature‐based regularizer
Tang et al. FPGA implementation of RANSAC algorithm for real-time image geometry estimation
CN112230801A (en) Kalman smoothing processing method, memory and equipment applied to touch trajectory
KR20220086971A (en) Method and apparatus of tracking hand joint

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant