CN115170621A - Target tracking method and system under dynamic background based on relevant filtering framework - Google Patents


Info

Publication number
CN115170621A
Authority
CN
China
Prior art keywords
target
tracking
filtering
background
frame
Prior art date
Legal status
Pending
Application number
CN202210922551.8A
Other languages
Chinese (zh)
Inventor
刘升
曲文峰
Current Assignee
XI'AN KEYWAY TECHNOLOGY CO LTD
Original Assignee
XI'AN KEYWAY TECHNOLOGY CO LTD
Priority date
Filing date
Publication date
Application filed by XI'AN KEYWAY TECHNOLOGY CO LTD filed Critical XI'AN KEYWAY TECHNOLOGY CO LTD
Priority to CN202210922551.8A priority Critical patent/CN115170621A/en
Publication of CN115170621A publication Critical patent/CN115170621A/en
Pending legal-status Critical Current

Classifications

    • G Physics
    • G06 Computing; calculating or counting
    • G06T Image data processing or generation, in general
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/269 Analysis of motion using gradient-based methods


Abstract

A target tracking method and system under a dynamic background based on a correlation filtering framework, belonging to the technical field of image target tracking. The method combines a correlation filtering tracking algorithm with corner detection, sparse optical flow estimation, random sample consensus and a Kalman filtering algorithm, decomposes the target pixel displacement in a video frame into the relative motion between target and background plus the background motion, and estimates and models each separately, thereby achieving stable tracking of the target under a dynamic background. The method and system can effectively improve the robustness and precision of small-target tracking under a dynamic background and broaden the application range of target tracking algorithms.

Description

Method and system for tracking a target under a dynamic background based on a correlation filtering framework
Technical Field
The invention belongs to the technical field of image target tracking, and particularly relates to a target tracking method and a target tracking system under a dynamic background based on a correlation filtering framework.
Background
Target tracking is an important problem in the field of computer vision and is widely applied in military guidance, security monitoring, unmanned aerial vehicles, robotics and other fields. In general, a robust model is constructed from the target and the background information in a video to predict the motion state of the target, such as its shape, size, position and trajectory.
Target tracking algorithms have evolved gradually from early classical algorithms to algorithms based on correlation filtering and then to algorithms based on deep learning. The accuracy and efficiency of the early classical algorithms cannot meet current application requirements, so correlation filtering and deep learning approaches have become mainstream in recent years. Although deep-learning-based trackers achieve higher accuracy, lightweight variants remain little studied, and the existing lightweight implementations show no advantage in efficiency or accuracy over the correlation filtering framework; moreover, such models place heavy demands on training-set scale and platform computing power, which makes deployment on embedded devices difficult. With the continuous iterative evolution of correlation filtering algorithms, in particular the effective handling of boundary effects and the combination with deep features, tracking accuracy and robustness have improved markedly; a balance between accuracy and efficiency under different task requirements can be achieved through different feature selections and optimization strategies, and application and deployment are more flexible.
However, for tracking algorithms based on either correlation filtering or deep learning, the search range is proportional to the size of the target. When the target is small, camera disturbance can introduce background motion comparable to the target size, which strongly degrades algorithm performance, for example in application scenarios where a vehicle-mounted, airborne or shipborne narrow-field-of-view camera performs long-range detection. In such small-target, dynamic-background scenes, the tracking precision and robustness of conventional tracking algorithms drop markedly, and when the disturbance is large the target is easily lost and tracking becomes unstable.
Disclosure of Invention
The invention aims to solve the above problems and provides a method and a system for tracking a target under a dynamic background based on a correlation filtering framework.
In a first aspect, the invention provides a method for tracking a target under a dynamic background based on a correlation filtering framework, which combines a correlation filtering tracking algorithm with corner detection, sparse optical flow estimation, random sample consensus and a Kalman filtering algorithm, decomposes the target pixel displacement in a video frame into the relative motion between target and background plus the background motion, and estimates and models each separately, thereby achieving stable tracking of the target under a dynamic background.
Further, the method for tracking the target under the dynamic background based on the correlation filtering framework comprises the following steps:
1) According to the target position and scale in the target frame-selection information, cut out from the current frame an image block centered on the target that also contains background; extract features from the image block as a sample and initialize the correlation filtering tracker; meanwhile, initialize a Kalman filter according to the target position in the target frame-selection information;
2) Apply mean filtering to the current frame, then sample it at a preset scale and cache it, and extract feature points through a corner detection algorithm; the extracted feature points are mainly global background points;
3) Acquire the next frame, apply mean filtering and sample at the same preset scale, then perform optical flow estimation with an optical flow algorithm using the cached frame and the corresponding feature points, obtaining the displacement of the feature points between the two frames;
4) Screen the feature points with a random sample consensus algorithm, filtering out residual foreground points and fitting an affine model of the background motion from the background-point optical flow; the disclosed algorithm mainly targets small-target tracking, in which background points form the overwhelming majority and the target is far from the camera, so the random sample consensus algorithm can effectively filter out the foreground and accurately estimate the affine matrix representing the background motion;
5) Project the target position of the previous frame into the current frame by the affine transformation, and use the resulting target displacement caused by the background motion as the Kalman filter control quantity for position prediction, obtaining a predicted position;
6) Take the predicted position, combined with the target scale information of the previous frame, as the search center, and obtain the current position and scale of the target through the correlation filtering tracker; correct the position information, used as an observation, through the Kalman filter;
7) Re-collect sample features in the current frame according to the corrected position and scale information and update the correlation filtering tracker;
8) Repeat steps 3)-7) to track the target continuously.
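The per-frame loop of steps 3)-7) can be sketched as follows. This is a minimal illustrative skeleton, not the patented implementation: the corner detection, optical flow, RANSAC and correlation filtering components are passed in as stand-in callables (the names `estimate_bg_shift` and `correlation_search` are hypothetical), and the Kalman prediction/correction is reduced to a constant-velocity blend.

```python
def track(frames, init_pos, estimate_bg_shift, correlation_search):
    """Per-frame loop: predict through background motion, search, then blend."""
    pos = init_pos
    vel = (0.0, 0.0)  # crude estimate of the target's motion relative to the background
    out = [pos]
    for prev, curr in zip(frames, frames[1:]):
        # steps 2)-4): feature points + optical flow + RANSAC yield the background shift
        dxb, dyb = estimate_bg_shift(prev, curr)
        # step 5): project the last position through the background motion and
        # add the relative-motion prediction (Kalman predict stand-in)
        pred = (pos[0] + dxb + vel[0], pos[1] + dyb + vel[1])
        # step 6): the correlation tracker searches around the predicted center
        meas = correlation_search(curr, pred)
        # Kalman correction stand-in: blend prediction and measurement
        new = (0.5 * (pred[0] + meas[0]), 0.5 * (pred[1] + meas[1]))
        # step 7): update the relative-motion estimate (background displacement removed)
        vel = (new[0] - pos[0] - dxb, new[1] - pos[1] - dyb)
        pos = new
        out.append(pos)
    return out
```

With exact stand-ins (a constant background shift and a measurement that returns the true position), the loop locks onto a target drifting at a constant rate within a few frames.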
Further, in the method for tracking a target under a dynamic background based on a correlation filtering framework, the process of initializing the correlation filtering tracker includes: reading the video frame corresponding to the target frame-selection information as the initialization frame $I_0$; according to the target position and scale in the target frame-selection information, extracting gradient and color features from a region of $I_0$ centered on the target and containing background, as the target sample $x$; and, with the target sample $x$ extracted from the initialization frame $I_0$, initializing the correlation filtering tracker by solving for the filter $\hat{h}$:

$$\hat{h} = \frac{\hat{y} \odot \hat{x}^{*}}{\hat{x} \odot \hat{x}^{*} + \lambda}$$

where $\hat{y}$ is the desired output response, $\odot$ denotes element-wise multiplication between matrix elements, ${}^{*}$ denotes the matrix complex conjugate, $\lambda$ is a regularization term, and hats denote the discrete Fourier transform.
In a second aspect, the invention provides a system for tracking a target under a dynamic background based on a correlation filtering framework, comprising an initialization module, a feature extraction module and a loop tracking module;
the initialization module is used for cutting out from the current frame, according to the target position and scale in the target frame-selection information, an image block centered on the target that also contains background; extracting features from the image block as a sample and initializing the correlation filtering tracker; and meanwhile initializing a Kalman filter according to the target position in the target frame-selection information;
the feature extraction module is used for applying mean filtering to the current frame, then sampling it at a preset scale and caching it, and extracting feature points through a corner detection algorithm;
the loop tracking module is used for acquiring the next frame, sampling it at the same preset scale after mean filtering, then performing optical flow estimation with an optical flow algorithm using the cached frame and the corresponding feature points to obtain the displacement of the feature points between the two frames;
screening the feature points with a random sample consensus algorithm, filtering out residual foreground points and fitting an affine model of the background motion from the background-point optical flow; projecting the target position of the previous frame into the current frame by the affine transformation, and using the resulting target displacement caused by the background motion as the Kalman filter control quantity for position prediction to obtain a predicted position;
taking the predicted position, combined with the target scale information of the previous frame, as the search center, and obtaining the current position and scale of the target through the correlation filtering tracker; correcting the position information, used as an observation, through the Kalman filter;
and re-collecting sample features in the current frame according to the corrected position and scale information and updating the correlation filtering tracker.
In a third aspect, the invention provides a target tracking apparatus under a dynamic background based on a correlation filtering framework, comprising a processor and a memory electrically connected to each other; the memory is used for storing a computer program; the processor, when executing the computer program, implements the method for target tracking under a dynamic background based on a correlation filtering framework described in the first aspect.
In a fourth aspect, the invention provides a computer-readable storage medium having a computer program stored thereon; the computer program, when executed, implements the method for target tracking under a dynamic background based on a correlation filtering framework described in the first aspect.
The method and system for tracking a target under a dynamic background based on a correlation filtering framework combine a correlation filtering tracking algorithm with corner detection, sparse optical flow estimation, random sample consensus and a Kalman filtering algorithm, decompose the target pixel displacement in a video frame into the relative motion between target and background plus the background motion, and estimate and model each separately. This achieves stable tracking of the target under a dynamic background, effectively improves the robustness and precision of small-target tracking under a dynamic background, and broadens the application range of target tracking algorithms.
Drawings
Fig. 1 is a schematic flowchart of a target tracking method under a dynamic background based on a correlation filtering framework according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a target tracking system under a dynamic background based on a correlation filtering framework according to an embodiment of the present invention;
fig. 3 is a plot of the precision evaluated on a subset of the UAV123 data set according to an embodiment of the present invention;
fig. 4 is a plot of the success rate evaluated on a subset of the UAV123 data set according to an embodiment of the present invention.
Detailed Description
The following describes the method and system for tracking a target under a dynamic background based on a correlation filtering framework in detail with reference to the accompanying drawings and embodiments.
Example one
The method for tracking the target under the dynamic background based on the correlation filtering framework disclosed in the embodiment of the disclosure combines a correlation filtering tracking algorithm with corner detection, sparse optical flow estimation, random sample consensus and a Kalman filtering algorithm, decomposes the target pixel displacement in a video frame into the relative motion between target and background plus the background motion, and estimates and models each separately, achieving stable tracking of the target under a dynamic background.
As shown in fig. 1, the method comprises the following steps:
1) According to the target position and scale in the target frame-selection information, cut out from the current frame an image block centered on the target that also contains background; extract features from the image block as a sample and initialize the correlation filtering tracker; meanwhile, initialize a Kalman filter according to the target position in the target frame-selection information;
2) Apply mean filtering to the current frame, then sample it at a preset scale and cache it, and extract feature points through a corner detection algorithm; the extracted feature points are mainly global background points;
3) Acquire the next frame, apply mean filtering and sample at the same preset scale, then perform optical flow estimation with an optical flow algorithm using the cached frame and the corresponding feature points, obtaining the displacement of the feature points between the two frames;
4) Screen the feature points with a random sample consensus algorithm, filtering out residual foreground points and fitting an affine model of the background motion from the background-point optical flow; the disclosed algorithm mainly targets small-target tracking, in which background points form the overwhelming majority and the target is far from the camera, so the random sample consensus algorithm can effectively filter out the foreground and accurately estimate the affine matrix representing the background motion;
5) Project the target position of the previous frame into the current frame by the affine transformation, and use the resulting target displacement caused by the background motion as the Kalman filter control quantity for position prediction, obtaining a predicted position;
6) Take the predicted position, combined with the target scale information of the previous frame, as the search center, and obtain the current position and scale of the target through the correlation filtering tracker; correct the position information, used as an observation, through the Kalman filter;
7) Re-collect sample features in the current frame according to the corrected position and scale information and update the correlation filtering tracker;
8) Repeat steps 3)-7) to track the target continuously.
The specific implementation steps in the embodiment of the present disclosure are as follows:
Step 1: after receiving the target frame-selection information, read the corresponding video frame as the initialization frame $I_0$.
Step 2: according to the target position and scale in the target frame-selection information, extract gradient and color features from a region of $I_0$ centered on the target and containing background, as the target sample $x$.
Step 3: initialize the correlation filter with the target sample $x$ extracted from the initialization frame $I_0$ by solving

$$\hat{h} = \frac{\hat{y} \odot \hat{x}^{*}}{\hat{x} \odot \hat{x}^{*} + \lambda}$$

where $\hat{y}$ is the desired output response, $\odot$ denotes element-wise multiplication between matrix elements, ${}^{*}$ denotes the matrix complex conjugate, $\lambda$ is a regularization term, and hats denote the discrete Fourier transform.
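As a sketch of this initialization, the closed-form Fourier-domain solution can be written in a few lines of NumPy. This is a single-channel, MOSSE-style simplification under stated assumptions (a Gaussian desired response, one feature channel, a small regularizer); the patent's tracker additionally uses gradient and color features and scale estimation, which are omitted here.

```python
import numpy as np

def gaussian_response(h, w, sigma=2.0):
    # desired output response y: a Gaussian peaked at the patch center
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2 * sigma ** 2))

def init_filter(sample, response, lam=1e-3):
    # h_hat = (y_hat ⊙ x_hat*) / (x_hat ⊙ x_hat* + lam), solved element-wise
    X = np.fft.fft2(sample)
    Y = np.fft.fft2(response)
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

def detect(filt, patch):
    # correlation response over the patch via the convolution theorem
    return np.real(np.fft.ifft2(filt * np.fft.fft2(patch)))
```

Applied back to its own training sample, the filter's response peaks near the patch center, which is the sanity check usually run on such filters.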
Step 4: initialize the Kalman filter state, covariance matrix and other parameters according to the target center position in the target frame-selection information. The system state, measurement, state equation and measurement equation at time $k$ are expressed as:

$$X(k) = [c_x, c_y, v_x, v_y]^{T}, \qquad Z(k) = [c_x, c_y]^{T}$$

$$X(k) = A\,X(k-1) + B\,U(k) + w(k)$$

$$Z(k) = H\,X(k) + v(k)$$

where $A$ is the system state-transition matrix, $B$ is the system control matrix, $w(k)$ is the process noise with covariance matrix $Q$, the covariance matrix of the system state estimate is denoted $P$, $H$ is the system measurement matrix, and $v(k)$ is the measurement noise with covariance matrix $O$; $k$ denotes the current time, $X(k)$ is the system state vector, $U(k)$ is the system control quantity, $Z(k)$ is the system measurement, $c_x, c_y$ are the coordinates of the target center point, and $v_x, v_y$ are the velocity components.
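A minimal NumPy sketch of this filter, with the background-motion displacement entering as the control quantity U(k) through B. The concrete values of A, B, H, Q and O below are illustrative assumptions for a constant-velocity model; the patent does not fix them.

```python
import numpy as np

class KalmanCV:
    """Constant-velocity model, state X = [cx, cy, vx, vy]^T.
    U(k) is the per-frame target displacement caused by background motion."""
    def __init__(self, cx, cy):
        self.X = np.array([cx, cy, 0.0, 0.0])
        self.P = np.eye(4) * 10.0                      # state-estimate covariance
        self.A = np.array([[1., 0., 1., 0.],
                           [0., 1., 0., 1.],
                           [0., 0., 1., 0.],
                           [0., 0., 0., 1.]])          # state transition
        self.B = np.array([[1., 0.],
                           [0., 1.],
                           [0., 0.],
                           [0., 0.]])                  # control shifts the position only
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])          # we measure the center point
        self.Q = np.eye(4) * 1e-2                      # process noise covariance
        self.O = np.eye(2) * 0.1                       # measurement noise covariance

    def predict(self, u):
        self.X = self.A @ self.X + self.B @ np.asarray(u, float)
        self.P = self.A @ self.P @ self.A.T + self.Q
        return self.X[:2].copy()

    def correct(self, z):
        S = self.H @ self.P @ self.H.T + self.O
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.X = self.X + K @ (np.asarray(z, float) - self.H @ self.X)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.X[:2].copy()
```

Fed exact measurements of a target moving with constant relative velocity on top of a constant background shift, the estimate converges to the true position while the velocity states absorb only the relative motion.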
Step 5: convert $I_0$ to grayscale and apply mean filtering, then down-sample and cache the result to obtain $f_{prev}(x, y)$, where $f$ denotes a grayscale image, $x, y$ are pixel coordinates, and the subscript indicates that this image serves as the previous frame in the optical flow algorithm.
Step 6: detect a set of corner points in the image $f_{prev}$ using the Harris corner detection operator. The Harris matrix is defined as:

$$M = \sum_{x,y} w(x,y) \begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$$

where $I_x$ and $I_y$ are the gradients in the horizontal and vertical directions respectively and $w(x,y)$ is the weighting factor; the corner quality $R$ is computed from the determinant and trace of the matrix $M$:

$$R = \det(M) - k \,\mathrm{tr}(M)^2$$

where $k$ is an empirical constant.
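A compact NumPy sketch of the Harris response above, with central-difference gradients and a 3x3 box window standing in for the weighting factor w(x, y); k is the usual empirical constant (around 0.04). Practical detectors add Gaussian weighting and non-maximum suppression, which are omitted here.

```python
import numpy as np

def harris_response(img, k=0.04):
    img = img.astype(float)
    Ix = np.zeros_like(img); Iy = np.zeros_like(img)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0   # horizontal gradient
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0   # vertical gradient

    def box3(a):  # 3x3 box window playing the role of w(x, y)
        p = np.pad(a, 1)
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    A, B, C = box3(Ix * Ix), box3(Ix * Iy), box3(Iy * Iy)
    # R = det(M) - k * trace(M)^2, evaluated per pixel
    return (A * C - B * B) - k * (A + C) ** 2
```

On a white square, the response is positive at a corner, negative along an edge, and zero in the flat interior, which is exactly the behavior the quality measure R is designed for.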
and 7: reading the next frame image
Figure 890656DEST_PATH_IMAGE033
To, for
Figure 414041DEST_PATH_IMAGE033
Graying and mean filtering are carried out, and then down sampling is carried out to obtain an image
Figure DEST_PATH_IMAGE034
Step 8: compute, through the LK pyramid optical flow algorithm, the positions in the later frame image $f_{cur}$ of the feature points extracted from the previous frame image $f_{prev}$. LK optical flow is based on the constant-brightness assumption:

$$f(x, y, t) = f(x + dx, y + dy, t + dt)$$

where $t$ is the time of the previous frame and $t + dt$ is the time of the next frame, i.e. the pixel at $(x, y)$ in the previous frame lies at $(x + dx, y + dy)$ in the next frame. A first-order Taylor expansion of the right-hand side of the above equation yields:

$$I_x u + I_y v + I_t = 0$$

where $u = dx/dt$ and $v = dy/dt$. Assuming spatial consistency of adjacent pixels, one such equation can be constructed per pixel in a neighborhood, and the resulting over-determined system can be solved for $u$ and $v$ by the least-squares method.
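A single-level least-squares sketch of this step for one feature point; the pyramid, which extends the linearization to larger displacements, is omitted. The window size and the synthetic test pattern are illustrative assumptions.

```python
import numpy as np

def lk_flow_point(prev, curr, x, y, win=2):
    """Solve Ix*u + Iy*v = -It by least squares over a (2*win+1)^2 window."""
    Ix = np.zeros_like(prev); Iy = np.zeros_like(prev)
    Ix[:, 1:-1] = (prev[:, 2:] - prev[:, :-2]) / 2.0
    Iy[1:-1, :] = (prev[2:, :] - prev[:-2, :]) / 2.0
    It = curr - prev
    ys, xs = slice(y - win, y + win + 1), slice(x - win, x + win + 1)
    A = np.stack([Ix[ys, xs].ravel(), Iy[ys, xs].ravel()], axis=1)
    b = -It[ys, xs].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

On a smooth blob shifted one pixel to the right between frames, the estimate lands close to (1, 0); the first-order linearization makes the result approximate rather than exact.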
and step 9: background points in the two sets of point sets are screened out through a random sample consensus (RANSAC) algorithm, and a corresponding affine matrix is estimated. Affine models are standard models of motion estimation that can accurately describe motion patterns such as translation, rotation, and deformation. Thus, the background motion estimation is computed by a planar affine model between two frames:
Figure DEST_PATH_IMAGE045
in the formula
Figure DEST_PATH_IMAGE046
Figure DEST_PATH_IMAGE047
Is a displacement motion vector field. By
Figure DEST_PATH_IMAGE048
And
Figure DEST_PATH_IMAGE049
by estimating the model parameters a and b, the corresponding background motion vector field can be calculated. The solution is carried out by RANSAC algorithm, and the objective function is as follows:
Figure 445025DEST_PATH_IMAGE050
wherein
Figure DEST_PATH_IMAGE051
,
Figure 523839DEST_PATH_IMAGE052
,
Figure DEST_PATH_IMAGE053
,
Figure 84134DEST_PATH_IMAGE054
The goal is to solve for 6 affine transformation parameters, as known.
RANSAC first randomly samples a small set of assumed inliers as an initial value, solves the unknown parameters and fits a model; it then uses this model to screen all data consistent with it, expanding the inlier set. If enough points are classified as assumed inliers, the estimated model is considered reasonable.
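The hypothesize-and-verify loop just described, sketched in NumPy for the affine model. The iteration count and inlier threshold are illustrative assumptions; production code would typically use a library routine such as OpenCV's `estimateAffine2D`.

```python
import numpy as np

def fit_affine(src, dst):
    # least-squares fit of dst ≈ [x, y, 1] @ P, where P is 3x2 (6 parameters)
    M = np.hstack([src, np.ones((len(src), 1))])
    P, *_ = np.linalg.lstsq(M, dst, rcond=None)
    return P

def ransac_affine(src, dst, iters=200, thresh=1.0, seed=0):
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    M = np.hstack([src, np.ones((len(src), 1))])
    for _ in range(iters):
        idx = rng.choice(len(src), size=3, replace=False)  # minimal sample
        P = fit_affine(src[idx], dst[idx])                 # hypothesize a model
        err = np.linalg.norm(M @ P - dst, axis=1)          # verify on all points
        inl = err < thresh
        if inl.sum() > best.sum():
            best = inl
    return fit_affine(src[best], dst[best]), best          # refit on all inliers
```

On synthetic correspondences where a minority of points carry a constant extra offset (standing in for foreground motion), the loop recovers the background affine parameters and rejects the offset points.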
Step 10: project the target position of the previous frame into the current frame through the estimated affine transformation parameters, compute the corresponding background-motion displacement of the target position, and use this displacement as the control quantity of the Kalman filter for position prediction.
Step 11: with the predicted target position as the search center, locate the target through the response map of the search region after applying the correlation filter; the larger the value of the response map, the stronger the correlation between the image at that position and the target. The response map is expressed as:

$$r = \mathcal{F}^{-1}\left( \hat{h} \odot \hat{z} \right)$$

where $\hat{h}$ is the correlation filter updated in the previous frame, $\hat{z}$ is the Fourier transform of the image features extracted over the search region, and $\mathcal{F}^{-1}$ denotes the inverse Fourier transform.
Step 12: take the position of the maximum of the response map as the measurement, correct it through the Kalman filter, and output the corrected value as the target position at the current time.
Step 13: update the correlation filter with the sample features re-extracted at the corrected position; specifically, train a new filter on the new sample features and take an exponential moving average with the previously trained filter:

$$\hat{h}_t = (1 - \eta)\, \hat{h}_{t-1} + \eta\, \hat{h}(x_{new})$$

where $x_{new}$ is the new sample and $\eta$ is the update parameter.
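Step 13 can be sketched by retraining a filter on the new sample with the closed form from Step 3 and blending it into the old filter with an exponential moving average. The learning-rate value is an illustrative assumption; small values trade adaptation speed for robustness to occlusion and drift.

```python
import numpy as np

def train_filter(sample, response, lam=1e-3):
    # closed-form correlation filter on one sample (cf. Step 3)
    X = np.fft.fft2(sample)
    return (np.fft.fft2(response) * np.conj(X)) / (X * np.conj(X) + lam)

def update_filter(h_prev, sample, response, eta=0.025):
    # exponential moving average over frames: h_t = (1-eta)*h_{t-1} + eta*h_new
    return (1.0 - eta) * h_prev + eta * train_filter(sample, response)
```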
Step 14: judge whether tracking is finished; if so, terminate the algorithm; otherwise cache $f_{cur}$ as $f_{prev}$ and jump to Step 6.
As shown in fig. 3 and fig. 4, the precision and success rate of the method of the embodiment of the disclosure were evaluated on a subset of the UAV123 data set. Precision is based on the average Euclidean distance between the tracked target center and the ground-truth label; success rate is the proportion of successfully tracked frames, a frame being counted as successful when the overlap rate between the tracked target box and the ground-truth box exceeds a threshold. Because the video sequences in UAV123 are shot by unmanned aerial vehicles and contain many dynamic-background and small-target scenes, sequences of this kind were selected from the UAV123 data set to evaluate the tracking precision and robustness of the algorithm against the mainstream correlation filtering algorithms KCF, DSST and STAPLE. As the plotted results show, the target tracking method based on the correlation filtering framework of the embodiment of the invention outperforms the other methods and effectively improves performance in this scenario.
Example two
The embodiment of the disclosure discloses a target tracking system under a dynamic background based on a correlation filtering framework, as shown in fig. 2, comprising an initialization module, a feature extraction module and a loop tracking module;
the initialization module is used for cutting out from the current frame, according to the target position and scale in the target frame-selection information, an image block centered on the target that also contains background; extracting features from the image block as a sample and initializing the correlation filtering tracker; and meanwhile initializing a Kalman filter according to the target position in the target frame-selection information;
the feature extraction module is used for applying mean filtering to the current frame, then sampling it at a preset scale and caching it, and extracting feature points through a corner detection algorithm;
the loop tracking module is used for acquiring the next frame, sampling it at the same preset scale after mean filtering, then performing optical flow estimation with an optical flow algorithm using the cached frame and the corresponding feature points to obtain the displacement of the feature points between the two frames;
screening the feature points with a random sample consensus algorithm, filtering out residual foreground points and fitting an affine model of the background motion from the background-point optical flow; projecting the target position of the previous frame into the current frame by the affine transformation, and using the resulting target displacement caused by the background motion as the Kalman filter control quantity for position prediction to obtain a predicted position;
taking the predicted position, combined with the target scale information of the previous frame, as the search center, and obtaining the current position and scale of the target through the correlation filtering tracker; correcting the position information, used as an observation, through the Kalman filter;
and re-collecting sample features in the current frame according to the corrected position and scale information and updating the correlation filtering tracker.
Example three
The embodiment of the disclosure discloses a target tracking apparatus under a dynamic background based on a correlation filtering framework, comprising a processor and a memory electrically connected to each other; the memory is used for storing a computer program; when the processor executes the computer program, the method for tracking a target under a dynamic background based on a correlation filtering framework according to the first embodiment can be realized; the specific steps of the tracking method are the same as those in the first embodiment and are not repeated here.
Example four
The embodiment of the disclosure discloses a computer-readable storage medium having a computer program stored thereon; when the computer program is executed, the method for tracking a target under a dynamic background based on a correlation filtering framework can be realized; the specific steps of the tracking method are the same as those in the first embodiment and are not repeated here.
The computer described in the embodiments of the present application may be a general purpose computer, a special purpose computer, a network of computers, or other programmable devices. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another computer readable storage medium. The computer readable storage medium may be any available medium that can be read by a computer or a data storage device including one or more available media integrated servers, data centers, and the like. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., digital Versatile Disk (DVD)), or a semiconductor medium (e.g., solid State Disk (SSD)), among others. The software formed by the computer stored code may be located in a random access memory, flash memory, read only memory, programmable read only memory or electrically erasable programmable memory, registers, or other storage media as is well known in the art.
Each functional module in the embodiments of the present application may be integrated into one processing unit or module, each module may exist physically alone, or two or more modules may be integrated into one unit or module. The above embodiments may be implemented wholly or partially in software, hardware, firmware, or any combination thereof. When implemented in software, they may be realized wholly or partially in the form of a computer program product. A computer program product comprises one or more computer instructions; when loaded and executed on a computer, the computer program instructions realize, in whole or in part, the processes or functions described in the embodiments of the present application.
The above description covers only specific embodiments of the present application; the scope of the application is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed herein shall fall within the scope of the application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (6)

1. A target tracking method under a dynamic background based on a correlation filtering framework, characterized in that: the method combines a correlation filtering tracking algorithm with corner detection, sparse optical flow estimation, random sample consensus (RANSAC), and Kalman filtering; it decomposes the pixel displacement of the target in a video frame into the motion of the target relative to the background and the background motion, and estimates and models each separately, thereby achieving stable tracking of the target under a dynamic background.
2. The method for tracking a target under a dynamic background based on a correlation filtering framework according to claim 1, characterized by comprising the following steps:
1) Cropping, from the current frame, an image block centered on the target and also containing background, according to the target position and scale in the target framing information; extracting features from the image block as a sample, and initializing a correlation filtering tracker; meanwhile, initializing a Kalman filter according to the target position in the target framing information;
2) Applying mean filtering to the current frame, sampling it at a preset scale, caching it, and extracting feature points with a corner detection algorithm;
3) Acquiring the next frame, applying mean filtering, sampling at the same preset scale, and then estimating optical flow with an optical flow algorithm using the cached frame and its feature points, to obtain the displacement of the feature points between the two frames;
4) Screening the feature points with a random sample consensus (RANSAC) algorithm, filtering out residual foreground points, and fitting an affine model of the background motion from the optical flow of the background points;
5) Projecting the target position of the previous frame into the current frame by the affine transformation, and using the resulting target displacement caused by background motion as the Kalman filter control input for position prediction, obtaining a predicted position;
6) Using the predicted position, combined with the target scale of the previous frame, as the search center, and obtaining the current position and scale of the target with the correlation filtering tracker; correcting the position, used as the observation, with the Kalman filter;
7) Re-extracting sample features from the current frame according to the corrected position and scale, and updating the correlation filtering tracker;
8) Repeating steps 3)-7) to track the target continuously.
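Steps 4)-6) above can be sketched numerically. The following is a minimal, non-authoritative sketch in plain NumPy: a small RANSAC affine fit standing in for step 4), and a constant-velocity Kalman filter whose control input is the background-induced displacement of steps 5) and 6). All function names, the state layout, and parameters such as the inlier threshold are illustrative assumptions, not the patent's implementation; the corner detection, optical flow, and correlation-filter search of the other steps are taken as given.

```python
import numpy as np

def solve_affine(src, dst):
    """Least-squares 2x3 affine model mapping src points to dst points."""
    n = len(src)
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src; M[0::2, 2] = 1.0   # x' = a*x + b*y + tx
    M[1::2, 3:5] = src; M[1::2, 5] = 1.0   # y' = c*x + d*y + ty
    p, *_ = np.linalg.lstsq(M, dst.reshape(-1), rcond=None)
    return p.reshape(2, 3)

def apply_affine(A, pts):
    """Apply a 2x3 affine model to an (n, 2) array of points."""
    return pts @ A[:, :2].T + A[:, 2]

def fit_background_affine(src, dst, iters=200, thresh=2.0, seed=0):
    """Step 4): RANSAC keeps background points (inliers) and rejects
    foreground points whose flow disagrees with the dominant motion."""
    rng = np.random.default_rng(seed)
    n, best = len(src), np.zeros(len(src), bool)
    for _ in range(iters):
        idx = rng.choice(n, 3, replace=False)
        A = solve_affine(src[idx], dst[idx])
        err = np.linalg.norm(apply_affine(A, src) - dst, axis=1)
        inl = err < thresh
        if inl.sum() > best.sum():
            best = inl
    return solve_affine(src[best], dst[best]), best  # refit on all inliers

def kalman_predict(x, P, u, F, B, Q):
    """Step 5): predict with the background-induced displacement u as control."""
    return F @ x + B @ u, F @ P @ F.T + Q

def kalman_correct(x, P, z, H, R):
    """Step 6): correct with the correlation-filter position as measurement z."""
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    return x + K @ (z - H @ x), (np.eye(len(x)) - K @ H) @ P
```

With a state [x, y, vx, vy], F the constant-velocity transition, and B mapping a 2-D displacement into the position components, the displacement of the previous target center under the fitted affine model enters as the control u: camera motion is absorbed by the control term, while the target's motion relative to the background stays in the velocity states.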
3. The method for tracking a target under a dynamic background based on a correlation filtering framework according to claim 2, wherein the process of initializing the correlation filtering tracker comprises: reading the video frame corresponding to the target framing information as the initialization frame f0; according to the target position and scale in the target framing information, extracting gradient and color features from a region of f0 centered on the target and containing background, as the target sample x0; and initializing the correlation filtering tracker w from the target sample x0 extracted from the initialization frame f0 by solving the following equation:

w^ = (y^ ⊙ x0^*) / (x0^ ⊙ x0^*)

wherein y is the desired output response, ⊙ denotes element-wise multiplication between matrix elements, * denotes the matrix complex conjugate, and ^ denotes the discrete Fourier transform.
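The initialization in this claim can be sketched numerically. Its description (a desired output response, element-wise multiplication, a matrix complex conjugate) matches the standard single-sample closed-form correlation filter in the Fourier domain (a MOSSE-style filter), and the sketch below assumes that form on a single-channel feature image; the Gaussian response width sigma and the small regularizer lam are illustrative assumptions, not stated in the claim.

```python
import numpy as np

def init_correlation_filter(sample, sigma=2.0, lam=1e-3):
    """Initialize a correlation filter from one target sample x0.

    Returns the filter in the Fourier domain, solving
    w^ = (y^ * conj(x0^)) / (x0^ * conj(x0^) + lam) element-wise,
    where y is a Gaussian desired output response peaked on the target.
    """
    h, w = sample.shape
    ys, xs = np.mgrid[0:h, 0:w]
    y = np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2.0 * sigma ** 2))
    X = np.fft.fft2(sample)
    Y = np.fft.fft2(y)
    # Element-wise products with the complex conjugate of the sample spectrum;
    # lam is a small regularizer added for numerical stability (an assumption).
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

def detect(W, patch):
    """Apply the filter to a new patch; the response peak locates the target."""
    resp = np.real(np.fft.ifft2(W * np.fft.fft2(patch)))
    return np.unravel_index(np.argmax(resp), resp.shape)
```

In the full tracker the filter would be re-solved (or its numerator and denominator updated as running averages) each time step 7) re-extracts sample features at the corrected position and scale.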
4. A target tracking system under a dynamic background based on a correlation filtering framework, characterized in that: the system comprises an initialization module, a feature extraction module, and a loop tracking module;
the initialization module is used for cropping, from the current frame, an image block centered on the target and also containing background, according to the target position and scale in the target framing information; extracting features from the image block as a sample and initializing a correlation filtering tracker; and meanwhile initializing a Kalman filter according to the target position in the target framing information;
the feature extraction module is used for applying mean filtering to the current frame, sampling it at a preset scale, caching it, and extracting feature points with a corner detection algorithm;
the loop tracking module is used for acquiring the next frame, applying mean filtering and sampling at the same preset scale, and then estimating the displacement of the feature points between the two frames with an optical flow algorithm using the cached frame and its feature points;
screening the feature points with a random sample consensus (RANSAC) algorithm, filtering out residual foreground points, and fitting an affine model of the background motion from the optical flow of the background points; projecting the target position of the previous frame into the current frame by the affine transformation, and using the resulting target displacement caused by background motion as the Kalman filter control input for position prediction, obtaining a predicted position;
using the predicted position, combined with the target scale of the previous frame, as the search center, and obtaining the current position and scale of the target with the correlation filtering tracker; correcting the position, used as the observation, with the Kalman filter;
and re-extracting sample features from the current frame according to the corrected position and scale, and updating the correlation filtering tracker.
5. A target tracking device under a dynamic background based on a correlation filtering framework, comprising a processor and a memory electrically connected to each other, the memory being used for storing a computer program, characterized in that: when executing the computer program, the processor implements the target tracking method under a dynamic background based on a correlation filtering framework according to any one of claims 1 to 3.
6. A computer-readable storage medium, characterized in that: the storage medium has a computer program stored thereon; when executed, the computer program implements the target tracking method under a dynamic background based on a correlation filtering framework according to any one of claims 1 to 3.
CN202210922551.8A 2022-08-02 2022-08-02 Target tracking method and system under dynamic background based on relevant filtering framework Pending CN115170621A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210922551.8A CN115170621A (en) 2022-08-02 2022-08-02 Target tracking method and system under dynamic background based on relevant filtering framework


Publications (1)

Publication Number Publication Date
CN115170621A true CN115170621A (en) 2022-10-11

Family

ID=83478000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210922551.8A Pending CN115170621A (en) 2022-08-02 2022-08-02 Target tracking method and system under dynamic background based on relevant filtering framework

Country Status (1)

Country Link
CN (1) CN115170621A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116228817A (en) * 2023-03-10 2023-06-06 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering
CN116228817B (en) * 2023-03-10 2023-10-03 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination