CN109785366B - Correlation filtering target tracking method for occlusion - Google Patents

Correlation filtering target tracking method for occlusion

Info

Publication number
CN109785366B
CN109785366B · CN201910052347.3A
Authority
CN
China
Prior art keywords
frame
target
tracking
weight map
weight
Prior art date
Legal status
Active
Application number
CN201910052347.3A
Other languages
Chinese (zh)
Other versions
CN109785366A (en
Inventor
凌强 (Ling Qiang)
汤峰 (Tang Feng)
李峰 (Li Feng)
Current Assignee
Snegrid Electric Technology Co., Ltd.
Original Assignee
University of Science and Technology of China USTC
Priority date
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN201910052347.3A priority Critical patent/CN109785366B/en
Publication of CN109785366A publication Critical patent/CN109785366A/en
Application granted granted Critical
Publication of CN109785366B publication Critical patent/CN109785366B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a correlation filtering target tracking method for occlusion, comprising the following steps. Step 1: for a tracked video sequence, given the position and size of the tracked target in the t-th frame, determine a search area, extract features, and compute the weight map of the t-th frame. Step 2: train the correlation filter of the t-th frame based on the obtained weight map of the t-th frame. Step 3: using the trained correlation filter, compute the target response map of the (t+1)-th frame and from it the target position of the (t+1)-th frame. Step 4: based on the target position of the (t+1)-th frame, compute the high-confidence APSR criterion and decide whether the correlation filter of the t-th frame is updated.

Description

Correlation filtering target tracking method for occlusion
Technical Field
The invention relates to a correlation filtering target tracking method for occlusion, and belongs to the field of pattern recognition and computer vision.
Background
With the rapid development of computer vision, visual tracking has been widely applied in many computer vision tasks, such as video surveillance, human-computer interaction, and perception systems for unmanned vehicles. Given the true position of the object in the first frame, the tracker locates the object of interest throughout the video sequence. Although visual tracking methods have made great progress, many challenges remain, such as deformation, occlusion, out-of-view motion, scale changes, and in-plane rotation [1].
In recent years, discriminative tracking methods have attracted great attention. They treat target tracking as a binary classification problem, separating the target from the background region in the video. Many discriminative methods are based on machine learning, among which the kernelized correlation filter (KCF) [2] is the most popular due to its high computational efficiency and excellent tracking performance. However, standard correlation filtering suffers from the boundary effect: it generates unrealistic training negative samples, which produce an over-fitted filter that copes poorly with challenges such as deformation and occlusion, thereby increasing the risk of tracking failure. Much work has tried to mitigate the boundary effect of correlation filtering. SRDCF [3] (spatially regularized discriminative correlation filter) introduces a spatial regularization window 5 times the target size and penalizes filter values outside the target rectangle; many background samples are thereby suppressed, giving stronger tracking capability than KCF. However, the parameters of SRDCF stay fixed during tracking, so the method does not adapt well to shape changes of the target. CSR-DCF [4] uses a color histogram model to construct a binary segmentation mask, giving more weight to the true target area while suppressing background pixels, so the trained correlation filter focuses more on the true target area. However, the binary segmentation mask obtained from the color histogram is not always accurate; in particular, under occlusion and illumination change, a low-confidence mask severely interferes with the tracker and leads to tracking failure.
Article [2] presents the classical KCF tracking pipeline, built on the popular tracking-by-detection idea [5]. The general idea of KCF is as follows: from a given positive training sample, the properties of the circulant matrix are used to generate a large number of negative samples and to train the correlation filter. Owing to the nature of the circulant matrix, the DCF method converts the time-consuming spatial correlation into fast element-wise operations in the Fourier domain.
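As a quick illustration of this Fourier-domain shortcut, the following sketch (ours, not from the patent) checks numerically that circular cross-correlation equals an element-wise product of spectra:

```python
# Minimal demonstration that circular cross-correlation can be computed as an
# element-wise product in the Fourier domain -- the identity DCF trackers rely on.
import numpy as np

rng = np.random.default_rng(0)
f = rng.standard_normal(64)  # a 1-D stand-in for the filter
z = rng.standard_normal(64)  # a 1-D stand-in for the search signal

# Direct circular cross-correlation: r[k] = sum_n f[n] * z[(n + k) mod N]
direct = np.array([np.dot(f, np.roll(z, -k)) for k in range(len(f))])

# Fourier-domain equivalent: conj(F(f)) element-wise times F(z), then inverse FFT
fast = np.real(np.fft.ifft(np.conj(np.fft.fft(f)) * np.fft.fft(z)))

assert np.allclose(direct, fast)  # identical up to floating-point error
```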
Article [6] proposes the HOG (Histogram of Oriented Gradients) descriptor. The idea behind HOG is to build features by computing and accumulating histograms of gradient orientations over local regions of an image. The density distribution of gradient or edge directions describes the appearance and shape of the target well, so the HOG feature is widely used in the field of target detection and tracking.
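For concreteness, a HOG descriptor of an image patch can be computed with scikit-image as below; the library and the particular parameter values are our assumptions for illustration, not part of the patent:

```python
# Extracting a HOG descriptor with scikit-image (common 9-orientation, 8x8-cell setup).
import numpy as np
from skimage.feature import hog

patch = np.random.rand(64, 64)   # stand-in for a grayscale target patch
features = hog(
    patch,
    orientations=9,              # gradient-orientation bins per cell
    pixels_per_cell=(8, 8),      # local region over which each histogram is counted
    cells_per_block=(2, 2),      # blocks used for contrast normalization
    feature_vector=True,
)
print(features.shape)            # flattened HOG descriptor of the patch
```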
Article [7] proposes the CN (Color Names) descriptor. The idea behind CN is to classify the colors a target may exhibit into 11 classes: black, blue, brown, gray, green, orange, pink, purple, red, white, and yellow. An adaptive algorithm based on the idea of PCA (principal component analysis) then selects the most salient color components of the pixels in the target area, reducing the 11-dimensional color feature to 2 dimensions.
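The dimensionality-reduction step can be sketched as follows; the per-pixel 11-D color-name probabilities are random stand-ins here, whereas a real implementation uses the learned RGB-to-color-name mapping of [7]:

```python
# Reducing an 11-D Color Names map to 2-D with PCA, as described above.
import numpy as np
from sklearn.decomposition import PCA

h, w = 50, 40
cn_map = np.random.dirichlet(np.ones(11), size=h * w)    # fake per-pixel 11-D CN vectors

pca = PCA(n_components=2)
cn_reduced = pca.fit_transform(cn_map).reshape(h, w, 2)  # 2-D CN feature per pixel
print(cn_reduced.shape)                                  # (50, 40, 2)
```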
Article [4] proposes a spatially aware correlation filtering tracking algorithm. It uses a color histogram to generate a weight matrix that judges the class (target or background) of each pixel in the tracking target area. The algorithm first extracts target features and computes a color histogram over the tracking result of the previous frame (typically a rectangular box), then merges the generated weight matrix into the classical KCF tracking framework to obtain a trained filter, and finally locates the most likely target position in the search area of the current frame.
In summary, it remains very difficult to design a tracking algorithm that runs in real time, copes with various external interferences, and achieves a tracking effect that meets practical requirements. At present, no relevant literature reports exist.
[1] Wang Sheng, Da Xiang, Xu Ning, and Zhang Pengfei, "A review of environmental perception technology for unmanned vehicles," Journal of Changchun University of Science and Technology (Natural Science Edition), vol. 40, no. 1, pp. 1-6, 2017.
[2] J. F. Henriques, R. Caseiro, P. Martins, and J. Batista, "High-speed tracking with kernelized correlation filters," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, no. 3, pp. 583-596, 2015.
[3] M. Danelljan, G. Hager, F. Shahbaz Khan, and M. Felsberg, "Learning spatially regularized correlation filters for visual tracking," in Proceedings of the IEEE International Conference on Computer Vision, 2015, pp. 4310-4318.
[4] A. Lukezic, T. Vojir, L. C. Zajc, J. Matas, and M. Kristan, "Discriminative correlation filter with channel and spatial reliability," in CVPR, 2017, vol. 1, no. 2, p. 3.
[5] Z. Kalal, K. Mikolajczyk, and J. Matas, "Tracking-learning-detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 7, p. 1409, 2012.
[6] N. Dalal and B. Triggs, "Histograms of oriented gradients for human detection," in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), 2005, vol. 1, pp. 886-893.
[7] J. Van De Weijer, C. Schmid, J. Verbeek, and D. Larlus, "Learning color names for real-world applications," IEEE Transactions on Image Processing, vol. 18, no. 7, pp. 1512-1523, 2009.
[8] S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein, "Distributed optimization and statistical learning via the alternating direction method of multipliers," Foundations and Trends in Machine Learning, vol. 3, no. 1, pp. 1-122, 2011.
[9] Y. Wu, J. Lim, and M.-H. Yang, "Object tracking benchmark," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, no. 9, pp. 1834-1848, 2015.
Disclosure of Invention
The invention solves the following problem: overcoming the defects of the prior art, it provides a correlation filtering target tracking method for occlusion with high tracking precision and robustness, a tracking speed that meets real-time requirements, and the ability to handle target occlusion, deformation, and similar challenges.
The principle of the invention is as follows: the invention provides a weighted correlation filtering tracker based on the color histogram.
On one hand, pixels with high weight values should be treated as target; on the other hand, pixels with low weight values are more likely to be background and should be suppressed from interfering with the training of the correlation filter (KCF).
Compared with CSR-DCF and SRDCF, the invention provides a novel spatially aware correlation filter with an adaptive weight map. The adaptive weight map of the invention combines a spatial weight map with a target likelihood map (derived from the color histogram) and reflects the likelihood that each pixel in the search area belongs to the target.
In addition, when the target is occluded, the target area is contaminated by background pixels. If the tracking model keeps being updated at that moment, the tracker is polluted, and once the target reappears in the field of view, the tracker can no longer lock onto it. The invention therefore provides a high-confidence adaptive update strategy, which judges the tracking quality of the tracker and determines whether the tracking model trained on the current frame is used for updating.
The correlation filtering target tracking method for occlusion of the invention comprises the following steps:
Step 1: for a tracked video sequence, given the position and size of the tracked target in the t-th frame, determine a search area, extract features, and compute the weight map of the t-th frame;
Step 2: train the correlation filter of the t-th frame based on the obtained weight map of the t-th frame;
Step 3: using the trained correlation filter, compute the target response map of the (t+1)-th frame and from it the target position of the (t+1)-th frame;
Step 4: based on the target position of the (t+1)-th frame, compute the high-confidence APSR criterion and decide whether the correlation filter of the t-th frame is updated.
Step 1 is specifically realized as follows:
The weight map of the t-th frame mentioned in step 1 is composed of a target similarity weight map T and a spatially aware weight map P.
Target similarity weight map T:
Knowing the target position and size of the t-th frame image, construct the color histograms H_t^O and H_t^B as follows:
H_t^O = (1 − γ)·H_{1:t−1}^O + γ·H̃_t^O
H_t^B = (1 − γ)·H_{1:t−1}^B + γ·H̃_t^B
where γ is a fixed update rate, H̃_t^O and H̃_t^B respectively denote the target and background color histograms of the t-th frame, and H_{1:t−1}^O and H_{1:t−1}^B are those of the historical frames, i.e. the target and background color histograms from frame 1 to frame t−1. The color-histogram-based target similarity weight map T is then obtained:
T(π) = ρ(O)·H_t^O(π) / ( ρ(O)·H_t^O(π) + ρ(B)·H_t^B(π) )
where the prior probabilities ρ(O) and ρ(B) represent the proportions of the target area and the background area of the t-th frame within the whole search area.
The spatially aware weight map P has weight values that decay with distance from the target center. For any pixel π in the target box, the spatial weight value is denoted P(π); computing P(π) for every pixel in the target box yields the final P.
With the target similarity weight map T and the spatially aware weight map P obtained above, the final weight map W_t of the t-th frame is computed as:
W_t = T + P.
the step 2 is specifically realized as follows:
extracting features of the target area of the t-th frame, recording x and y as labels conforming to Gaussian distribution, and training a correlation filter ftThe optimization function is as follows:
(f)=||ft*x-y||2+λ||ft||2
wherein f ist=ft⊙WtWhen (f) is minimized, the correlation filter f of the t-th frame is trainedt
Step 3 is specifically realized as follows:
Input the (t+1)-th frame image. To search for the target position in the (t+1)-th frame, crop the search area centered at the target position of the previous frame and extract its features, denoted z_{t+1}. Then, with the correlation filter f_t of the t-th frame obtained in step 2, the final response map S_{t+1} of the (t+1)-th frame is obtained:
S_{t+1} = F⁻¹( f̂_t ⊙ ẑ_{t+1} )
where f̂_t and ẑ_{t+1} denote the Fourier transforms of f_t and z_{t+1}, F⁻¹ denotes the inverse Fourier transform, and S_{t+1} is the target response map of the (t+1)-th frame.
The target position of the (t+1)-th frame is computed from the response map S_{t+1}.
Step 4 is specifically realized as follows:
For the target response map S_{t+1} of the (t+1)-th frame obtained in step 3, the following APSR criterion is adopted to judge the tracking quality, where APSR is defined as:
APSR = |S_max − S_min|² / mean_{(w,h)∈Ω₂}[ (S_{w,h} − S_min)² ]
where S_max denotes the maximum value of S_{t+1}, S_min denotes the minimum value of S_{t+1}, μ₁ denotes the mean of the region Ω₁ near the peak and σ₁ its standard deviation, the remaining region of S_{t+1} outside Ω₁ is denoted Ω₂, w and h denote the horizontal and vertical coordinates of a pixel in S_{t+1}, S_{w,h} denotes the value of S_{t+1} at coordinate (w, h), and mean(·) is the averaging function.
The tracking quality is evaluated by computing the value of the APSR, which determines whether the correlation filter of the t-th frame is updated.
Compared with the prior art, the invention has the following advantages and positive effects:
(1) The invention can effectively track the target in complex scenes involving occlusion, deformation, and the like.
For target tracking in real scenes, a spatially aware adaptive weight map is used to train the correlation filter, so that the obtained filter effectively identifies true target pixels and reduces the interference of background pixels. The learned filter has memory: when the target briefly disappears from view, the tracker judges that the target has left the search area and stops updating the training model (which would otherwise be polluted by background pixels), so the tracker can still lock onto the tracking target when it reappears. On the OTB2015 target tracking data set [9], the accuracy reaches 84.7%, an improvement of 14.8%, 5.3%, and 5% over the trackers KCF [2], SRDCF [3], and CSR-DCF [4], respectively.
(2) The tracking algorithm of the invention is computationally inexpensive.
The high computation speed benefits on one hand from the advantages of the KCF algorithm and on the other hand from abandoning a complex optimization process in favor of an iterative method for training an effective filter. Experiments show that the method processes 30 frames per second, fully meeting the requirement of real-time tracking.
Drawings
FIG. 1 is a flow chart of the implementation of the method of the invention;
FIG. 2 illustrates the weight map of the t-th frame;
FIG. 3 illustrates the high-confidence update strategy;
FIG. 4 shows the verification of the experimental results.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and examples.
As shown in FIG. 1, the method comprises the following steps.
First, given a video sequence and the position and size of the tracking target in the t-th frame, determine the target search area and extract features.
Then, in the target search area, compute the spatially aware weight map P and the color-histogram-based target similarity weight map T respectively, obtaining the weight map W_t of the t-th frame.
Based on the weight map W_t of the t-th frame, train the correlation filter of the t-th frame.
With the trained correlation filter of the t-th frame, compute the target response map of the (t+1)-th frame and from it the target position of the (t+1)-th frame.
Finally, compute the high-confidence APSR criterion to determine whether the correlation filter of the t-th frame is updated.
The specific process is described in detail below.
1. Target similarity weight map T based on the color histogram
Determine the search area according to the target position and size of the t-th frame. As mentioned above, a color histogram is first generated to construct the target similarity weight map T.
Define the target area of the t-th frame as O_t and the background around the target as B_t, and extract a color histogram for each of the two regions, denoted H̃_t^O and H̃_t^B, where H̃_t^O and H̃_t^B respectively denote the target and background color histograms of the t-th frame. Meanwhile, to improve the reliability of the color histograms, the target and background color histograms of the historical frames (frame 1 to frame t−1), H_{1:t−1}^O and H_{1:t−1}^B, are taken into account to obtain H_t^O and H_t^B as follows:
H_t^O = (1 − γ)·H_{1:t−1}^O + γ·H̃_t^O
H_t^B = (1 − γ)·H_{1:t−1}^B + γ·H̃_t^B
γ is a fixed update rate; a large number of repeated experiments led to the choice γ = 0.04.
Then the color-histogram-based target similarity weight map T is obtained:
T(π) = ρ(O)·H_t^O(π) / ( ρ(O)·H_t^O(π) + ρ(B)·H_t^B(π) )
where the prior probabilities ρ(O) and ρ(B) represent the proportions of the target area O_t and the background area B_t of the t-th frame within the whole search area.
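A minimal sketch of this step is given below. The RGB binning (32 bins per channel) and the complementary prior ρ(B) = 1 − ρ(O) are our assumptions for illustration; only the formulas follow the text above:

```python
import numpy as np

NBINS = 32  # assumed bins per RGB channel

def color_histogram(pixels):
    """Normalized joint RGB histogram of an (N, 3) uint8 pixel array."""
    idx = (pixels // (256 // NBINS)).astype(int)           # per-channel bin index
    flat = idx[:, 0] * NBINS**2 + idx[:, 1] * NBINS + idx[:, 2]
    hist = np.bincount(flat, minlength=NBINS**3).astype(float)
    return hist / max(hist.sum(), 1.0)

def update_histogram(H_hist, H_new, gamma=0.04):
    """H_t = (1 - gamma) * H_{1:t-1} + gamma * H~_t, with fixed update rate gamma."""
    return (1.0 - gamma) * H_hist + gamma * H_new

def target_similarity_map(search, H_O, H_B, rho_O):
    """T(pi) = rho(O) H^O(pi) / (rho(O) H^O(pi) + rho(B) H^B(pi)) per pixel."""
    pix = search.reshape(-1, 3)
    idx = (pix // (256 // NBINS)).astype(int)
    flat = idx[:, 0] * NBINS**2 + idx[:, 1] * NBINS + idx[:, 2]
    p_o, p_b = H_O[flat], H_B[flat]                        # per-pixel likelihoods
    rho_B = 1.0 - rho_O                                    # assumed complementary prior
    T = rho_O * p_o / np.maximum(rho_O * p_o + rho_B * p_b, 1e-8)
    return T.reshape(search.shape[:2])
```

Here `search` is the RGB search-area image, `H_O`/`H_B` are the accumulated target and background histograms, and `rho_O` is the fraction of the search area covered by the target box.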
2. Generating the weight map W_t of the t-th frame
In general, there is prior knowledge: the closer a pixel inside the target area is to the target center, the more likely it belongs to the target; pixels at the edge of the target rectangle are more likely to be disturbed by background pixels; and pixels outside the target area are generally background. The invention therefore provides a spatially aware weight map P, which gives higher weight to pixels close to the center of the target area; the weights of the remaining pixels in the target rectangle decay gradually with distance from the center, and pixels outside the target area are assigned 0.5, keeping an equal likelihood of being target or background. For any pixel π in the target box, the spatial weight value is denoted P(π); computing P(π) for every pixel in the target box yields the final P:
P(π) = g( d(C_t, C_x) ) for pixels inside the target box, and P(π) = 0.5 otherwise
where the tracking box is rectangular, C_t is the coordinate of the central pixel of the rectangle, C_x is the coordinate of any other pixel inside the rectangle, d(C_t, C_x) denotes the distance from C_t to C_x, and g(·) decreases monotonically with that distance.
With the target similarity weight map T and the spatially aware weight map P obtained above, the final weight map W_t of the t-th frame is computed as:
W_t = T + P.
The resulting spatial weight map W_t of the t-th frame combines the spatial position of each pixel with its color information, so it distinguishes the target from the background well. The effect is shown in FIG. 2: (a) the search image of the t-th frame, where the rectangle gives the tracking result of the previous frame; (b) the color-histogram-based target similarity weight map T; (c) the spatially aware weight map P; (d) the spatial weight map W_t of the t-th frame.
As can be seen from FIG. 2, the finally generated spatial weight map W_t of the t-th frame has high weight values in the target area and low weight values in the background area.
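A minimal sketch of P and W_t follows. The Gaussian-shaped decay exp(−d²/(2σ²)) is an assumed instance of the monotonically decreasing function g(·); the text above only requires that the weight decays with distance from the box center:

```python
import numpy as np

def spatial_weight_map(shape, box, sigma=None):
    """Spatially aware weight map P: shape is (H, W); box is (x0, y0, w, h)."""
    H, W = shape
    x0, y0, w, h = box
    P = np.full((H, W), 0.5)                     # outside the box: equal likelihood
    ys, xs = np.mgrid[y0:y0 + h, x0:x0 + w]
    cy, cx = y0 + h / 2.0, x0 + w / 2.0          # box center C_t
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2         # squared distance d(C_t, C_x)^2
    sigma = sigma if sigma is not None else min(w, h) / 2.0
    P[y0:y0 + h, x0:x0 + w] = np.exp(-d2 / (2.0 * sigma ** 2))  # assumed decay g(d)
    return P

# Combined weight map of frame t, using T from the previous sketch:
# W_t = T + P
```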
3. Training the filter
Extract the features of the target area of the t-th frame, denoted x; the feature operators are the HOG and CN features mentioned in the background. Let y be a label map obeying a Gaussian distribution. The invention trains the correlation filter f_t of the t-th frame with the following optimization function:
ε(f_t) = ||f_t * x − y||² + λ·||f_t||²
where the filter is masked element-wise by the weight map, f_t = f_t ⊙ W_t; ⊙ denotes the element-wise product, * denotes convolution, λ is the regularization parameter (set to 0.05), ε(f_t) is the loss function, and W_t is the weight map of the t-th frame obtained in step 2. ε(f_t) is minimized with the ADMM [8] iterative method, thereby learning the correlation filter f_t of the t-th frame. With the intervention of W_t, the trained correlation filter f_t acts only on the target pixels, which greatly improves the tracking accuracy.
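The sketch below illustrates the weighted training in simplified form. The patent optimizes ε(f_t) with ADMM [8] and masks the filter itself (f_t ⊙ W_t); purely for illustration, here the weight map is applied to the training sample and the resulting single-channel ridge regression is solved in the standard closed form in the Fourier domain, which approximates the idea but is not the patented optimizer:

```python
import numpy as np

def gaussian_label(shape, sigma=2.0):
    """Gaussian-shaped regression target y, with its peak wrapped to (0, 0)."""
    H, W = shape
    ys, xs = np.mgrid[0:H, 0:W]
    y = np.exp(-((ys - H // 2) ** 2 + (xs - W // 2) ** 2) / (2.0 * sigma ** 2))
    return np.roll(np.roll(y, -(H // 2), axis=0), -(W // 2), axis=1)

def train_filter(x, W_t, lam=0.05, sigma=2.0):
    """Closed-form DCF: f_hat = conj(x_hat) y_hat / (x_hat conj(x_hat) + lam)."""
    x = x * W_t                                  # weight map suppresses background pixels
    x_hat = np.fft.fft2(x)
    y_hat = np.fft.fft2(gaussian_label(x.shape, sigma))
    f_hat = np.conj(x_hat) * y_hat / (x_hat * np.conj(x_hat) + lam)
    return f_hat                                 # filter kept in the Fourier domain
```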
4. Tracking the target
Given the input image of the (t+1)-th frame, the target position must be searched in the (t+1)-th frame: crop the search area centered at the target position of the previous frame and extract its features, denoted z_{t+1}. Then, with the correlation filter f_t of the t-th frame obtained in step 3, the response map S_{t+1} of the (t+1)-th frame is:
S_{t+1} = F⁻¹( f̂_t ⊙ ẑ_{t+1} )
where f̂_t and ẑ_{t+1} denote the Fourier transforms of f_t and z_{t+1}, F⁻¹ denotes the inverse Fourier transform, and ⊙ denotes the element-wise product. The peak of the response map S_{t+1} gives the target position of the (t+1)-th frame.
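Under the same conventions as the training sketch, detection amounts to one element-wise product and an inverse FFT; the peak of the response map gives the displacement of the target:

```python
import numpy as np

def detect(f_hat, z):
    """Return the response map S and the (dy, dx) displacement of its peak."""
    S = np.real(np.fft.ifft2(f_hat * np.fft.fft2(z)))   # S_{t+1} = F^-1(f_hat . z_hat)
    dy, dx = np.unravel_index(int(np.argmax(S)), S.shape)
    H, W = S.shape
    if dy > H // 2:                                     # unwrap circular shifts so the
        dy -= H                                         # displacement is centered at 0
    if dx > W // 2:
        dx -= W
    return S, (dy, dx)
```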
5. High-confidence update strategy
Most tracking methods update the filter at a fixed update rate. However, once the target is severely occluded, or even disappears from the field of view, still updating the correlation filter may result in tracking failure. The invention introduces a high-confidence score evaluation strategy: the response map S_{t+1} is evaluated to determine whether the correlation filter should be updated. The confidence score is derived mainly from the sharpness of the peak and the smoothness of the valleys of the response map. A normal response map has a single sharp peak and otherwise flat response values, indicating that a reliable tracking target has been detected. In contrast, when the response map has multiple peaks, the target is suffering occlusion.
APSR = |S_max − S_min|² / mean_{(w,h)∈Ω₂}[ (S_{w,h} − S_min)² ]
where S_max denotes the maximum value of S_{t+1}, S_min denotes the minimum value of S_{t+1}, μ₁ denotes the mean of the region Ω₁ near the peak and σ₁ its standard deviation, the remaining region of S_{t+1} outside Ω₁ is denoted Ω₂, w and h denote the horizontal and vertical coordinates of a pixel in S_{t+1}, S_{w,h} denotes the value of S_{t+1} at coordinate (w, h), and mean(·) is the averaging function.
The APSR evaluates the tracking quality and thereby determines whether the correlation filter of the t-th frame is updated.
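A hedged sketch of an APSR-style confidence check follows. The exact expression in the patent is rendered only as an image, so the formula below follows the reconstruction and textual description above (peak sharpness measured against the sidelobe region Ω₂ outside a small window Ω₁ around the peak); treat the precise form and the threshold as assumptions:

```python
import numpy as np

def apsr(S, peak_radius=5):
    """Peak sharpness of response map S relative to its sidelobe region."""
    s_max, s_min = float(S.max()), float(S.min())
    py, px = np.unravel_index(int(np.argmax(S)), S.shape)
    sidelobe = np.ones_like(S, dtype=bool)        # Omega_2: everything except Omega_1
    sidelobe[max(0, py - peak_radius):py + peak_radius + 1,
             max(0, px - peak_radius):px + peak_radius + 1] = False
    return (s_max - s_min) ** 2 / np.mean((S[sidelobe] - s_min) ** 2)

def should_update(S, threshold=3.0):
    """Update the filter only when tracking confidence is high (assumed threshold)."""
    return apsr(S) > threshold
```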
As seen in FIG. 3, tracking rectangle A belongs to the tracker that employs the APSR strategy of the invention, while tracking rectangle B belongs to a tracker without the APSR update strategy. At frame 90, with no occlusion, tracking boxes A and B both accurately follow the target. When the tracking target is occluded at frame 113, the APSR value drops from 7.92 to 1.34; the APSR strategy of the invention then judges that the target is occluded and stops updating the contaminated correlation filter. At frame 135, the target reappears in the field of view; tracking box A (using the APSR strategy) successfully re-finds the previously lost target, while rectangle B (not using the APSR strategy) has completely lost it. The APSR strategy used by the invention therefore copes well with the occlusion problem.
6. Verification of experimental results
The tracking effect of the invention was tested on the video sequences Girl2 and Human3. In FIG. 4, tracking box A belongs to the tracker of the invention, while B, C, and D are existing tracking algorithms (the methods proposed in articles [2-4] mentioned in the background, respectively). As seen in FIG. 4, in the Girl2 and Human3 scene sequences the target is occluded by an obstacle; when the target reappears in the field of view, only the tracking method of the invention successfully re-detects the true target, while all the other tracking methods fail, which demonstrates to a certain extent that the tracking method of the invention copes well with the occlusion challenge.
Although particular embodiments of the present invention have been described above, it will be appreciated by those skilled in the art that these are merely examples and that many variations or modifications may be made to these embodiments without departing from the principles and implementations of the invention, the scope of which is therefore defined by the appended claims.

Claims (3)

1. A correlation filtering target tracking method for occlusion, characterized by comprising the following steps:
Step 1: for a tracked video sequence, given the position and size of the tracked target in the t-th frame, determine a search area, extract features, and compute the weight map of the t-th frame;
Step 2: train the correlation filter of the t-th frame based on the obtained weight map of the t-th frame;
Step 3: using the trained correlation filter, compute the target response map of the (t+1)-th frame and from it the target position of the (t+1)-th frame;
Step 4: based on the target position of the (t+1)-th frame, compute the high-confidence APSR criterion and decide whether the correlation filter of the t-th frame is updated;
the step 1 is specifically realized as follows:
the weight map based on the T frame mentioned in the step 1 is composed of a target similar weight map T and a spatial perception weight map P;
target similarity weight graph T:
knowing the target position and size of the image of the t-th frame, constructing a color histogram
Figure FDA0002719430970000011
And
Figure FDA0002719430970000012
as follows:
Figure FDA0002719430970000013
Figure FDA0002719430970000014
where gamma is a fixed update rate,
Figure FDA0002719430970000015
and
Figure FDA0002719430970000016
respectively representing the target and background color histograms of the t-th frame,
Figure FDA0002719430970000017
and
Figure FDA0002719430970000018
for the historical frames, namely the target and background color histograms from the 1 st frame to the T-1 st frame, a target similarity weight map T based on the color histogram is obtained:
Figure FDA0002719430970000019
wherein
Figure FDA00027194309700000110
And
Figure FDA00027194309700000111
the prior probability represents the proportion of the size of the target area and the background area of the t-th frame in the whole search area;
the spatial perception weight map P, the weight value of which decays with distance from the target center; for any pixel pi in the target frame, the numerical value of the spatial perception weight is marked as P (pi), and P (pi) is calculated for each pixel in the target frame to generate final P;
the above steps obtain the target similar weight map T and the spatial perception weight map P, and then the final weight map W of the T-th frametCalculated by the following formula:
Wt=T+P;
the step 2 is specifically realized as follows:
extracting features of the target area of the t-th frame, recording x and y as labels conforming to Gaussian distribution, and training a correlation filter ftThe optimization function is as follows:
(f)=||ft*x-y||2+λ||ft||2
wherein f ist=ft⊙WtWhen (f) is minimized, the correlation filter f of the t-th frame is trainedtAnd λ is a regularization parameter.
Figure FDA0002719430970000021
2. The correlation filtering target tracking method for occlusion according to claim 1, characterized in that step 3 is specifically realized as follows:
input the (t+1)-th frame image; to search for the target position in the (t+1)-th frame, crop the search area centered at the target position of the previous frame and extract its features, denoted z_{t+1}; then, with the correlation filter f_t of the t-th frame obtained in step 2, the final response map S_{t+1} of the (t+1)-th frame is obtained:
S_{t+1} = F⁻¹( f̂_t ⊙ ẑ_{t+1} )
where f̂_t and ẑ_{t+1} denote the Fourier transforms of f_t and z_{t+1}, F⁻¹ denotes the inverse Fourier transform, and S_{t+1} is the target response map of the (t+1)-th frame; the target position of the (t+1)-th frame is computed from the response map S_{t+1}.
3. The correlation filtering target tracking method for occlusion according to claim 1, characterized in that step 4 is specifically realized as follows:
for the target response map S_{t+1} of the (t+1)-th frame obtained in step 3, the following APSR criterion is adopted to judge the tracking quality, where APSR is defined as:
APSR = |S_max − S_min|² / mean_{(w,h)∈Ω₂}[ (S_{w,h} − S_min)² ]
where S_max denotes the maximum value of S_{t+1}, S_min denotes the minimum value of S_{t+1}, μ₁ denotes the mean of the region Ω₁ near the peak and σ₁ its standard deviation, the remaining region of S_{t+1} outside Ω₁ is denoted Ω₂, w and h denote the horizontal and vertical coordinates of a pixel in S_{t+1}, S_{w,h} denotes the value of S_{t+1} at coordinate (w, h), and mean(·) is the averaging function;
the tracking quality is evaluated by computing the value of the APSR, which determines whether the correlation filter of the t-th frame is updated.
CN201910052347.3A 2019-01-21 2019-01-21 Correlation filtering target tracking method for occlusion Active CN109785366B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910052347.3A CN109785366B (en) 2019-01-21 2019-01-21 Correlation filtering target tracking method for occlusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910052347.3A CN109785366B (en) 2019-01-21 2019-01-21 Correlation filtering target tracking method for occlusion

Publications (2)

Publication Number Publication Date
CN109785366A CN109785366A (en) 2019-05-21
CN109785366B true CN109785366B (en) 2020-12-25

Family

ID=66500899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910052347.3A Active CN109785366B (en) 2019-01-21 2019-01-21 Related filtering target tracking method for shielding

Country Status (1)

Country Link
CN (1) CN109785366B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110276383B (en) * 2019-05-31 2021-05-14 北京理工大学 Nuclear correlation filtering target positioning method based on multi-channel memory model
CN110599519B (en) * 2019-08-27 2022-11-08 上海交通大学 Anti-occlusion related filtering tracking method based on domain search strategy
CN110765970B (en) * 2019-10-31 2022-08-09 北京地平线机器人技术研发有限公司 Method and device for determining nearest obstacle, storage medium and electronic equipment
CN111091583B (en) * 2019-11-22 2022-09-06 中国科学技术大学 Long-term target tracking method
CN111260689B (en) * 2020-01-16 2022-10-11 东华大学 Confidence enhancement-based correlation filtering visual tracking method
CN111583306A (en) * 2020-05-12 2020-08-25 重庆邮电大学 Anti-occlusion visual target tracking method
CN112598710B (en) * 2020-12-25 2024-03-12 杭州电子科技大学 Space-time correlation filtering target tracking method based on feature on-line selection
CN117011340A (en) * 2023-08-09 2023-11-07 北京航空航天大学 Reconfigurable relevant filtering target tracking algorithm based on statistical color characteristics

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106097383A (en) * 2016-05-30 2016-11-09 海信集团有限公司 A kind of method for tracking target for occlusion issue and equipment
CN106651913A (en) * 2016-11-29 2017-05-10 开易(北京)科技有限公司 Target tracking method based on correlation filtering and color histogram statistics and ADAS (Advanced Driving Assistance System)
CN108053419A (en) * 2017-12-27 2018-05-18 武汉蛋玩科技有限公司 Inhibited and the jamproof multiscale target tracking of prospect based on background
CN108764064A (en) * 2018-05-07 2018-11-06 西北工业大学 SAR Target Recognition Algorithms based on Steerable filter device and self-encoding encoder
CN109146912A (en) * 2018-07-26 2019-01-04 湖南人文科技学院 A kind of visual target tracking method based on Objective analysis

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101694723B (en) * 2009-09-29 2011-08-03 北京航空航天大学 Real-time moving target tracking method based on global matching similarity function
JP6079076B2 (en) * 2012-09-14 2017-02-15 沖電気工業株式会社 Object tracking device and object tracking method
KR101517538B1 (en) * 2013-12-31 2015-05-15 전남대학교산학협력단 Apparatus and method for detecting importance region using centroid weight mask map and storage medium recording program therefor
CN106570887A (en) * 2016-11-04 2017-04-19 天津大学 Adaptive Mean Shift target tracking method based on LBP features
CN108734723B (en) * 2018-05-11 2022-06-14 江南大学 Relevant filtering target tracking method based on adaptive weight joint learning
CN108776975B (en) * 2018-05-29 2021-11-05 安徽大学 Visual tracking method based on semi-supervised feature and filter joint learning

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106097383A (en) * 2016-05-30 2016-11-09 海信集团有限公司 A kind of method for tracking target for occlusion issue and equipment
CN106651913A (en) * 2016-11-29 2017-05-10 开易(北京)科技有限公司 Target tracking method based on correlation filtering and color histogram statistics and ADAS (Advanced Driving Assistance System)
CN108053419A (en) * 2017-12-27 2018-05-18 武汉蛋玩科技有限公司 Inhibited and the jamproof multiscale target tracking of prospect based on background
CN108764064A (en) * 2018-05-07 2018-11-06 西北工业大学 SAR Target Recognition Algorithms based on Steerable filter device and self-encoding encoder
CN109146912A (en) * 2018-07-26 2019-01-04 湖南人文科技学院 A kind of visual target tracking method based on Objective analysis

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kernel correlation filter tracking algorithm with adaptive feature fusion; Xiong Changzhen et al.; Journal of Computer-Aided Design & Computer Graphics; 2017-06-30; vol. 29, no. 6, pp. 1068-1074 *

Also Published As

Publication number Publication date
CN109785366A (en) 2019-05-21

Similar Documents

Publication Publication Date Title
CN109785366B (en) Correlation filtering target tracking method for occlusion
Li et al. SAR image change detection using PCANet guided by saliency detection
CN107424171B (en) Block-based anti-occlusion target tracking method
Zhao et al. Closely coupled object detection and segmentation
CN111292355A (en) Nuclear correlation filtering multi-target tracking method fusing motion information
CN111340842B (en) Correlation filtering target tracking method based on joint model
Kim Analysis of small infrared target features and learning-based false detection removal for infrared search and track
Iraei et al. Object tracking with occlusion handling using mean shift, Kalman filter and edge histogram
CN111091583B (en) Long-term target tracking method
CN112329784A (en) Correlation filtering tracking method based on space-time perception and multimodal response
Shu et al. Multi-feature fusion target re-location tracking based on correlation filters
Du et al. Spatial–temporal adaptive feature weighted correlation filter for visual tracking
Zou et al. Fish tracking based on feature fusion and scale adaptation in a real-world underwater environment
Moridvaisi et al. An extended KCF tracking algorithm based on TLD structure in low frame rate videos
Lee et al. Efficient Face Detection and Tracking with extended camshift and haar-like features
Yaosheng et al. Object tracking in satellite videos based on improved correlation filters
Lu et al. Particle filter vehicle tracking based on surf feature matching
Lan et al. Robust visual object tracking with spatiotemporal regularisation and discriminative occlusion deformation
CN113920391A (en) Target counting method based on generated scale self-adaptive true value graph
Tang et al. Rapid forward vehicle detection based on deformable Part Model
Huang et al. Infrared maritime target tracking via correlation filter with adaptive context-awareness and spatial regularization
Jensch et al. A comparative evaluation of three skin color detection approaches
Gong et al. Visual object tracking
Wang MRCNNAM: Mask Region Convolutional Neural Network Model Based On Attention Mechanism And Gabor Feature For Pedestrian Detection
CN112419227B (en) Underwater target detection method and system based on small target search scaling technology

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20220608

Address after: 230093 room 1701, block C, building 1, zone J, phase II, Hefei Innovation Industrial Park, No. 2800, innovation Avenue, high tech Zone, Hefei, Anhui

Patentee after: SNEGRID ELECTRIC TECHNOLOGY Co.,Ltd.

Address before: 230026 Jinzhai Road, Baohe District, Hefei, Anhui Province, No. 96

Patentee before: University of Science and Technology of China