CN111028268B - Rapid target scale estimation method in target tracking - Google Patents


Info

Publication number
CN111028268B
CN111028268B (application CN201911366789.1A)
Authority
CN
China
Prior art keywords
scale
current
target
tracking
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911366789.1A
Other languages
Chinese (zh)
Other versions
CN111028268A (en)
Inventor
尹向雷 (Yin Xianglei)
马晓虹 (Ma Xiaohong)
Current Assignee
Shaanxi University of Technology
Original Assignee
Shaanxi University of Technology
Priority date
Filing date
Publication date
Application filed by Shaanxi University of Technology filed Critical Shaanxi University of Technology
Priority to CN201911366789.1A priority Critical patent/CN111028268B/en
Publication of CN111028268A publication Critical patent/CN111028268A/en
Application granted granted Critical
Publication of CN111028268B publication Critical patent/CN111028268B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20004: Adaptive image processing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20081: Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of computer vision tracking and discloses a rapid target scale estimation method in target tracking. Three scale factors are designed with the current scale as the center, and the maxima of the response maps corresponding to the three scales are judged with the same detection method as SAMF (Scale Adaptive with Multiple Features) to obtain the direction of the current scale change. An adaptive scale coefficient is then designed according to the scale change trend and the current tracking reliability, yielding an adaptive scale increment; the current adaptive scale is obtained from the scale change direction, and the scale with the larger response value among the adaptive scale coefficient and the recorded scale is taken as the final scale. The method markedly reduces the computational load of the algorithm without reducing position tracking precision, and improves tracking speed.

Description

Rapid target scale estimation method in target tracking
Technical Field
The invention relates to the technical field of computer vision tracking, in particular to a quick target scale estimation method in target tracking.
Background
Target tracking is a challenging research hotspot in the field of computer vision with a wide range of applications, such as automatic driving, mobile robots, video surveillance, and abnormal behavior analysis. In recent years, correlation filters (CF) have been introduced into the target tracking framework and have achieved significant gains in both accuracy and speed. In 2010, Bolme et al. proposed a new correlation filter, MOSSE (Minimum Output Sum of Squared Error), applying CF to a tracking algorithm for the first time. MOSSE models the target appearance with a correlation filter and operates in the frequency domain, which markedly improves tracking speed. Researchers subsequently proposed the CSK (Circulant Structure with Kernels) tracking method, which generates a large number of virtual training samples by cyclic shifts on the basis of MOSSE, and the KCF (Kernelized Correlation Filters) algorithm, which generalizes single-channel features to multi-channel features using the kernel trick.
These trackers make great progress in both tracking speed and tracking accuracy, but they lack an estimate of the target scale, which leaves considerable room for improving tracking precision. Current scale estimation methods fall mainly into two categories: multi-scale search methods based on SAMF, and independent scale-estimation filter methods based on DSST.
The SAMF (Scale Adaptive with Multiple Features) method is a multi-scale search method proposed by Yang Li et al. that adds scale estimation to KCF. The SAMF strategy combines position estimation and scale estimation using a multi-resolution (multiple search scales) detection method. To traverse a suitable scale, the number of search scales cannot be too small (SAMF uses 7); the method computes the filter response map for each scale factor and takes the scale whose response map has the largest maximum as the optimal scale.
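As an illustration of the multi-resolution detection strategy described above, the following Python sketch selects the scale whose response peak is largest. The `response_peak` helper is hypothetical, standing in for patch extraction and filter correlation, and the 7-scale pool is an assumed (commonly used) SAMF setting, not quoted from the patent.

```python
import numpy as np

def multiscale_search(response_peak, scales):
    """Return the scale whose filter response peak is largest.

    response_peak(s) stands in for extracting an image patch resized
    by scale factor s, running the correlation filter on it, and
    taking the maximum of the resulting response map.
    """
    peaks = [response_peak(s) for s in scales]
    i = int(np.argmax(peaks))
    return scales[i], peaks[i]

# Toy stand-in: the response peak falls off away from a "true" scale 1.005.
samf_scales = [0.985, 0.99, 0.995, 1.0, 1.005, 1.01, 1.015]  # assumed pool
best, peak = multiscale_search(lambda s: 1.0 - abs(s - 1.005), samf_scales)
```

Because every scale in the pool is evaluated on every frame, the cost grows linearly with the pool size, which is exactly the overhead the invention targets.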
The DSST (Discriminative Scale Space Tracking) algorithm is a scale estimation method by Martin Danelljan et al. that trains a scale filter on the basis of KCF. DSST separates position estimation from scale estimation: position estimation uses the KCF algorithm, while scale estimation uses a separate scale filter.
In correlation-filter-based tracking, although tracking performance has improved remarkably, scale estimation of the target remains an open problem. Current scale estimation methods are mainly: 1) multi-scale traversal search based on SAMF, whose many preset search scales seriously slow down tracking; and 2) independent scale-estimation filtering based on DSST, which is fast, but whose tracking precision suffers to some extent because scale estimation and target localization run independently.
The SAMF method integrates scale estimation with position estimation; it can estimate scale and attains good position accuracy, but the excessive number of preset scales markedly increases the tracker's computation, greatly reducing tracking speed.
The DSST method requires less computation for scale estimation than SAMF and tracks faster, but its position estimation accuracy suffers because a fixed scale is used during position estimation.
The present invention therefore aims to provide a rapid scale estimation method that speeds up scale estimation without reducing tracking precision, so as to track targets accurately and quickly.
Disclosure of Invention
The invention provides a rapid target scale estimation method in target tracking, which can obviously reduce the calculated amount of an algorithm and improve the tracking speed while not reducing the position tracking precision.
The invention provides a quick target scale estimation method in target tracking, which comprises the following steps:
s1, inputting a frame to be processed of an image;
s2, preprocessing a current frame; if the diagonal pixel distance of the target is larger than 100, the original image is reduced to half its size, and the size and position of the target are halved accordingly;
s3, acquiring gradient direction characteristics HOG, color characteristics CN, gray characteristics and the current target position of the tracked target in the current frame;
s4, training the tracker with the SAMF algorithm according to the gradient direction feature HOG, the color feature CN, the gray feature, and the current target position of the tracked target, obtaining a model parameter M_n;
if the current frame is the first frame image, the tracker is trained directly with the SAMF algorithm to obtain the model parameter M_n, the next frame of image is input, and the method goes to step S2;
s5, calculating the optimal scale s_0:
S5.1, presetting three scale factors with the current scale 1 as the center and a as the step size: (1-a, 1, 1+a);
S5.2, using the model parameter M_n, calculating the response maps corresponding to the three preset scale factors, recording the response-map peaks as V_M(1-a), V_M(1), V_M(1+a), and recording the scale s whose peak is the largest of the three;
S5.3, judging whether the current scale direction is decreasing, unchanged, or increasing according to which of the response-map peaks is the maximum;
S5.4, calculating the average peak-to-correlation energy (APCE) of the current frame and the previous frame, and obtaining the tracking reliability variation β of the current target scene from the change in APCE between the two frames, namely β = |V_APCE(n) - V_APCE(n-1)|;
S5.5, when the tracking reliability variation β is larger, the current target scene is more complex or the target undergoes a large scale change, so the scale factor change is made proportional to the reliability change, giving the adaptive scale increment δ, namely
δ = β / c
where c is a scale adjustment coefficient;
S5.6, if the scale s = 1, the optimal scale s_0 = 1; go to step S6;
S5.7, if the scale s = 1+a or s = 1-a, the current adaptive scale coefficient γ is obtained as:
γ = s + δ if s = 1+a, or γ = s - δ if s = 1-a;
S5.8, calculating the peaks R_s and R_γ of the filter response maps corresponding to the scale s and the adaptive scale coefficient γ;
S5.9, taking the scale corresponding to the larger of R_s and R_γ as the optimal scale s_0, i.e. s_0 = argmax over {s, γ} of {R_s, R_γ};
S6, acquiring the target position at the optimal scale s_0 as the current target position;
s7, training the tracker according to the optimal scale and the current target position to obtain a model parameter M_{n+1};
S8, updating the model to obtain a new model M_new: M_new = ηM_{n+1} + (1-η)M_n, where η is the set update rate;
and S9, if the current frame is the last frame, ending, otherwise, inputting the next frame and turning to the step S2.
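Steps S5.1 to S5.9 above can be sketched as follows. This is an illustrative reading, not the claimed implementation: the helper `response_peak(s)` is hypothetical (it returns the filter response peak at scale factor s), the step a = 0.005 is an assumed value, and δ = β/c renders the proportionality stated in step S5.5.

```python
def estimate_scale(response_peak, apce_curr, apce_prev, a=0.005, c=3500.0):
    """Sketch of the optimal-scale computation (steps S5.1-S5.9).

    a is the scale step (assumed value) and c the scale adjustment
    coefficient (c = 3500 in the experiments).
    """
    cands = [1.0 - a, 1.0, 1.0 + a]                 # S5.1: three scale factors
    peaks = {s: response_peak(s) for s in cands}    # S5.2: response peaks
    s = max(cands, key=peaks.get)                   # S5.3: direction of change
    if s == 1.0:                                    # S5.6: keep scale unchanged
        return 1.0
    beta = abs(apce_curr - apce_prev)               # S5.4: reliability change
    delta = beta / c                                # S5.5: adaptive increment
    gamma = s + delta if s == 1.0 + a else s - delta  # S5.7: adaptive coefficient
    # S5.8-S5.9: keep whichever of s and gamma has the stronger response
    return s if peaks[s] >= response_peak(gamma) else gamma
```

With a response that grows with scale, the routine extends the step beyond 1+a by the adaptive increment; with a response peaked at the current scale, it returns 1 without any extra evaluation.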
The preprocessing of the current frame in step S2 includes size limitation, windowing, region expansion, and image block division, where the windowing expands the initially given target window by a factor of 1.5 and applies a cosine window.
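A minimal sketch of this preprocessing, under assumptions: the function and variable names are illustrative, the 2x downscale is done by simple decimation, and the cosine window is a separable Hann window sized to the 1.5x-expanded target window.

```python
import numpy as np

def preprocess(img, target_size, target_pos, padding=1.5):
    """Halve the frame if the target diagonal exceeds 100 px, then
    build a cosine (Hann) window for the 1.5x-expanded target window."""
    h, w = target_size
    if np.hypot(h, w) > 100:              # diagonal pixel distance check
        img = img[::2, ::2]               # crude 2x downscale by decimation
        target_size = (h / 2, w / 2)
        target_pos = (target_pos[0] / 2, target_pos[1] / 2)
    win_h = int(round(target_size[0] * padding))
    win_w = int(round(target_size[1] * padding))
    cos_win = np.outer(np.hanning(win_h), np.hanning(win_w))
    return img, target_size, target_pos, cos_win
```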
In said step S5.3, when the maximum of the response-map peaks is V_M(1-a), the current scale direction is judged to be decreasing; when the maximum is V_M(1), the current scale direction is judged to be unchanged; when the maximum is V_M(1+a), the current scale direction is judged to be increasing.
Compared with the prior art, the invention has the beneficial effects that:
The invention uses the SAMF algorithm as the baseline. First, the 7 fixed search scales of the original SAMF algorithm are reduced to 3; the direction of scale change is judged from these 3 scales and the scale with the largest response value is recorded, which greatly reduces the computational cost of the tracking algorithm. On the other hand, by combining scale estimation with position estimation, the method avoids the large position error caused by the fixed-scale position estimation of the DSST algorithm. Extensive experiments show that, compared with the baseline SAMF, the method significantly improves tracking speed without reducing tracking precision (and even improves precision to some extent). The method is simple and effective, can be integrated into any tracking method based on SAMF-style scale estimation, and is a beneficial supplement to existing methods.
Drawings
FIG. 1 is a flow chart of the main program of the present invention;
FIG. 2 is a flowchart of an optimal scale factor calculation subroutine of the present invention;
FIG. 3 is a CLE comparison of the present invention and SAMF algorithm on video CliffBar;
FIG. 4 is a CLE comparison of the present invention and SAMF algorithm on video Human 6;
FIG. 5 is a CLE comparison graph of the present invention and SAMF algorithm on video Soccer;
Detailed Description
One embodiment of the present invention will be described in detail below with reference to fig. 1-5, but it should be understood that the scope of the present invention is not limited to the embodiment.
1) First, the 7 scale factors of SAMF are reduced to 3. With the current scale as the center and a as the step size, three scale factors are designed: (1-a, 1, 1+a). The maxima of the response maps corresponding to the three scale factors are judged with the same detection method as SAMF. Denoting the scale corresponding to the overall maximum as s, the direction of the current scale change is decreasing if s = 1-a, unchanged if s = 1, or increasing if s = 1+a.
2) An adaptive scale coefficient is then designed according to the scale change trend and the current tracking reliability. The tracking reliability change β of the current target scene is obtained from the APCE change between the current frame and the previous frame, namely
β = |V_APCE(n) - V_APCE(n-1)|
When the tracking reliability changes greatly, the target scene has become more complex or the target scale may have changed greatly, so the scale factor change should be proportional to the reliability change. This gives the adaptive scale increment δ, namely
δ = β / c
where c is the scale adjustment coefficient.
Then, according to the scale change direction, the current adaptive scale coefficient γ is obtained as:
γ = s + δ if s = 1+a, or γ = s - δ if s = 1-a
The scale with the larger response value among the adaptive scale coefficient γ and the scale s is taken as the optimal scale s_0, namely:
s_0 = argmax over {s, γ} of {R_s, R_γ}
where R_s and R_γ denote the maxima of the filter response maps corresponding to the scales s and γ, respectively.
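The text relies on APCE but does not spell out its formula; a common definition in the tracking literature is APCE = |F_max - F_min|^2 / mean((F - F_min)^2), and the reliability change β is then the absolute APCE difference between consecutive frames. A sketch under that assumption:

```python
import numpy as np

def apce(response):
    """Average peak-to-correlation energy of a response map (assumed
    common definition): a sharp single peak gives a high value, a
    noisy multi-peak map a low one."""
    f_max, f_min = response.max(), response.min()
    return float((f_max - f_min) ** 2 / np.mean((response - f_min) ** 2))

def reliability_change(apce_curr, apce_prev):
    """beta = |V_APCE(n) - V_APCE(n-1)|."""
    return abs(apce_curr - apce_prev)
```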
Examples
Referring to fig. 1, the present embodiment specifically includes the following steps:
step 101: inputting a frame to be processed of an image;
step 102: preprocess the image; if the diagonal pixel distance of the target is greater than 100, reduce the original image to half its size and halve the target size and position accordingly.
Step 103: expand the initially given target window by a factor of 1.5 and apply a cosine window. Extract features from the processed image (this algorithm uses HOG, CN, and gray-level features).
Step 104: if the frame is the first frame, go to step 105 and directly train the tracker model parameters M_n, then start inputting the next frame for tracking; otherwise, model parameters already exist, so go to step 106 and train with the SAMF algorithm to obtain the model parameters M_n.
Step 107: call the optimal scale calculation subroutine to obtain the optimal scale s_0.
Referring to fig. 2, on entering the optimal scale factor calculation subroutine, the subroutine receives three parameters from the main program: the scale and position of the current target, and the tracking model parameters M_n.
Step 201: preset three scale factors with the current scale 1 as the center and a as the step size: (1-a, 1, 1+a). Using the model M_n, calculate the response maps corresponding to the three scales, record their maxima as V_M(1-a), V_M(1), V_M(1+a), and record the scale s whose response maximum is the largest of the three.
Step 202: judge the direction in which the current scale should change according to the maximum of the response maps: decrease (V_M(1-a) largest), unchanged (V_M(1) largest), or increase (V_M(1+a) largest).
Step 203: obtain the tracking reliability change β of the current target scene from the APCE change between the current frame and the previous frame, namely β = |V_APCE(n) - V_APCE(n-1)|.
When the tracking reliability changes greatly, the current target scene has become more complex or the target may undergo a large scale change, so the scale factor change should be proportional to the reliability change. This gives the adaptive scale increment δ, namely
δ = β / c
where c is the scale adjustment coefficient.
Step 204: if the scale s = 1, it is best to keep the scale unchanged; go to step 210 with the optimal scale s_0 = 1 and return this scale to the main program. Otherwise, the current scale is not optimal, and the calculation proceeds to the next step.
Step 205: if s = 1+a, the scale should change further in the larger direction, so step 206 is entered and the current adaptive scale coefficient is γ = s + δ; otherwise step 207 is entered and the current adaptive scale coefficient is calculated as γ = s - δ.
Step 208: calculate the response-map peaks R_s and R_γ corresponding to the scale s and the adaptive scale coefficient γ.
Step 209: take the scale corresponding to the larger of R_s and R_γ as the final optimal scale s_0, namely:
s_0 = argmax over {s, γ} of {R_s, R_γ}
Step 210: return this optimal scale s_0 to the main program as the final scale coefficient.
See fig. 1. The main routine is continued.
Step 108: the main program obtains the scale coefficient, the target scale, and the position information returned by the subroutine. The target position detected at the optimal scale factor s_0 is the optimal target position.
Step 109: train the tracker according to the current target position and obtain the model parameters M_{n+1}.
Step 110: update the model with the formula M_new = ηM_{n+1} + (1-η)M_n to obtain the new model M_new.
And step 111, judging whether the last frame is reached by the main program, if so, ending the tracking program, and otherwise, entering the next step.
Step 112, inputting the next frame of image and turning to step 102 to repeat the tracking and updating process.
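The update rule of step 110 is a standard linear interpolation (exponential moving average) of model parameters; a minimal sketch, with η = 0.01 as in the experiments:

```python
import numpy as np

def update_model(m_prev, m_curr, eta=0.01):
    """Step 110: M_new = eta * M_(n+1) + (1 - eta) * M_n.

    A small eta keeps the model stable; a large eta adapts faster
    but is more easily corrupted by a bad frame.
    """
    return eta * np.asarray(m_curr) + (1.0 - eta) * np.asarray(m_prev)
```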
The above describes only a method designed on the basis of the kernelized correlation tracker SAMF, but the algorithm can be applied to any tracking algorithm based on the SAMF scale-search idea. Note that when the method is applied to other kernelized correlation filtering algorithms, the parameter values need to be adjusted accordingly.
To evaluate the performance of the present invention, a comparative analysis was performed against the SAMF algorithm [Matthias Mueller, Neil Smith, Bernard Ghanem. Context-Aware Correlation Filter Tracking. In IEEE CVPR, 2017]. The test data set was the OTB video set provided in [Yi Wu, Jongwoo Lim, and Ming-Hsuan Yang. Online Object Tracking: A Benchmark. In IEEE CVPR, 2013]. Experimental platform: MATLAB 2016a, Intel i5-7400, 3.0 GHz, 4 GB memory.
Experimental parameters: the default parameters of the base algorithm SAMF were adopted, with scale adjustment coefficient c = 3500 and update rate η = 0.01.
Experiment one, accuracy and speed comparison.
40 groups of videos with the scale variation (SV) attribute were selected from the OTB video set for distance precision (DP) comparison with the SAMF algorithm. Distance precision is the percentage of frames, out of all video frames, in which the Euclidean distance between the tracked target's center coordinate and the labeled ground truth is below a given threshold; at the same threshold, higher precision means better tracking. The threshold in this comparison is 20 pixels. The comparative results are shown in Table 1.
TABLE 1 accuracy and speed comparison of the inventive algorithm and SAMF algorithm
(Table 1 is rendered as images in the original document.)
As can be seen from Table 1, the tracking precision of the proposed algorithm is 3.9% higher than that of the SAMF algorithm, a clear improvement; in particular, the precision on the ClifBar, Human6, and Soccer videos is improved by 44.9%, 30.3%, and 69.6% respectively. In terms of speed, the proposed algorithm reaches 37.18 FPS compared with 26.75 FPS for SAMF, an improvement of about 40%. The algorithm therefore significantly improves tracking speed while also improving tracking precision.
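The distance precision metric used in this comparison can be sketched directly; the function name and array layout below are illustrative (centers are given as (x, y) per frame):

```python
import numpy as np

def distance_precision(pred_centers, gt_centers, threshold=20.0):
    """Fraction of frames whose center location error (Euclidean
    distance between predicted and ground-truth centers) is below
    the threshold; 20 px is the threshold used in the comparison."""
    pred = np.asarray(pred_centers, dtype=float)
    gt = np.asarray(gt_centers, dtype=float)
    cle = np.linalg.norm(pred - gt, axis=1)   # per-frame center error
    return float(np.mean(cle < threshold))
```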
Experiment two, center position error comparison.
The center location error (CLE) is the Euclidean distance between the detected center position and the labeled ground-truth position.
The effect of the present invention will be further described with reference to fig. 3, 4 and 5.
Referring to fig. 3, the present invention is compared with the base algorithm SAMF on center location error (CLE) on the video ClifBar. Before frame 220, although the CLE of SAMF is slightly better than that of the proposed algorithm, the CLE of the proposed algorithm exceeds 20 in only about 10 frames and never exceeds 30. From frame 220 on, the CLE of SAMF clearly jumps and the algorithm drifts, while the CLE of the proposed algorithm stays below 20, showing that it keeps tracking the target. The overall CLE of the proposed algorithm is clearly superior to SAMF.
Referring to fig. 4, the present invention is compared with the base algorithm SAMF on CLE on the video Human6. The CLE of both algorithms starts rising at frame 490 and falls below 20 around frame 540. Although the peak CLE of the proposed algorithm is slightly higher than SAMF's during this period, after frame 540 the proposed algorithm keeps the CLE below 20, i.e. it tracks the target correctly, whereas the CLE of SAMF rises significantly and never decreases again, i.e. SAMF loses the target after frame 540 and tracking fails. Over the whole video, the tracking error of the proposed algorithm is clearly smaller than that of SAMF.
Referring to fig. 5, the present invention is compared with the base algorithm SAMF on CLE on the video Soccer. The CLE of both algorithms hovers around 20 between frames 50 and 120, indicating that the target is severely disturbed at this stage. From frame 120 on, the CLE of SAMF rises significantly above 70 and mostly oscillates between 150 and 200, i.e. the tracker has completely lost the target and tracking fails. By contrast, after frame 120 the CLE of the proposed algorithm stays below 20 except for the last 15 frames, i.e. it tracks the target correctly on most frames. The performance of the proposed algorithm is clearly superior to SAMF.
The invention reduces the 7 scale factors of the SAMF algorithm to 3, designing three scale factors centered on the current scale. The maxima of the response maps corresponding to the three scale factors are judged with the same detection method as SAMF to obtain the direction of the current scale change. An adaptive scale coefficient is designed according to the scale change trend and the current tracking reliability, giving an adaptive scale increment; the current adaptive scale is obtained from the scale change direction, and the scale with the larger response value among the adaptive scale coefficient and the recorded scale is taken as the optimal scale.
The above disclosure is only for a few specific embodiments of the present invention, however, the present invention is not limited to the above embodiments, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present invention.

Claims (3)

1. A method for quickly estimating target scale in target tracking comprises the following steps:
s1, inputting a frame to be processed of an image;
s2, preprocessing a current frame; if the diagonal pixel distance of the target is greater than 100, reducing the original image to half its size, and halving the size and position of the target accordingly;
s3, acquiring gradient direction characteristics HOG, color characteristics CN, gray characteristics and the current target position of the tracked target in the current frame;
s4, training the tracker with the SAMF algorithm according to the gradient direction feature HOG, the color feature CN, the gray feature, and the current target position of the tracked target, obtaining a model parameter M_n;
if the current frame is the first frame image, the tracker is trained directly with the SAMF algorithm to obtain the model parameter M_n, the next frame of image is input, and the method goes to step S2;
s5, calculating the optimal scale s_0:
S5.1, presetting three scale factors with the current scale 1 as the center and a as the step size: (1-a, 1, 1+a);
S5.2, using the model parameter M_n, calculating the response maps corresponding to the three preset scale factors, recording the response-map peaks as V_M(1-a), V_M(1), V_M(1+a), and recording the scale s whose peak is the largest of the three;
S5.3, judging whether the direction of the current scale change is reduced, unchanged or increased according to the position of the maximum value in the peak value of the response diagram;
s5.4, calculating the average peak-to-correlation energy (APCE) of the current frame and the previous frame, and obtaining the tracking reliability variation β of the current target scene from the change in APCE between the two frames, namely β = |V_APCE(n) - V_APCE(n-1)|;
S5.5, when the tracking reliability variation β is larger, the current target scene is more complex or the target undergoes a large scale change, so the scale factor change is made proportional to the reliability change, giving the adaptive scale increment δ, namely
δ = β / c
where c is a scale adjustment coefficient;
S5.6, if the scale s = 1, the optimal scale s_0 = 1; go to step S6;
S5.7, if the scale s = 1+a or s = 1-a, the current adaptive scale coefficient γ is obtained as:
γ = s + δ if s = 1+a, or γ = s - δ if s = 1-a;
S5.8, calculating the peaks R_s and R_γ of the filter response maps corresponding to the adaptive scale coefficient γ and the scale s;
S5.9, taking the scale corresponding to the larger of R_s and R_γ as the optimal scale s_0, i.e. s_0 = argmax over {s, γ} of {R_s, R_γ};
S6, acquiring the target position at the optimal scale s_0 as the current target position;
s7, training the tracker according to the optimal scale and the current target position to obtain a model parameter M_{n+1};
S8, updating the model to obtain a new model M_new: M_new = ηM_{n+1} + (1-η)M_n, where η is the set update rate;
and S9, if the current frame is the last frame, ending, otherwise, inputting the next frame and turning to the step S2.
2. The method as claimed in claim 1, wherein the preprocessing of the current frame in step S2 includes size limitation, windowing, region expansion, and image block division, the windowing expanding the initially given target window by a factor of 1.5 and applying a cosine window.
3. The method for fast target scale estimation in target tracking as claimed in claim 1, wherein in step S5.3, when the maximum of the response-map peaks is V_M(1-a), the current scale direction is judged to be decreasing; when the maximum is V_M(1), the current scale direction is judged to be unchanged; when the maximum is V_M(1+a), the current scale direction is judged to be increasing.
CN201911366789.1A 2019-12-26 2019-12-26 Rapid target scale estimation method in target tracking Active CN111028268B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911366789.1A CN111028268B (en) 2019-12-26 2019-12-26 Rapid target scale estimation method in target tracking

Publications (2)

Publication Number Publication Date
CN111028268A CN111028268A (en) 2020-04-17
CN111028268B true CN111028268B (en) 2023-02-24

Family

ID=70213831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911366789.1A Active CN111028268B (en) 2019-12-26 2019-12-26 Rapid target scale estimation method in target tracking

Country Status (1)

Country Link
CN (1) CN111028268B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112435280A (en) * 2020-11-13 2021-03-02 桂林电子科技大学 Moving target detection and tracking method for unmanned aerial vehicle video

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013020616A (en) * 2011-07-07 2013-01-31 Ricoh Co Ltd Object tracking method and object tracking device
CN107481264A (en) * 2017-08-11 2017-12-15 江南大学 A kind of video target tracking method of adaptive scale
CN109146917A (en) * 2017-12-29 2019-01-04 西安电子科技大学 A kind of method for tracking target of elasticity more new strategy
CN109146928A (en) * 2017-12-29 2019-01-04 西安电子科技大学 A kind of method for tracking target that Grads threshold judgment models update
CN110163090A (en) * 2019-04-11 2019-08-23 江苏大学 It is a kind of that tracking is identified based on the pcb board of multiple features and size estimation


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Baoyi Ge. Adaptive Feature Fusion Object Tracking with Kernelized Correlation Filters. CNKI, 2018-05-25. Full text. *
Ma Xiaohong. A Survey of Target Tracking Methods Based on Correlation Filters. Application of Electronic Technique (《电子技术应用》), 2018-06-30. Full text. *

Also Published As

Publication number Publication date
CN111028268A (en) 2020-04-17

Similar Documents

Publication Publication Date Title
KR102275452B1 (en) Method for tracking image in real time considering both color and shape at the same time and apparatus therefor
WO2020107716A1 (en) Target image segmentation method and apparatus, and device
JP3899523B2 (en) Image similarity calculation system and image search system
CN109146917B (en) Target tracking method for elastic updating strategy
CN110120065B (en) Target tracking method and system based on hierarchical convolution characteristics and scale self-adaptive kernel correlation filtering
CN111582349B (en) Improved target tracking algorithm based on YOLOv3 and kernel correlation filtering
CN107730536B (en) High-speed correlation filtering object tracking method based on depth features
CN111340842B (en) Correlation filtering target tracking method based on joint model
CN111583294B (en) Target tracking method combining scale self-adaption and model updating
CN108364305B (en) Vehicle-mounted camera video target tracking method based on improved DSST
CN110992401A (en) Target tracking method and device, computer equipment and storage medium
CN109087337B (en) Long-time target tracking method and system based on hierarchical convolution characteristics
CN111008991A (en) Background perception related filtering target tracking method
CN111028268B (en) Rapid target scale estimation method in target tracking
CN109146928B (en) Target tracking method for updating gradient threshold judgment model
CN112258557A (en) Visual tracking method based on space attention feature aggregation
US8537212B2 (en) Recording apparatus and recording method thereof
CN110660077A (en) Multi-scale target tracking method fusing multiple features
CN117218161B (en) Fish track tracking method and system in fish tank
CN110706254B (en) Target tracking template self-adaptive updating method
CN110598614B (en) Related filtering target tracking method combined with particle filtering
CN113538509B (en) Visual tracking method and device based on adaptive correlation filtering feature fusion learning
CN111145216A (en) Tracking method of video image target
CN113129332A (en) Method and apparatus for performing target object tracking
CN115511918A (en) Target tracking method and device based on parallel processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant