CN111105441A - Correlation filtering target tracking algorithm constrained by previous-frame target information - Google Patents
- Publication number
- CN111105441A (application number CN201911250358.9A)
- Authority
- CN
- China
- Prior art keywords
- target
- frame
- formula
- tracking
- filter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T2207/10016—Video; Image sequence
- G06T2207/20024—Filtering details
- G06T2207/20056—Discrete and fast Fourier transform, [DFT, FFT]
- G06T2207/20081—Training; Learning
Abstract
The invention discloses a correlation filtering target tracking algorithm constrained by previous-frame target information, belonging to the technical field of tracking algorithms. The technical key points comprise the following steps: (1) in the 1st frame of a video, select a target as the tracking object by drawing a rectangular box with a mouse or by a target recognition method, thereby obtaining the state of the target in the 1st frame; extract multi-channel features of the search area, taking the target midpoint as the center and 4 times the target size as the scale; (2) train a filter f from the features obtained in the 1st frame, judge the target state of the 2nd frame according to f, update the filter f according to the new target state of the 2nd frame, judge the target state of the 3rd frame, and so on until the last frame of the video stream. The invention aims to provide a correlation filtering target tracking algorithm, constrained by previous-frame target information, that effectively improves the accuracy and robustness of the algorithm. The method is applicable to fields such as automatic driving, robot control and human-machine interaction.
Description
Technical Field
The present invention relates to tracking algorithms, and more particularly to a correlation filtering target tracking algorithm constrained by previous-frame target information.
Background
A target tracking algorithm tracks the state of a target throughout a video using only the target information obtained from the first frame, and has considerable application value in fields such as automatic driving, robot control and human-machine interaction. The main difficulties of target tracking are that only one target sample is available from the first frame, with no large sample set to learn from, and that during tracking the target may deform, change scale, rotate, become occluded, undergo illumination change or motion blur, and be affected by complex backgrounds, similar targets and the like. Target tracking algorithms therefore generally suffer from low accuracy and high computational cost.
Disclosure of Invention
In view of the shortcomings of the prior art, the invention aims to provide a correlation filtering target tracking algorithm, constrained by previous-frame target information, that effectively improves the accuracy and robustness of the algorithm.
The technical scheme of the invention is realized as follows: a correlation filtering target tracking algorithm constrained by previous-frame target information, the method comprising the steps of:
(1) in the 1st frame of a video, select a target as the tracking object by drawing a rectangular box with a mouse or by a target recognition method, thereby obtaining the state of the target in the 1st frame; extract multi-channel features of the search area, taking the target midpoint as the center and 4 times the target size as the scale;
(2) train a filter f from the features obtained in the 1st frame, judge the target state of the 2nd frame according to f, update the filter f according to the new target state of the 2nd frame, judge the target state of the 3rd frame, and so on until the last frame of the video stream;
wherein the target equation for training the filter f is given as formula (1);
the filter is trained from the multi-channel features thus obtained. In formula (1), the 1st term is the main training term, whose aim is that the convolution of the filter with the search area has its maximum response value at the target center, with the response value decreasing farther from the target center; the 2nd term is an edge-suppression term, which reduces the influence of background change on the tracking result and at the same time suppresses the boundary effect during tracking; the 3rd term is an inverse constraint term, which forces the filter f to effectively express the results of previous frames; the 4th term is an inter-frame constraint term, which reduces the impact on the filter when the target disappears or is lost. Here ⊛ denotes convolution, ⊙ denotes the element-wise (dot) product of matrices, K is the number of feature channels, x_t^k is the k-th channel feature of the search range in the t-th frame, f^k is the filter of the k-th channel, y is the Gaussian-distributed label matrix, t is the video frame number, x_pre is the search-range sample information obtained from previous frames, and λ1 and λ2 are the weighting factors of the 3rd and 4th terms of the target equation.
In the above correlation filtering target tracking algorithm constrained by previous-frame target information, in order to reduce the computational cost of the convolution, the alternating direction method of multipliers (ADMM) is used to split formula (1) into two variables that are solved alternately: an auxiliary variable h is introduced in place of f, and a constraint term enforcing their consistency (with Lagrange multiplier α and weight λ3) is added, so formula (1) is extended into formula (2).
Letting l = α/λ3, formula (2) is equivalent to formula (3).
For formula (3), since frame 1 has no x_pre and no previous filter, λ1 and λ2 are set to 0 when computing the filter f of frame 1.
In the above correlation filtering target tracking algorithm constrained by previous-frame target information, the alternating direction method of multipliers splits formula (3) into 3 sub-problems that are solved iteratively:
Sub-problem 1 is given as formula (4),
wherein i is the iteration number;
for sub-problem 1, formula (4) is transferred to the frequency domain using the fast Fourier transform, where the convolution becomes a point-wise product, and f is obtained by setting the derivative to 0;
Sub-problem 2 is given as formula (5);
for sub-problem 2, formula (5) is converted into formula (6),
where h = [(h^1)^T, (h^2)^T, ..., (h^K)^T]^T, W is a block-diagonal square matrix composed of K copies of w, and h is obtained by setting the derivative to 0.
Sub-problem 3:
l^(i+1) = l^(i) + f^(i+1) − g^(i+1)    (7)
Sub-problem 3 can be solved directly.
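The three sub-problem updates above can be sketched numerically for a single feature channel (K = 1). In the sketch below, the spatially regularized solve of formula (6) is replaced by a simple element-wise shrinkage so that every step has a closed form; the parameter names `lam3` and `lam_w` and that simplification are assumptions of this illustration, not the patent's exact formulation.

```python
import numpy as np

def admm_train_filter(x, y, lam3=1.0, lam_w=0.01, n_iters=5):
    """Single-channel ADMM sketch of the three sub-problems.

    x : 2-D feature patch of the search area (spatial domain)
    y : Gaussian label matrix, same shape as x

    Sub-problem 1 (f): closed-form element-wise solve in the frequency
      domain, where the convolution becomes a point-wise product.
    Sub-problem 2 (g): element-wise shrinkage standing in for the
      regularized solve of formula (6).
    Sub-problem 3 (l): the dual update of formula (7).
    Returns the filter in the frequency domain.
    """
    xf = np.fft.fft2(x)
    yf = np.fft.fft2(y)
    g = np.zeros_like(xf)
    l = np.zeros_like(xf)
    for _ in range(n_iters):
        # Sub-problem 1: d/df [ |xf*f - yf|^2 + (lam3/2)|f - g + l|^2 ] = 0
        f = (np.conj(xf) * yf + 0.5 * lam3 * (g - l)) / \
            (np.abs(xf) ** 2 + 0.5 * lam3)
        # Sub-problem 2: shrinkage toward f + l (stand-in for formula (6))
        g = lam3 * (f + l) / (lam_w + lam3)
        # Sub-problem 3: dual update, formula (7)
        l = l + f - g
    return f

# Toy check: a delta target at (3, 4) with a Gaussian label centered there.
x = np.zeros((8, 8)); x[3, 4] = 1.0
ii, jj = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
y = np.exp(-((ii - 3) ** 2 + (jj - 4) ** 2) / 2.0)
f = admm_train_filter(x, y)
response = np.real(np.fft.ifft2(np.fft.fft2(x) * f))
print(np.unravel_index(np.argmax(response), response.shape))  # peak at the target
```

Because each update is element-wise in the frequency domain, the cost per iteration stays linear in the number of pixels, which is the point of the ADMM split.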
In the above correlation filtering target tracking algorithm constrained by previous-frame target information, x_pre in formula (3) is updated every n frames to represent the principal information obtained from the previous frames, as given in formula (8). Let x̄_m be
the mean vector of the samples obtained in the previous m frames, and x̄_n the mean vector of the samples obtained in the latest n frames.
Since the target appearance may change during tracking, a forgetting factor q is introduced in formula (8) so that x_pre is biased toward the more recently obtained samples.
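One plausible reading of the update just described is a convex combination of the two mean vectors, biased toward the recent samples by the forgetting factor q; the exact weighting of formula (8) is not reproduced in the text, so the sketch below (including the function and variable names) is an assumption.

```python
import numpy as np

def update_x_pre(mean_prev, mean_recent, q=0.2):
    """Blend the mean vector of the samples from the previous m frames
    with the mean vector of the latest n frames, biased toward the
    recent samples by the forgetting factor q (a convex combination,
    standing in for formula (8))."""
    return (1.0 - q) * mean_prev + q * mean_recent

x_pre = np.zeros(4)
for mean_recent in (np.ones(4), 2 * np.ones(4)):
    x_pre = update_x_pre(x_pre, mean_recent, q=0.5)
print(x_pre)  # drifts toward the most recent sample means
```

A larger q makes x_pre follow appearance changes faster at the cost of forgetting older, possibly cleaner, target information sooner.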
In the above correlation filtering target tracking algorithm constrained by previous-frame target information, after the filter f of the t-th frame is obtained, the target state of the (t+1)-th frame is determined using f. The search range of the t-th frame is sampled first; to allow the tracking to adapt to changes in target scale, several different scales are sampled, and the j search ranges of different scales are denoted C = [C_1, C_2, ..., C_j].
The response value of the i-th search range is given by formula (9).
In formula (9), ^ denotes the fast Fourier transform and ifft2 the 2-D inverse fast Fourier transform, and response is the response value. To improve tracking accuracy, the response map is interpolated by Newton's interpolation method; after interpolation, the maximum among the response values of the different scales gives the tracking result of the (t+1)-th frame, and the scale at which this maximum occurs gives the scale of the result.
In the above correlation filtering target tracking algorithm constrained by previous-frame target information, in step (1), the state of the target in the 1st frame consists specifically of the midpoint coordinates and the scale parameters of the target in the 1st frame.
In the above correlation filtering target tracking algorithm constrained by previous-frame target information, in step (1), the multi-channel features are HOG features and color-space features.
In the above correlation filtering target tracking algorithm constrained by previous-frame target information, x_pre in formula (3) may instead be obtained by the incremental principal component analysis method.
Suppose the previous n frames yield n tracking results A = [a_1, a_2, ..., a_n], with singular value decomposition A = UDV^T, and let B = [b_1, b_2, ..., b_n] be the newly obtained tracking results. With the traditional principal component analysis, the singular value decomposition of [A, B] would have to be recomputed, and the amount of calculation would grow with the number of frames; the invention therefore uses the incremental principal component analysis method to obtain x_pre, avoiding this growth.
As shown in formula (10), [A, B] is factored as [U, B′]R, where B′ is the principal component of B orthogonal to U; R and B′ are obtained by QR decomposition, i.e.
[U, B′]R = [A, B]    (11)
Let the singular value decomposition of R be R = U′D′V′^T; then the singular value decomposition of [A, B] follows directly.
It can be seen that the computation of the principal component analysis does not grow with the number of frames.
After the above algorithm is adopted, an inverse constraint term is added to the target equation of the filter; this term forces the filter f to effectively express the results of previous frames, so the robustness of the tracking algorithm is improved. Further preferably, x_pre is solved using a forgetting factor q, so that the tracking algorithm can adapt effectively to changes in target appearance. Alternatively, x_pre can be obtained by the incremental principal component analysis method. With this method, the principal information in the accumulated tracking results is extracted more effectively, the tracking algorithm is more robust, and the amount of calculation is guaranteed not to grow with the number of frames.
The traditional correlation filtering target tracking algorithm trains the filter on the tracking result of the current frame alone; when the appearance of the target changes drastically, the filter easily overfits, causing subsequent tracking failure.
Drawings
The invention will be described in further detail with reference to the embodiments shown in the drawings, which are not intended to limit the invention in any way.
Fig. 1 is a schematic diagram of solving the filter f by the alternating direction method of multipliers in embodiment 1 of the present invention;
FIG. 2 is a schematic diagram of the principle of a preferred algorithm adopted to enable target tracking to adapt to the change of target scale in embodiment 1 of the present invention;
FIG. 3(a) is a schematic view showing the positioning-error curves of the comparative experiment of embodiment 1 of the present invention;
FIG. 3(b) is a graph showing the overlap ratio of the tracking result S1 with the ground-truth result S0 in the comparative experiment of embodiment 1.
Detailed Description
Example 1
The invention relates to a correlation filtering target tracking algorithm constrained by previous-frame target information, comprising the following steps:
(1) In the 1st frame of a video, select a target as the tracking object by drawing a rectangular box with a mouse or by a target recognition method, thereby obtaining the state of the target in the 1st frame; extract multi-channel features of the search area, taking the target midpoint as the center and 4 times the target size as the scale. The multi-channel features are HOG features and color-space features. The state of the target in the 1st frame consists specifically of the midpoint coordinates and the scale parameters of the target in the 1st frame.
(2) Train the filter f from the features obtained in the 1st frame, judge the target state of the 2nd frame according to f, update the filter f according to the new target state of the 2nd frame, judge the target state of the 3rd frame, and so on until the last frame of the video stream.
Wherein the target equation for training the filter f is given as formula (1);
the filter is trained from the multi-channel features thus obtained. In formula (1), the 1st term is the main training term, whose aim is that the convolution of the filter with the search area has its maximum response value at the target center, with the response value decreasing farther from the target center; the 2nd term is an edge-suppression term, which reduces the influence of background change on the tracking result and at the same time suppresses the boundary effect during tracking; the 3rd term is an inverse constraint term, which forces the filter f to effectively express the results of previous frames; the 4th term is an inter-frame constraint term, which reduces the impact on the filter when the target disappears or is lost. Here ⊛ denotes convolution, ⊙ denotes the element-wise (dot) product of matrices, K is the number of feature channels, x_t^k is the k-th channel feature of the search range in the t-th frame, f^k is the filter of the k-th channel, y is the Gaussian-distributed label matrix, t is the video frame number, x_pre is the search-range sample information obtained from previous frames, and λ1 and λ2 are the weighting factors of the 3rd and 4th terms of the target equation.
Preferably, in order to reduce the computational cost of the convolution, the alternating direction method of multipliers (ADMM) is used to split formula (1) into two variables that are solved alternately: an auxiliary variable h is introduced in place of f, and a constraint term enforcing their consistency (with Lagrange multiplier α and weight λ3) is added, so formula (1) is extended into formula (2).
Letting l = α/λ3, formula (2) is equivalent to formula (3).
For formula (3), since frame 1 has no x_pre and no previous filter, λ1 and λ2 are set to 0 when computing the filter f of frame 1.
Further preferably, the alternating direction method of multipliers splits formula (3) into 3 sub-problems that are solved iteratively:
Sub-problem 1 is given as formula (4),
wherein i is the iteration number;
for sub-problem 1, formula (4) is transferred to the frequency domain using the fast Fourier transform, where the convolution becomes a point-wise product, and f is obtained by setting the derivative to 0;
Sub-problem 2 is given as formula (5);
for sub-problem 2, formula (5) is converted into formula (6),
where h = [(h^1)^T, (h^2)^T, ..., (h^K)^T]^T, W is a block-diagonal square matrix composed of K copies of w, and h is obtained by setting the derivative to 0.
Sub-problem 3:
l^(i+1) = l^(i) + f^(i+1) − g^(i+1)    (7)
Sub-problem 3 can be solved directly.
Preferably, x_pre in formula (3) is updated every n frames to represent the principal information obtained from the previous frames, as given in formula (8). Let x̄_m be
the mean vector of the samples obtained in the previous m frames, and x̄_n the mean vector of the samples obtained in the latest n frames.
Since the target appearance may change during tracking, a forgetting factor q is introduced in formula (8) so that x_pre is biased toward the more recently obtained samples. A schematic diagram of this step of the algorithm is shown in Fig. 1.
Preferably, after the filter f of the t-th frame is obtained, f is used to determine the target state of the (t+1)-th frame. The search range of the t-th frame is sampled first; to allow the tracking to adapt to changes in target scale, several different scales are sampled, and the j search ranges of different scales are denoted C = [C_1, C_2, ..., C_j];
the response value of the i-th search range is given by formula (9);
in formula (9), ^ denotes the fast Fourier transform and ifft2 the 2-D inverse fast Fourier transform, and response is the response value. To improve tracking accuracy, the response map is interpolated by Newton's interpolation method; after interpolation, the maximum among the response values of the different scales gives the tracking result of the (t+1)-th frame, and the scale at which this maximum occurs gives the scale of the result. A schematic diagram of this step of the algorithm is shown in Fig. 2.
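The multi-scale detection step can be sketched as follows. The sketch evaluates the response of formula (9) for each rescaled sample and keeps the scale with the largest peak; the Newton interpolation of the response map is omitted (a raw arg-max is used instead), only a single feature channel is shown, and the function and variable names are illustrative.

```python
import numpy as np

def detect(f_freq, scales):
    """Evaluate formula (9) for each rescaled search range and keep the
    scale whose response peak is largest.  `f_freq` is the trained filter
    in the frequency domain; `scales` is a list of equally sized feature
    patches sampled at different scales."""
    best = None
    for s, patch in enumerate(scales):
        # response_i = ifft2( patch_hat . f_hat ), real part
        resp = np.real(np.fft.ifft2(np.fft.fft2(patch) * f_freq))
        peak = resp.max()
        if best is None or peak > best[0]:
            pos = np.unravel_index(np.argmax(resp), resp.shape)
            best = (peak, s, pos)
    return best  # (peak value, winning scale index, peak position)

# Toy check: a matched filter prefers the candidate that fits the template.
t = np.zeros((8, 8)); t[2, 2] = 1.0
f_freq = np.conj(np.fft.fft2(t))   # matched filter for the template
weak = 0.5 * t                     # candidate at a poorer scale
strong = t                         # candidate matching the target scale
peak, scale_idx, pos = detect(f_freq, [weak, strong])
print(scale_idx)  # the better-matching scale wins
```

In a full implementation the peak position would then be refined by interpolating the response map around the arg-max, as the description's Newton interpolation step does.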
Example 2
The steps of the correlation filtering target tracking algorithm constrained by previous-frame target information are basically the same as in embodiment 1, except that x_pre is obtained by the incremental principal component analysis method.
Suppose the previous n frames yield n tracking results A = [a_1, a_2, ..., a_n], with singular value decomposition A = UDV^T, and let B = [b_1, b_2, ..., b_n] be the newly obtained tracking results. With the traditional principal component analysis, the singular value decomposition of [A, B] would have to be recomputed, and the amount of calculation would grow with the number of frames; the invention therefore uses the incremental principal component analysis method to obtain x_pre, avoiding this growth.
As shown in formula (10), [A, B] is factored as [U, B′]R, where B′ is the principal component of B orthogonal to U; R and B′ are obtained by QR decomposition, i.e.
[U, B′]R = [A, B]    (11)
Let the singular value decomposition of R be R = U′D′V′^T; then the singular value decomposition of [A, B] follows directly.
It can be seen that the computation of the principal component analysis does not grow with the number of frames.
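The incremental update can be sketched with the standard incremental-SVD recipe, which matches the structure described above (project the new block B onto U, orthogonalize the residual by QR, then decompose a small middle matrix whose size does not grow with the frame count). The exact formulas (10) and (11) are not fully reproduced in the text, so the details of this sketch are assumptions.

```python
import numpy as np

def incremental_svd(U, D, B):
    """Update the left factors (U, diag(D)) of A when a new block of
    tracking results B arrives, without re-decomposing [A, B]."""
    proj = U.T @ B                      # component of B explained by U
    resid = B - U @ proj                # component orthogonal to U
    Q, Rr = np.linalg.qr(resid)         # B' (orthonormal) and its coefficients
    k = len(D)
    # Small middle matrix [[D, U^T B], [0, B'^T resid]]; its singular values
    # equal those of [A, B] because the omitted V^T factor is orthogonal.
    R = np.block([[np.diag(D), proj],
                  [np.zeros((Q.shape[1], k)), Rr]])
    Up, Dp, _ = np.linalg.svd(R, full_matrices=False)
    return np.hstack([U, Q]) @ Up, Dp   # updated left basis and singular values

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))         # old tracking results
B = rng.standard_normal((6, 2))         # newly obtained tracking results
U, D, _ = np.linalg.svd(A, full_matrices=False)
U2, D2 = incremental_svd(U, D, B)
full = np.linalg.svd(np.hstack([A, B]), compute_uv=False)
print(np.allclose(D2, full))  # same spectrum as a full re-decomposition
```

Only the small matrix R (whose size depends on the number of retained components and the block size, not on the total frame count) is decomposed, which is exactly the claimed advantage.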
With this method, the principal information in the accumulated tracking results is extracted more effectively, the tracking algorithm is more robust, and the amount of calculation is guaranteed not to grow with the number of frames.
The algorithm flow of the invention is as follows:
1. Acquire the target state of frame 1, determine the search area, and extract the multi-channel features of the search area;
2. Establish the target equation by formula (1) (with λ1 = λ2 = 0) and compute the filter f by formulas (3) to (8);
3. For t = 2 to end (t is the video frame number)
4. Begin
5. For i = 1 to end (i is the iteration number)
6. Begin
8. Compute f^(i+1) by solving formula (4)
9. Compute h^(i+1) by solving formula (6)
10. Compute l^(i+1) by formula (7)
11. End
12. Extract the features of the multi-scale search-range samples of frame t+1
13. Obtain the response of each scale's search range by formula (9)
14. Take the maximum response point, found by Newton's interpolation, as the tracking result of frame t+1
15. End
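The flow above can be sketched end to end. The sketch below keeps the control flow but collapses the ADMM solve and Newton interpolation into a simple matched-filter update and a raw arg-max, so the mathematics is a stand-in for the patent's filter, not a faithful implementation; boundary handling and scale search are also omitted, and all names are illustrative.

```python
import numpy as np

def track(frames, init_box):
    """Skeleton of the flow above with stand-in maths.  `frames` is a list
    of 2-D arrays; `init_box` is (row, col, h, w) from frame 1."""
    r, c, h, w = init_box
    template = frames[0][r:r + h, c:c + w]        # step 1: features of frame 1
    f_freq = np.conj(np.fft.fft2(template))       # stand-in for steps 2, 5-11
    boxes = [init_box]
    for frame in frames[1:]:                      # steps 3-4: t = 2 .. end
        patch = frame[r:r + h, c:c + w]           # step 12 (single scale here)
        resp = np.real(np.fft.ifft2(np.fft.fft2(patch) * f_freq))  # step 13
        dy, dx = np.unravel_index(np.argmax(resp), resp.shape)     # step 14
        # circular shifts beyond half the patch are negative displacements
        dy = dy - h if dy > h // 2 else dy
        dx = dx - w if dx > w // 2 else dx
        r, c = r + dy, c + dx
        boxes.append((r, c, h, w))
        template = frame[r:r + h, c:c + w]        # update filter for next frame
        f_freq = np.conj(np.fft.fft2(template))
    return boxes

frames = []
for t in range(3):
    fr = np.zeros((10, 10)); fr[2 + t, 2 + t] = 1.0   # a dot drifting diagonally
    frames.append(fr)
print(track(frames, (1, 1, 4, 4))[-1])  # the box follows the dot
```

The per-frame filter refresh in the last two lines of the loop is where the patent's constrained update (formulas (1)-(8)) would slot in.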
Examples of the experiments
The algorithm of the invention was tested on the OTB100 (Object Tracking Benchmark) test set, available at http://cvlab.hanyang.ac.kr/tracker_benchmark/datasets.html. The test set contains 100 public videos for tracking-algorithm testing. The algorithm was also compared with 9 other excellent algorithms, and the experimental results show that it has the highest tracking success rate. The experimental results are shown in Fig. 3: Fig. 3(a) is the positioning-error curve, where the positioning error is the distance in pixels between the center points of the tracking result and the ground-truth result; Fig. 3(b) shows the overlap ratio of the tracking result S1 with the ground-truth result S0, defined as the area of their intersection divided by the area of their union.
As can be seen from Fig. 3, the algorithm of the invention outperforms the other excellent algorithms in both positioning error and tracking success rate.
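The overlap ratio of S1 and S0 used in Fig. 3(b) (under the standard OTB success criterion: area of intersection divided by area of union) can be computed as follows; the (x, y, w, h) box convention is an assumption of this sketch.

```python
def overlap_ratio(box_a, box_b):
    """Overlap (success) score between tracked box S1 and ground-truth S0:
    area of intersection divided by area of union.  Boxes are
    (x, y, w, h) with (x, y) the top-left corner."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))   # intersection width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))   # intersection height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

print(overlap_ratio((0, 0, 4, 4), (2, 2, 4, 4)))  # 4 / 28
```

A frame counts as a success when this ratio exceeds a chosen threshold, and sweeping the threshold produces the success curve of Fig. 3(b).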
The above embodiments are provided for convenience of description only and are not intended to limit the present invention in any way; those skilled in the art will understand that the technical features of the present invention may be altered or modified in certain ways without departing from the scope of the present invention.
Claims (8)
1. A correlation filtering target tracking algorithm constrained by previous-frame target information, the method comprising the steps of:
(1) in the 1st frame of a video, select a target as the tracking object by drawing a rectangular box with a mouse or by a target recognition method, thereby obtaining the state of the target in the 1st frame; extract multi-channel features of the search area, taking the target midpoint as the center and 4 times the target size as the scale;
(2) train a filter f from the features obtained in the 1st frame, judge the target state of the 2nd frame according to f, update the filter f according to the new target state of the 2nd frame, judge the target state of the 3rd frame, and so on until the last frame of the video stream;
wherein the target equation for training the filter f is given as formula (1);
the filter is trained from the multi-channel features thus obtained. In formula (1), the 1st term is the main training term, whose aim is that the convolution of the filter with the search area has its maximum response value at the target center, with the response value decreasing farther from the target center; the 2nd term is an edge-suppression term, which reduces the influence of background change on the tracking result and at the same time suppresses the boundary effect during tracking; the 3rd term is an inverse constraint term, which forces the filter f to effectively express the results of previous frames; the 4th term is an inter-frame constraint term, which reduces the impact on the filter when the target disappears or is lost. Here ⊛ denotes convolution, ⊙ denotes the element-wise (dot) product of matrices, K is the number of feature channels, x_t^k is the k-th channel feature of the search range in the t-th frame, f^k is the filter of the k-th channel, y is the Gaussian-distributed label matrix, t is the video frame number, x_pre is the search-range sample information obtained from previous frames, and λ1 and λ2 are the weighting factors of the 3rd and 4th terms of the target equation.
2. The correlation filtering target tracking algorithm constrained by previous-frame target information according to claim 1, wherein, in order to reduce the computational cost of the convolution, the alternating direction method of multipliers (ADMM) is used to split formula (1) into two variables that are solved alternately: an auxiliary variable h is introduced in place of f, and a constraint term enforcing their consistency (with Lagrange multiplier α and weight λ3) is added, so formula (1) is extended into formula (2);
letting l = α/λ3, formula (2) is equivalent to formula (3).
3. The correlation filtering target tracking algorithm constrained by previous-frame target information according to claim 2, wherein the alternating direction method of multipliers splits formula (3) into 3 sub-problems that are solved iteratively:
Sub-problem 1 is given as formula (4),
wherein i is the iteration number;
for sub-problem 1, formula (4) is transferred to the frequency domain using the fast Fourier transform, where the convolution becomes a point-wise product, and f is obtained by setting the derivative to 0;
Sub-problem 2 is given as formula (5);
for sub-problem 2, formula (5) is converted into formula (6),
where h = [(h^1)^T, (h^2)^T, ..., (h^K)^T]^T, W is a block-diagonal square matrix composed of K copies of w, and h is obtained by setting the derivative to 0;
Sub-problem 3:
l^(i+1) = l^(i) + f^(i+1) − g^(i+1)    (7)
Sub-problem 3 can be solved directly.
4. The correlation filtering target tracking algorithm constrained by previous-frame target information according to claim 2, wherein x_pre in formula (3) is updated every n frames to represent the principal information obtained from the previous frames, as given in formula (8), where x̄_m is
the mean vector of the samples obtained in the previous m frames and x̄_n is the mean vector of the samples obtained in the latest n frames.
5. The correlation filtering target tracking algorithm constrained by previous-frame target information according to claim 2 or 3, wherein after the filter f of the t-th frame is obtained, f is used to determine the target state of the (t+1)-th frame: the search range of the t-th frame is sampled first; to allow the tracking to adapt to changes in target scale, several different scales are sampled, and the j search ranges of different scales are denoted C = [C_1, C_2, ..., C_j];
the response value of the i-th search range is given by formula (9);
in formula (9), ^ denotes the fast Fourier transform and ifft2 the 2-D inverse fast Fourier transform, and response is the response value; to improve tracking accuracy, the response map is interpolated by Newton's interpolation method; after interpolation, the maximum among the response values of the different scales gives the tracking result of the (t+1)-th frame, and the scale at which this maximum occurs gives the scale of the result.
6. The correlation filtering target tracking algorithm constrained by previous-frame target information according to claim 1, wherein, in step (1), the state of the target in the 1st frame consists specifically of the midpoint coordinates and the scale parameters of the target in the 1st frame.
7. The correlation filtering target tracking algorithm constrained by previous-frame target information according to claim 1, wherein, in step (1), the multi-channel features are HOG features and color-space features.
8. The correlation filtering target tracking algorithm constrained by previous-frame target information according to claim 2, wherein, in formula (3), x_pre is obtained by the incremental principal component analysis method:
suppose the previous n frames yield n tracking results A = [a_1, a_2, ..., a_n], with singular value decomposition A = UDV^T, and let B = [b_1, b_2, ..., b_n] be the newly obtained tracking results; with the traditional principal component analysis, the singular value decomposition of [A, B] would have to be recomputed, and the amount of calculation would grow with the number of frames, so the incremental principal component analysis method is used to obtain x_pre, avoiding this growth;
as shown in formula (10), [A, B] is factored as [U, B′]R, where B′ is the principal component of B orthogonal to U; R and B′ are obtained by QR decomposition, i.e.
[U, B′]R = [A, B]    (11)
let the singular value decomposition of R be R = U′D′V′^T; then the singular value decomposition of [A, B] follows directly;
it can be seen that the computation of the principal component analysis does not grow with the number of frames.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911250358.9A CN111105441B (en) | 2019-12-09 | 2019-12-09 | Correlation filtering target tracking method constrained by previous-frame target information |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111105441A true CN111105441A (en) | 2020-05-05 |
CN111105441B CN111105441B (en) | 2023-05-05 |
Family
ID=70422422
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103413312A (en) * | 2013-08-19 | 2013-11-27 | 华北电力大学 | Video target tracking method based on neighborhood components analysis and scale space theory |
WO2016026370A1 (en) * | 2014-08-22 | 2016-02-25 | Zhejiang Shenghui Lighting Co., Ltd. | High-speed automatic multi-object tracking method and system with kernelized correlation filters |
US20160277646A1 (en) * | 2015-03-17 | 2016-09-22 | Disney Enterprises, Inc. | Automatic device operation and object tracking based on learning of smooth predictors |
CN108647694A (en) * | 2018-04-24 | 2018-10-12 | 武汉大学 | Correlation filtering method for tracking target based on context-aware and automated response |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | GR01 | Patent grant | |
 | TR01 | Transfer of patent right | |
Effective date of registration: 20230725 Address after: 514000 Meixian New District, Meizhou, Guangdong, new town A3A4, 14, 15, and 16, New Zealand Tiantai new town. Patentee after: GUANGDONG JIASHITONG HIGH-TECH CO.,LTD. Address before: NO.160 Meisong Road, Meijiang district, Meizhou City, Guangdong Province 514000 Patentee before: JIAYING University |