CN107154024A - Dimension self-adaption method for tracking target based on depth characteristic core correlation filter - Google Patents
- Publication number
- CN107154024A (application number CN201710355456.3A)
- Authority
- CN
- China
- Prior art keywords
- target
- scale
- map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 59
- 238000012549 training Methods 0.000 claims abstract description 18
- 230000003044 adaptive effect Effects 0.000 claims abstract description 13
- 238000013527 convolutional neural network Methods 0.000 claims abstract description 11
- 230000006870 function Effects 0.000 claims description 16
- 239000011159 matrix material Substances 0.000 claims description 6
- 238000001228 spectrum Methods 0.000 claims description 6
- 238000012935 Averaging Methods 0.000 claims description 4
- 238000007781 pre-processing Methods 0.000 claims description 4
- 230000008859 change Effects 0.000 abstract description 13
- 238000001514 detection method Methods 0.000 abstract description 10
- 238000012986 modification Methods 0.000 abstract 2
- 230000004048 modification Effects 0.000 abstract 2
- 238000010586 diagram Methods 0.000 description 10
- 238000012360 testing method Methods 0.000 description 8
- 238000011156 evaluation Methods 0.000 description 6
- 238000005286 illumination Methods 0.000 description 3
- 238000001914 filtration Methods 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 230000002411 adverse Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 238000013135 deep learning Methods 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000011218 segmentation Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4084—Scaling of whole images or parts thereof, e.g. expanding or contracting in the transform domain, e.g. fast Fourier transform [FFT] domain scaling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/269—Analysis of motion using gradient-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
- G06T7/41—Analysis of texture based on statistical description of texture
- G06T7/44—Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20016—Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Biomedical Technology (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Probability & Statistics with Applications (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a scale-adaptive target tracking method based on a deep-feature kernel correlation filter. The method comprises the following steps: inputting an image into a pre-trained convolutional neural network and extracting deep convolution features; tracking the target, using the trained model to estimate the position and scale of the target; training the kernel correlation filter according to the currently detected target position and scale; and updating the kernel correlation filter with an adaptive high-confidence model update method. By extracting deep convolution features and adopting an adaptive scale estimation method together with an adaptive high-confidence model update strategy, the invention improves the robustness of target tracking in complex scenes and under appearance variation and handles target scale changes efficiently and accurately; in addition, owing to the adaptive high-confidence model update strategy, model tracking drift is reduced as far as possible.
Description
Technical Field
The invention relates to the technical field of computer vision, in particular to a scale self-adaptive target tracking method based on a depth feature kernel correlation filter.
Background
In recent years, with the appearance of large-scale labeled datasets and the improvement of computing power, deep learning methods, particularly convolutional neural networks, have been successfully applied to computer vision tasks such as image classification, object detection, object recognition and semantic segmentation, which is mainly attributed to the strong target representation capability of convolutional neural networks. Unlike traditional image features, deep convolution features are learned from a large amount of image data spanning thousands of classes, so high-level convolution features capture the semantics of a target and are well suited to image classification. However, because the resolution of high-level convolution features is low, they are poorly suited to localizing the target precisely; moreover, because training data are scarce in the first few frames of a tracking sequence, training a deep model from scratch at the start of tracking is difficult.
Recently, discriminative tracking methods based on correlation filters have attracted many researchers because of their efficiency and accuracy. A correlation-filter tracker trains a filter online by regressing the input features to a Gaussian-shaped target response, and in subsequent frames determines the position of the target by finding the peak of the response map output by the filter. Correlation filters apply the fast Fourier transform skillfully, which reduces the computational complexity and greatly increases the tracking speed. However, the kernel correlation filter algorithm uses the traditional histogram-of-oriented-gradients feature, so the tracker drifts easily when the appearance of the target changes; in addition, the algorithm cannot estimate scale changes of the target, and it updates the model by simple linear interpolation, so that when a detection is wrong, this naive update mechanism causes the tracker to drift.
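The Gaussian-shaped target response mentioned above can be illustrated with a short numpy sketch (an illustration, not code from the patent): it builds the label map that correlation-filter trackers regress their input features to, with the peak circularly shifted to index (0, 0), the convention used by KCF-style formulations.

```python
import numpy as np

def gaussian_labels(h, w, sigma=2.0):
    """Gaussian regression target with its peak circularly shifted to (0, 0)."""
    yy, xx = np.meshgrid(np.arange(h) - h // 2, np.arange(w) - w // 2, indexing="ij")
    g = np.exp(-(yy ** 2 + xx ** 2) / (2.0 * sigma ** 2))
    # move the peak from the window centre to index (0, 0)
    return np.roll(g, (-(h // 2), -(w // 2)), axis=(0, 1))

y = gaussian_labels(8, 8)
```

The bandwidth `sigma=2.0` is a placeholder; the patent does not specify it.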
Disclosure of Invention
The invention aims to provide a scale self-adaptive target tracking method based on a depth feature kernel correlation filter, so that the robustness of target tracking of a target in a complex scene and appearance change is improved, the target scale change is efficiently and accurately processed, and model tracking drift caused by error detection is reduced as much as possible.
The technical solution for realizing the purpose of the invention is as follows: a scale self-adaptive target tracking method based on a depth feature kernel correlation filter comprises the following steps:
step 1, inputting the initial position p_0 and scale s_0 of the target, and setting the window size to 2.0 times the target's initial bounding box;
step 2, according to the target position p_{t-1} of the (t-1)-th frame, obtaining a target region x_{t-1} whose size is the window size;
step 3, extracting the deep convolution features of the target region x_{t-1} and performing a fast Fourier transform to obtain the feature map x̂_{t-1}, where ˆ denotes the discrete Fourier transform;
step 4, computing the kernel autocorrelation k̂^{xx} from the feature map x̂_{t-1};
step 5, training the position and scale correlation filters;
step 6, according to the target position p_{t-1} of the (t-1)-th frame, obtaining the candidate region z_t of the target in the t-th frame, whose size is the window size;
step 7, extracting the deep convolution features of the candidate region z_t and performing a fast Fourier transform to obtain the feature map ẑ_t;
step 8, computing the kernel cross-correlation k̂^{xz} from the feature map x̂_{t-1} of the previous frame;
step 9, detecting the positions of the maxima of the position-filter and scale-filter response maps to determine the position p_t and scale s_t of the target in the current frame;
and step 10, updating the kernel correlation filter with the adaptive model update strategy.
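The train/detect cycle of steps 3-9 can be sketched end to end for a single feature channel. The following is a minimal numpy illustration on synthetic data, not the patent's implementation: the multi-layer deep features, the scale filter and the update strategy are omitted, and normalizing the Gaussian-kernel exponent by the number of elements is an assumption.

```python
import numpy as np

def gaussian_labels(h, w, sigma=2.0):
    # Gaussian regression target, peak circularly shifted to (0, 0)
    yy, xx = np.meshgrid(np.arange(h) - h // 2, np.arange(w) - w // 2, indexing="ij")
    g = np.exp(-(yy ** 2 + xx ** 2) / (2.0 * sigma ** 2))
    return np.roll(g, (-(h // 2), -(w // 2)), axis=(0, 1))

def gaussian_kernel_correlation(x, z, sigma=0.5):
    # steps 4/8: kernel (cross-)correlation computed in the Fourier domain
    corr = np.fft.ifft2(np.conj(np.fft.fft2(x)) * np.fft.fft2(z)).real
    d = np.maximum(np.sum(x ** 2) + np.sum(z ** 2) - 2.0 * corr, 0)
    return np.exp(-d / (sigma ** 2 * x.size))   # per-element normalization assumed

def train(x, y, lam=1e-3):
    # step 5: alpha_hat = y_hat / (k_hat^{xx} + lambda)
    k = gaussian_kernel_correlation(x, x)
    return np.fft.fft2(y) / (np.fft.fft2(k) + lam)

def detect(alpha_hat, x_model, z):
    # step 9: response map f = F^{-1}(k_hat^{xz} ⊙ alpha_hat)
    k = gaussian_kernel_correlation(x_model, z)
    return np.fft.ifft2(np.fft.fft2(k) * alpha_hat).real

rng = np.random.default_rng(42)
x = rng.standard_normal((32, 32))       # stand-in for one deep-feature channel
alpha = train(x, gaussian_labels(32, 32))
z = np.roll(x, (3, 5), axis=(0, 1))     # the same target, shifted by (3, 5)
resp = detect(alpha, x, z)
dy, dx = np.unravel_index(resp.argmax(), resp.shape)
```

The peak of `resp` lands at the applied shift, which is exactly how step 9 recovers the target's new position.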
Further, the method for extracting the deep convolution features in step 3 and step 7 specifically includes the following steps:
(3.1) preprocessing: scale the window region I to the 224 × 224 input size specified by the convolutional neural network;
(3.2) feature extraction: extract the feature maps of the 3rd, 4th and 5th convolutional layers of the convolutional neural network;
(3.3) bilinear interpolation: upsample the three extracted convolutional feature maps to the same size.
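Step (3.3) can be illustrated with a small bilinear upsampling routine. The layer sizes 56/28/14 below assume a VGG-style network with 224 × 224 input; the patent does not name the network, so treat these sizes as placeholders.

```python
import numpy as np

def bilinear_resize(f, out_h, out_w):
    """Bilinearly resample a 2-D feature map f to (out_h, out_w)."""
    h, w = f.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    # interpolate along x on the two bracketing rows, then along y
    top = f[np.ix_(y0, x0)] * (1 - wx) + f[np.ix_(y0, x1)] * wx
    bot = f[np.ix_(y1, x0)] * (1 - wx) + f[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# three mock single-channel maps standing in for conv3/conv4/conv5 outputs
rng = np.random.default_rng(1)
layers = [rng.standard_normal((s, s)) for s in (56, 28, 14)]
resized = [bilinear_resize(f, 56, 56) for f in layers]
```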
Further, the computation of the kernel autocorrelation k̂^{xx} in step 4 and of the kernel cross-correlation k̂^{xz} in step 8 is specifically as follows:
(4.1) a Gaussian kernel is used, with the formula:

k(x, x′) = exp(−‖x − x′‖₂² / σ²)

where k(x, x′) denotes the Gaussian kernel computed from the two feature maps x and x′, exp(·) denotes the exponential function, σ is the standard deviation of the Gaussian function with value 0.5, and ‖·‖₂ denotes the 2-norm of a vector or matrix;
(4.2) the kernel correlation is computed with the formula:

k^{xx′} = exp(−(‖x‖₂² + ‖x′‖₂² − 2 F⁻¹(x̂* ⊙ x̂′)) / σ²)

where k^{xx′} denotes the kernel correlation of the feature maps x and x′, F⁻¹ denotes the inverse discrete Fourier transform, * denotes the complex conjugate, ˆ denotes the discrete Fourier transform, and ⊙ denotes element-wise multiplication of the corresponding entries of two matrices.
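The F⁻¹(x̂* ⊙ x̂′) term in formula (4.2) is a circular cross-correlation evaluated in the Fourier domain. The following numpy check (illustrative only) confirms that identity against the naive computation and then forms the Gaussian kernel map; dividing the exponent by the number of elements is an assumption on our part.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
z = rng.standard_normal((8, 8))

# F^{-1}(x_hat^* ⊙ z_hat): circular cross-correlation via the FFT
fast = np.fft.ifft2(np.conj(np.fft.fft2(x)) * np.fft.fft2(z)).real

# naive O(n^4) circular cross-correlation for comparison
slow = np.zeros_like(x)
for dy in range(8):
    for dx in range(8):
        slow[dy, dx] = np.sum(x * np.roll(z, (-dy, -dx), axis=(0, 1)))

# Gaussian kernel map of formula (4.2)
sigma = 0.5
d = np.maximum(np.sum(x ** 2) + np.sum(z ** 2) - 2.0 * fast, 0)
k = np.exp(-d / (sigma ** 2 * x.size))
```

`d` is the squared distance between x and every circular shift of z, so `k` is largest at the shift where the two patches align best.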
Further, the training of the position and scale correlation filters described in step 5 is specifically as follows:
One kernel correlation filter is trained for each layer of the feature map from the deep convolution features extracted in step 3, and the model is trained with the following formula:

α̂^(l) = ŷ / (k̂^{xx} + λ)

where α̂^(l) denotes the correlation filter model obtained from the l-th layer deep convolution feature map x̂^(l), ŷ denotes the discrete Fourier transform of the Gaussian regression target y, λ is a regularization parameter that prevents overfitting of the trained model, and λ takes the value 0.001.
Further, in step 9 the positions of the maxima of the position-filter and scale-filter response maps are detected to determine the position p_t and scale s_t of the target in the current frame, specifically as follows:
(9.1) from the t-th frame image, select the candidate region z_{t,trans} at position p_{t-1} with the window size;
(9.2) extract the 3-layer deep convolution feature maps ẑ_{t,trans}^(l) of the candidate region z_{t,trans};
(9.3) for the l-th layer feature map, compute the position-filter response confidence map f_{t,trans}^(l):

f_{t,trans}^(l) = F⁻¹(k̂^{xz,(l)} ⊙ α̂_{t-1}^(l))

where f_{t,trans}^(l) denotes the response map of the position filter for the l-th layer feature map, k̂^{xz,(l)} denotes the kernel cross-correlation of the feature maps x̂_{t-1}^(l) and ẑ_t^(l), α̂_{t-1}^(l) is the position filter trained and updated up to the previous frame, F⁻¹ denotes the inverse discrete Fourier transform, ˆ denotes the discrete Fourier transform, and ⊙ denotes element-wise multiplication of two matrices;
(9.4) starting from the response map f_{t,trans}^(3), estimate the target position from coarse to fine; the target position p_t^(l) of the l-th layer is the position of the maximum of the response map f_{t,trans}^(l);
(9.5) from the t-th frame image, extract the candidate region z_{t,scale} for scale estimation at position p_t with scale s_{t-1}, and construct a scale pyramid;
(9.6) extract the histogram-of-oriented-gradients features of the candidate region and compute the scale-filter response confidence map f_{t,scale};
(9.7) the target scale s_t detected in the t-th frame is the scale corresponding to the maximum of the response map f_{t,scale}.
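Steps (9.5)-(9.7) can be sketched as follows. The pyramid size (33 levels) and scale step (1.02) are hypothetical values in the style of scale-filter trackers; the patent does not fix them. The scale-filter response below is mocked, since computing a real one requires the trained filter.

```python
import numpy as np

def scale_pyramid_factors(n=33, step=1.02):
    # geometric scale factors centred on 1.0 (hypothetical n and step)
    return step ** (np.arange(n) - n // 2)

factors = scale_pyramid_factors()

# candidate window sizes around a 64x48 target at the current scale
s_prev = 1.0
sizes = [(int(round(64 * s_prev * a)), int(round(48 * s_prev * a))) for a in factors]

# mock scale-filter response peaked at pyramid level 20
f_scale = np.exp(-0.5 * ((np.arange(33) - 20) / 3.0) ** 2)

# step (9.7): the new scale corresponds to the maximum of the response
s_t = s_prev * factors[f_scale.argmax()]
```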
Further, the updating of the kernel correlation filter with the adaptive model update strategy in step 10 is specifically as follows:
(10.1) after the estimation of the target position and scale is completed, compute two confidence measures of the tracking result. One is the peak of the correlation response map:

f_max = max f

where f is the response map output by correlating the kernel correlation filter with the candidate region, and f_max is the peak of that map;
the other is the average peak-to-correlation energy (APCE):

APCE = |f_max − f_min|² / mean(Σ_{i,j} (f_{i,j} − f_min)²)

where f_max and f_min are respectively the maximum and the minimum of the response map f, mean(·) denotes the averaging function, and f_{i,j} denotes the value in the i-th row and j-th column of f;
(10.2) if f_max and APCE are both greater than their historical averages, update the model; otherwise do not update. For each deep convolution layer, the update uses linear interpolation with the following formulas:

x̂_t^(l) = (1 − η) x̂_{t-1}^(l) + η x̂_t^(l)
α̂_t^(l) = (1 − η) α̂_{t-1}^(l) + η α̂_t^(l)

where x̂_{t-1}^(l) and α̂_{t-1}^(l) respectively denote the feature map and the correlation filter of the previous frame for the l-th layer, η is the learning rate (the larger η is, the faster the model updates), and η takes the value 0.02.
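Steps (10.1)-(10.2) can be sketched directly. The numpy rendering below is illustrative: the response maps are made up, and the historical averages are passed in as plain lists.

```python
import numpy as np

def apce(f):
    """Average peak-to-correlation energy of a response map f."""
    fmax, fmin = f.max(), f.min()
    return (fmax - fmin) ** 2 / np.mean((f - fmin) ** 2)

def confident_update(model, sample, f, fmax_hist, apce_hist, eta=0.02):
    """Linear-interpolation update, gated on BOTH confidence criteria (10.2)."""
    if f.max() > np.mean(fmax_hist) and apce(f) > np.mean(apce_hist):
        return (1 - eta) * model + eta * sample, True
    return model, False

sharp = np.zeros((10, 10)); sharp[4, 6] = 1.0                  # clean single peak
noisy = np.abs(np.sin(np.arange(100))).reshape(10, 10) * 0.9   # oscillating map

model, sample = np.zeros((4, 4)), np.ones((4, 4))
m1, updated1 = confident_update(model, sample, sharp, [0.5], [5.0])
m2, updated2 = confident_update(model, sample, noisy, [0.5], [5.0])
```

A sharp single-peak response yields a high APCE and triggers the update; the oscillating map has a similar peak height but a much lower APCE, so the model is left untouched, which is exactly the drift protection the patent describes.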
Compared with the prior art, the invention has the following remarkable advantages: (1) the method uses deep convolution features, which are learned from a large amount of image data of thousands of categories and have strong target representation capability, so the algorithm remains robust when the target's appearance changes and under external factors such as illumination change; (2) an adaptive scale estimation method is adopted: analogous to position estimation, an independent scale filter is trained, and the fast Fourier transform is used skillfully, so scale estimation is both efficient and accurate, and the method can be combined into any discriminative tracking framework; (3) an adaptive high-confidence model update strategy is adopted: when an error occurs in the tracking stage, the detection confidence is low and the model is not updated, which effectively reduces the risk of tracker drift.
Drawings
FIG. 1 is a flowchart of a scale-adaptive target tracking method based on a depth feature kernel correlation filter according to the present invention.
Fig. 2 is a schematic diagram of extracting deep convolution features.
Fig. 3 is a schematic diagram of target position and scale estimation, wherein (a) is a schematic diagram of target position estimation from coarse to fine, and (b) is a schematic diagram of adaptive target scale estimation.
FIG. 4 is a diagram of adaptive high confidence model update.
FIG. 5 shows the results of evaluating the present invention on standard visual tracking datasets, where (a) is the precision plot on the OTB50 dataset, (b) is the success plot on the OTB50 dataset, (c) is the precision plot on the OTB100 dataset, and (d) is the success plot on the OTB100 dataset.
Fig. 6 is a graph of the actual video target tracking result of the present invention, where (a) is a graph of the result of Human testing video on the OTB100 data set, (b) is a graph of the result of Walking testing video on the OTB100 data set, (c) is a graph of the result of Tiger testing video on the OTB50 data set, and (d) is a graph of the result of Dog testing video on the OTB50 data set.
Detailed Description
For a better understanding and appreciation of the structural features and advantages achieved by the present invention, reference should be made to the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings, in which:
with reference to fig. 1, the method for tracking a scale-adaptive target based on a depth feature kernel correlation filter of the present invention includes the following steps:
step 1, inputting the initial position p_0 and scale s_0 of the target, and setting the window size to 2.0 times the target's initial bounding box;
step 2, according to the target position p_{t-1} of the (t-1)-th frame, obtaining a target region x_{t-1} whose size is the window size;
step 3, extracting the deep convolution features of the target region x_{t-1} and performing a fast Fourier transform to obtain the feature map x̂_{t-1}, where ˆ denotes the discrete Fourier transform;
step 4, computing the kernel autocorrelation k̂^{xx} from the feature map x̂_{t-1};
step 5, training the position and scale correlation filters;
step 6, according to the target position p_{t-1} of the (t-1)-th frame, obtaining the candidate region z_t of the target in the t-th frame, whose size is the window size;
step 7, extracting the deep convolution features of the candidate region z_t and performing a fast Fourier transform to obtain the feature map ẑ_t;
step 8, computing the kernel cross-correlation k̂^{xz} from the feature map x̂_{t-1} of the previous frame;
step 9, detecting the positions of the maxima of the position-filter and scale-filter response maps to determine the position p_t and scale s_t of the target in the current frame;
and step 10, updating the kernel correlation filter with the adaptive model update strategy.
As a specific example, the method for extracting the deep convolution features in steps 3 and 7 specifically includes the following steps:
(3.1) preprocessing: scale the window region I to the 224 × 224 input size specified by the convolutional neural network;
(3.2) feature extraction: extract the feature maps of the 3rd, 4th and 5th convolutional layers of the convolutional neural network;
(3.3) bilinear interpolation: upsample the three extracted convolutional feature maps to the same size.
As a specific example, the computation of the kernel autocorrelation k̂^{xx} in step 4 and of the kernel cross-correlation k̂^{xz} in step 8 is as follows:
(4.1) a Gaussian kernel is used, with the formula:

k(x, x′) = exp(−‖x − x′‖₂² / σ²)

where k(x, x′) denotes the Gaussian kernel computed from the two feature maps x and x′, exp(·) denotes the exponential function, σ is the standard deviation of the Gaussian function with value 0.5, and ‖·‖₂ denotes the 2-norm of a vector or matrix;
(4.2) the kernel correlation is computed with the formula:

k^{xx′} = exp(−(‖x‖₂² + ‖x′‖₂² − 2 F⁻¹(x̂* ⊙ x̂′)) / σ²)

where k^{xx′} denotes the kernel correlation of the feature maps x and x′, F⁻¹ denotes the inverse discrete Fourier transform, * denotes the complex conjugate, ˆ denotes the discrete Fourier transform, and ⊙ denotes element-wise multiplication of the corresponding entries of two matrices.
As a specific example, the training of the position and scale correlation filters in step 5 is as follows:
One kernel correlation filter is trained for each layer of the feature map from the deep convolution features extracted in step 3, and the model is trained with the following formula:

α̂^(l) = ŷ / (k̂^{xx} + λ)

where α̂^(l) denotes the correlation filter model obtained from the l-th layer deep convolution feature map x̂^(l), ŷ denotes the discrete Fourier transform of the Gaussian regression target y, λ is a regularization parameter that prevents overfitting of the trained model, and λ takes the value 0.001.
As a specific example, in step 9 the positions of the maxima of the position-filter and scale-filter response maps are detected to determine the position p_t and scale s_t of the target in the current frame, as follows:
(9.1) from the t-th frame image, select the candidate region z_{t,trans} at position p_{t-1} with the window size;
(9.2) extract the 3-layer deep convolution feature maps ẑ_{t,trans}^(l) of the candidate region z_{t,trans};
(9.3) for the l-th layer feature map, compute the position-filter response confidence map f_{t,trans}^(l):

f_{t,trans}^(l) = F⁻¹(k̂^{xz,(l)} ⊙ α̂_{t-1}^(l))

where f_{t,trans}^(l) denotes the response map of the position filter for the l-th layer feature map, k̂^{xz,(l)} denotes the kernel cross-correlation of the feature maps x̂_{t-1}^(l) and ẑ_t^(l), α̂_{t-1}^(l) is the position filter trained and updated up to the previous frame, F⁻¹ denotes the inverse discrete Fourier transform, ˆ denotes the discrete Fourier transform, and ⊙ denotes element-wise multiplication of two matrices;
(9.4) starting from the response map f_{t,trans}^(3), estimate the target position from coarse to fine; the target position p_t^(l) of the l-th layer is the position of the maximum of the response map f_{t,trans}^(l);
(9.5) from the t-th frame image, extract the candidate region z_{t,scale} for scale estimation at position p_t with scale s_{t-1}, and construct a scale pyramid;
(9.6) extract the histogram-of-oriented-gradients features of the candidate region and compute the scale-filter response confidence map f_{t,scale};
(9.7) the target scale s_t detected in the t-th frame is the scale corresponding to the maximum of the response map f_{t,scale}.
As a specific example, the updating of the kernel correlation filter with the adaptive model update strategy in step 10 is as follows:
(10.1) after the estimation of the target position and scale is completed, compute two confidence measures of the tracking result. One is the peak of the correlation response map:

f_max = max f

where f is the response map output by correlating the kernel correlation filter with the candidate region, and f_max is the peak of that map;
the other is the average peak-to-correlation energy (APCE):

APCE = |f_max − f_min|² / mean(Σ_{i,j} (f_{i,j} − f_min)²)

where f_max and f_min are respectively the maximum and the minimum of the response map f, mean(·) denotes the averaging function, and f_{i,j} denotes the value in the i-th row and j-th column of f;
(10.2) if f_max and APCE are both greater than their historical averages, update the model; otherwise do not update. For each deep convolution layer, the update uses linear interpolation with the following formulas:

x̂_t^(l) = (1 − η) x̂_{t-1}^(l) + η x̂_t^(l)
α̂_t^(l) = (1 − η) α̂_{t-1}^(l) + η α̂_t^(l)

where x̂_{t-1}^(l) and α̂_{t-1}^(l) respectively denote the feature map and the correlation filter of the previous frame for the l-th layer, η is the learning rate (the larger η is, the faster the model updates), and η takes the value 0.02.
The invention overcomes tracking failures caused by appearance changes such as target deformation, illumination change, target rotation, scale change, and target occlusion. By means of the strong target representation capability of deep convolution features, the robustness of target tracking in complex scenes and under appearance change is improved; in addition, the method handles target scale changes efficiently and accurately; finally, because an adaptive high-confidence model update strategy is adopted, model tracking drift caused by erroneous detection is reduced.
The present invention will be described in further detail with reference to specific examples.
Example 1
The scale-adaptive target tracking method based on a deep-feature kernel correlation filter of the present invention mainly comprises four steps: first, extracting deep convolution features; second, training the kernel correlation filter; third, estimating the position and scale of the target in the current frame; and fourth, applying the adaptive high-confidence model update strategy.
step 1, inputting the initial position p_0 and scale s_0 of the target, and setting the window size to 2.0 times the target's initial bounding box;
step 2, according to the target position p_{t-1} of the (t-1)-th frame, obtaining a target region x_{t-1} whose size is the window size;
step 3, extracting the deep convolution features of the target region x_{t-1} and performing a fast Fourier transform to obtain the feature map x̂_{t-1}, where ˆ denotes the discrete Fourier transform;
step 4, computing the kernel autocorrelation k̂^{xx} from the feature map x̂_{t-1};
step 5, training the position and scale correlation filters;
step 6, according to the target position p_{t-1} of the (t-1)-th frame, obtaining the candidate region z_t of the target in the t-th frame, whose size is the window size;
step 7, extracting the deep convolution features of the candidate region z_t and performing a fast Fourier transform to obtain the feature map ẑ_t;
step 8, computing the kernel cross-correlation k̂^{xz} from the feature map x̂_{t-1} of the previous frame;
step 9, detecting the positions of the maxima of the position-filter and scale-filter response maps to determine the position p_t and scale s_t of the target in the current frame;
and step 10, updating the kernel correlation filter with the adaptive high-confidence model update strategy.
As shown in fig. 2, a schematic diagram of deep convolution feature extraction for the scale-adaptive target tracking method based on a deep-feature kernel correlation filter is given. The deep convolution features are learned from a large amount of image data of thousands of categories and have strong target representation capability, so the algorithm remains robust when the appearance of the target changes and under external factors such as illumination change. The specific steps are as follows:
(3.1) preprocessing: scale the window region I to the 224 × 224 input size specified by the convolutional neural network;
(3.2) feature extraction: extract the feature maps of the 3rd, 4th and 5th convolutional layers of the convolutional neural network;
(3.3) bilinear interpolation: upsample the three extracted convolutional feature maps to the same size.
As shown in fig. 3, a schematic diagram of target position and scale estimation is given, where fig. 3(a) illustrates coarse-to-fine target position estimation and fig. 3(b) illustrates adaptive target scale estimation. The method is characterized by coarse-to-fine hierarchical position estimation and adaptive target scale estimation. In the traditional kernel correlation filtering method the model size of the target is fixed, so changes of the target scale cannot be handled, which easily causes tracking failure. The invention therefore provides an adaptive scale estimation method: an independent scale filter is trained, and the scale is estimated as the one at which the correlation response of the scale filter is maximal. The method applies the fast Fourier transform skillfully, is simple and efficient, and can be integrated into traditional discriminative target tracking methods. The specific steps are as follows:
(9.1) from the t-th frame image, select the candidate region z_{t,trans} at position p_{t-1} with the window size;
(9.2) extract the 3-layer deep convolution feature maps ẑ_{t,trans}^(l) of the candidate region z_{t,trans};
(9.3) for the l-th layer feature map, compute the position-filter response confidence map f_{t,trans}^(l):

f_{t,trans}^(l) = F⁻¹(k̂^{xz,(l)} ⊙ α̂_{t-1}^(l))

where f_{t,trans}^(l) denotes the response map of the position filter for the l-th layer feature map, k̂^{xz,(l)} denotes the kernel cross-correlation of the feature maps x̂_{t-1}^(l) and ẑ_t^(l), α̂_{t-1}^(l) is the position filter trained and updated up to the previous frame, F⁻¹ denotes the inverse discrete Fourier transform, ˆ denotes the discrete Fourier transform, and ⊙ denotes element-wise multiplication of two matrices;
(9.4) starting from the response map f_{t,trans}^(3), estimate the target position from coarse to fine; the target position p_t^(l) of the l-th layer is the position of the maximum of the response map f_{t,trans}^(l);
(9.5) from the t-th frame image, extract the candidate region z_{t,scale} for scale estimation at position p_t with scale s_{t-1}, and construct a scale pyramid;
(9.6) extract the histogram-of-oriented-gradients features of the candidate region and compute the scale-filter response confidence map f_{t,scale};
(9.7) the target scale s_t detected in the t-th frame is the scale corresponding to the maximum of the response map f_{t,scale}.
As shown in fig. 4, a schematic diagram of the adaptive high-confidence model update is given. The traditional kernel correlation filter model is updated by simple linear interpolation without checking for tracking errors; when the target is detected incorrectly during tracking, updating the model pollutes the kernel correlation filter and the tracker drifts. The invention therefore proposes two detection confidence criteria: the model is updated only when the tracking result is judged to be reliable, and is left unchanged otherwise. The specific steps are as follows:
(10.1) after completing the estimation of the target position and scale, calculate two confidence measures of the tracking result. One is the peak value of the correlation output map:
f_{max} = \max(f)
where f is the correlation output map between the kernel correlation filter and the candidate region and f_{max} is the peak of the map. The other is the average peak-to-correlation energy (APCE):

APCE = \frac{|f_{max} - f_{min}|^2}{\mathrm{mean}\left(\sum_{i,j}(f_{i,j} - f_{min})^2\right)}

where f_{max} and f_{min} are the maximum and minimum of the output map f, mean(·) denotes the averaging function, and f_{i,j} is the value in row i, column j of the output map f.
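The two confidence measures can be sketched as follows (an illustrative NumPy version; function names are not from the patent, and mean(·) is taken over all entries of the map, following the usual APCE definition):

```python
import numpy as np

def peak(f):
    # First criterion: the peak value f_max of the correlation output map.
    return float(f.max())

def apce(f):
    # Second criterion, average peak-to-correlation energy:
    # APCE = |f_max - f_min|^2 / mean( (f_ij - f_min)^2 )
    f_min = f.min()
    return float((f.max() - f_min) ** 2 / np.mean((f - f_min) ** 2))
```

A sharp, unimodal response map yields a high APCE; occlusion or drift flattens the map and drives both measures down.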
(10.2) if f_{max} and APCE are both greater than their respective historical averages, the model is updated; otherwise it is not. Each depth convolution layer is updated by linear interpolation, with the following formulas:

\hat{\alpha}_t^{(l)} = (1 - \eta)\,\hat{\alpha}_{t-1}^{(l)} + \eta\,\hat{\alpha}_t^{(l)}

\hat{x}_t^{(l)} = (1 - \eta)\,\hat{x}_{t-1}^{(l)} + \eta\,\hat{x}_t^{(l)}

where \hat{x}_{t-1}^{(l)} and \hat{\alpha}_{t-1}^{(l)} denote the feature map and correlation filter of the previous frame at layer l, and η is the learning rate; the larger η, the faster the model updates. In the invention η takes the value 0.02.
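A minimal sketch of this update rule; how the history of past f_max and APCE values is stored is an assumption of the sketch, not specified in the text:

```python
import numpy as np

def should_update(f_max, apce_val, fmax_history, apce_history):
    # Update only when BOTH measures exceed their historical averages;
    # otherwise freeze the model so a bad detection cannot pollute it.
    return bool(f_max > np.mean(fmax_history)
                and apce_val > np.mean(apce_history))

def linear_update(old, new, eta=0.02):
    # Per-layer linear interpolation with learning rate eta = 0.02:
    # model_t = (1 - eta) * model_{t-1} + eta * model_t
    return (1.0 - eta) * old + eta * new
```

`linear_update` is applied independently to each layer's filter \hat{\alpha}^{(l)} and template \hat{x}^{(l)}.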
As shown in fig. 5, the evaluation results of the present invention on the standard visual tracking datasets OTB50 and OTB100, where (a) is the precision plot for the OTB50 dataset, (b) is the success-rate plot for the OTB50 dataset, (c) is the precision plot for the OTB100 dataset, and (d) is the success-rate plot for the OTB100 dataset. The OTB50 dataset contains 50 video sequences totalling 29000 frames, and the OTB100 dataset contains 100 video sequences totalling 58897 frames; each frame carries a ground-truth annotation of the target. There are two main evaluation indexes: precision and success rate. In the precision plots (a) and (c), precision is the percentage of frames, out of the total evaluated, in which the distance between the position detected by the algorithm and the ground-truth position is no more than 20 pixels; in the success-rate plots (b) and (d), success rate is the percentage of frames in which the overlap ratio between the bounding box detected by the algorithm and the ground-truth bounding box (the area of their intersection divided by the area of their union) exceeds 50%. The evaluation results show that, compared with classical target tracking algorithms, the present invention (fSCF2) performs well on the target tracking task.
As shown in fig. 6, a comparison of the target tracking results of the present invention and several recent state-of-the-art algorithms on real videos, where (a) shows results on the Human test video of the OTB100 dataset, (b) on the Walking test video of the OTB100 dataset, (c) on the Tiger test video of the OTB50 dataset, and (d) on the Dog test video of the OTB50 dataset. Overall, compared with classical target tracking algorithms, the fSCF2 tracking method achieves the best tracking effect: the deep convolution features give it strong target representation capability, and the adaptive scale estimation mechanism together with the high-confidence adaptive model update strategy enables it to track the target accurately under adverse conditions such as occlusion, scale change, target deformation, and fast target motion.
Claims (6)
1. A scale self-adaptive target tracking method based on a depth feature kernel correlation filter is characterized by comprising the following steps:
step 1, input the initial position p_0 and scale s_0 of the target, and set the window size to 2.0 times the target's initial bounding box;
step 2, according to the target position p_{t-1} of the (t-1)-th frame, obtain the target region x_{t-1}, whose size is the window size;
step 3, extract the deep convolution features of the target region x_{t-1} and perform a fast Fourier transform to obtain the feature spectrum \hat{x}_{t-1}, where ^ denotes the discrete Fourier transform;
step 4, compute the kernel autocorrelation \hat{k}^{xx} from the feature map \hat{x}_{t-1};
step 5, train the position and scale correlation filters;
step 6, according to the target position p_{t-1} of the (t-1)-th frame, obtain the candidate region z_t of the target in the t-th frame, whose size is the window size;
step 7, extract the deep convolution features of the candidate region z_t and perform a fast Fourier transform to obtain the feature map \hat{z}_t, where ^ denotes the discrete Fourier transform;
step 8, compute the kernel cross-correlation \hat{k}^{zx} from \hat{z}_t and the feature map \hat{x}_{t-1} of the previous frame;
step 9, detect the positions corresponding to the maximum values in the output maps of the position filter and the scale filter, respectively, to determine the position p_t and scale s_t of the target in the current frame;
step 10, update the kernel correlation filter with the adaptive model update strategy.
2. The method for tracking scale-adaptive targets based on the depth feature kernel correlation filter according to claim 1, wherein the method for extracting depth convolution features in steps 3 and 7 specifically comprises the following steps:
(3.1) preprocessing: scale the window region I to the input size 224 × 224 required by the convolutional neural network;
(3.2) feature extraction: extract the feature maps of the 3rd, 4th and 5th convolutional layers of the convolutional neural network;
(3.3) bilinear interpolation: upsample the extracted 3 layers of convolution features to the same size.
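Step (3.3) can be sketched with plain NumPy bilinear interpolation, as below. This is an illustrative sketch only: a real implementation would first pull the conv3-conv5 activations from a pretrained CNN (omitted here), and the function name is an assumption.

```python
import numpy as np

def bilinear_upsample(fmap, out_h, out_w):
    """Bilinearly resample an (H, W, C) feature map to (out_h, out_w, C)
    so the three convolution layers share a common spatial size."""
    h, w = fmap.shape[:2]
    rows = np.linspace(0.0, h - 1, out_h)
    cols = np.linspace(0.0, w - 1, out_w)
    r0 = np.floor(rows).astype(int); r1 = np.minimum(r0 + 1, h - 1)
    c0 = np.floor(cols).astype(int); c1 = np.minimum(c0 + 1, w - 1)
    wr = (rows - r0)[:, None, None]   # vertical interpolation weights
    wc = (cols - c0)[None, :, None]   # horizontal interpolation weights
    top = fmap[r0][:, c0] * (1 - wc) + fmap[r0][:, c1] * wc
    bot = fmap[r1][:, c0] * (1 - wc) + fmap[r1][:, c1] * wc
    return top * (1 - wr) + bot * wr
```

Upsampling the coarse conv5 map to the conv3 resolution is what allows the per-layer response maps to be compared position by position in step 9.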
3. The method for scale-adaptive target tracking based on the depth feature kernel correlation filter according to claim 1, wherein the computation of the kernel autocorrelation \hat{k}^{xx} in step 4 and of the kernel cross-correlation \hat{k}^{zx} in step 8 is specifically as follows:
(4.1) using a Gaussian kernel, the formula is as follows:
k(x, x') = \exp\left( -\frac{1}{\sigma^2} \| x - x' \|^2 \right)
wherein k(x, x') denotes the Gaussian kernel computed from the two feature maps x and x', exp(·) denotes the exponential function, σ is the standard deviation of the Gaussian function, with value 0.5, and ‖·‖ denotes the 2-norm of a vector or matrix;
(4.2) calculating the kernel correlation, the formula is as follows:
k^{xx'} = \exp\left( -\frac{1}{\sigma^2} \left( \|x\|^2 + \|x'\|^2 - 2\,\mathcal{F}^{-1}\left( \hat{x}^{*} \odot \hat{x}' \right) \right) \right)

wherein k^{xx'} denotes the kernel correlation of the feature maps x and x', exp(·) denotes the exponential function, σ is the standard deviation of the Gaussian function, with value 0.5, ‖·‖ denotes the 2-norm of a vector or matrix, \mathcal{F}^{-1} denotes the inverse discrete Fourier transform, * denotes the complex conjugate, ^ denotes the discrete Fourier transform, and ⊙ denotes element-wise multiplication of the corresponding entries of two matrices.
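The Fourier-domain evaluation of the Gaussian kernel correlation can be sketched as below (an illustrative NumPy version; the normalization by the element count follows common kernelized-correlation-filter implementations and is an assumption here, as is the clipping of tiny negative distances from floating-point error):

```python
import numpy as np

def gaussian_kernel_correlation(x, xp, sigma=0.5):
    """k^{xx'} for (H, W, C) feature maps x and x'.

    The cross term F^-1( x_hat* ⊙ xp_hat ), summed over channels,
    evaluates the kernel for all cyclic shifts at once."""
    x_hat = np.fft.fft2(x, axes=(0, 1))
    xp_hat = np.fft.fft2(xp, axes=(0, 1))
    cross = np.real(np.fft.ifft2(
        (np.conj(x_hat) * xp_hat).sum(axis=2), axes=(0, 1)))
    # squared distance for every cyclic shift; clip negatives from
    # floating-point error, normalize by the number of elements
    d = (x ** 2).sum() + (xp ** 2).sum() - 2.0 * cross
    return np.exp(-np.maximum(d, 0.0) / (sigma ** 2 * x.size))
```

For the autocorrelation (x' = x) the zero-shift entry has distance 0, so the kernel map peaks at 1 there.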
4. The method for tracking a scale-adaptive target based on a depth feature kernel correlation filter according to claim 1, wherein the training position and scale correlation filter of step 5 is specifically as follows:
according to the deep convolution features extracted in step 3, train one kernel correlation filter for each layer of the feature spectrum; the model is trained with the following formula:
\hat{\alpha}^{(l)} = \frac{\hat{y}}{\hat{k}^{xx^{(l)}} + \lambda}
wherein \hat{\alpha}^{(l)} denotes the correlation filter model obtained from the l-th layer deep convolution feature map \hat{x}^{(l)}, \hat{k}^{xx^{(l)}} is the kernel autocorrelation of that feature map, \hat{y} is the discrete Fourier transform of the desired output y, ^ denotes the discrete Fourier transform, and λ is a regularization parameter that prevents overfitting of the trained model, with value 0.001.
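Training the per-layer filter is a single Fourier-domain division, sketched below; the Gaussian-shaped label construction is an assumption based on standard kernelized-correlation-filter practice and is not spelled out in the claim:

```python
import numpy as np

def train_filter(k_hat, y_hat, lam=0.001):
    # alpha_hat^(l) = y_hat / (k_hat^{xx^(l)} + lambda): the closed-form
    # ridge-regression solution, one filter per convolution layer.
    return y_hat / (k_hat + lam)

def gaussian_label(h, w, sigma=2.0):
    # Desired output y: a Gaussian peak, rolled so the peak sits at
    # (0, 0) to match the cyclic-shift convention of the kernel map.
    ys = np.arange(h) - h // 2
    xs = np.arange(w) - w // 2
    g = np.exp(-(ys[:, None] ** 2 + xs[None, :] ** 2) / (2.0 * sigma ** 2))
    return np.roll(g, (-(h // 2), -(w // 2)), axis=(0, 1))
```

The regularization term λ keeps the division well conditioned where the kernel spectrum is close to zero.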
5. The method according to claim 1, wherein step 9 detects the positions corresponding to the maximum values in the output maps of the position filter and the scale filter to determine the position p_t and scale s_t of the target in the current frame, specifically as follows:
(9.1) from the t-th frame image, select the candidate region z_{t,trans} centred at position p_{t-1}, with the window size;
(9.2) extract the 3 layers of deep convolution feature maps \hat{z}_{t,trans}^{(l)} of the candidate region z_{t,trans};
(9.3) for the l-th layer feature map, calculate the position filter correlation output confidence map f_{t,trans}^{(l)}:

f_{t,trans}^{(l)} = \mathcal{F}^{-1}\left( \hat{k}^{zx^{(l)}} \odot \hat{\alpha}^{(l)} \right)

wherein f_{t,trans}^{(l)} denotes the correlation output map of the position filter for the l-th layer feature map, \hat{k}^{zx^{(l)}} is the kernel cross-correlation of the feature maps \hat{z}_{t}^{(l)} and \hat{x}_{t-1}^{(l)}, \hat{\alpha}^{(l)} is the position filter trained and updated from the previous frame, \mathcal{F}^{-1} denotes the inverse discrete Fourier transform, ^ denotes the discrete Fourier transform, and ⊙ denotes element-wise multiplication of the corresponding entries of two matrices;
(9.4) starting from the output map f_{t,trans}^{(3)}, estimate the target position from coarse to fine; the target position p_t^{(l)} at the l-th layer is the position corresponding to the maximum value of the output map f_{t,trans}^{(l)};
(9.5) from the t-th frame image, extract the scale-estimation candidate region z_{t,scale} centred at position p_t with scale s_{t-1}, and construct a scale pyramid;
(9.6) extract the histogram-of-oriented-gradients features of the candidate region and compute the scale filter correlation output confidence map f_{t,scale};
(9.7) the target scale s_t detected in the t-th frame is the scale corresponding to the maximum value of the output map f_{t,scale}.
6. The method for scale-adaptive target tracking based on the depth feature kernel correlation filter according to claim 1, wherein the step 10 updates the kernel correlation filter by using an adaptive model update strategy, specifically as follows:
(10.1) after completing the estimation of the target position and scale, calculate two confidence measures of the tracking result, one of which is the peak value of the correlation output map:
f_{max} = \max(f)
wherein f is the correlation output map between the kernel correlation filter and the candidate region, and f_{max} is the peak of the map;
the other is the average peak-to-correlation energy (APCE):
APCE = \frac{|f_{max} - f_{min}|^2}{\mathrm{mean}\left(\sum_{i,j}(f_{i,j} - f_{min})^2\right)}
wherein f_{max} and f_{min} are respectively the maximum and minimum of the output map f, mean(·) denotes the averaging function, and f_{i,j} denotes the value in row i and column j of the output map f;
(10.2) if f_{max} and APCE are both greater than their respective historical averages, update the model; otherwise do not update; each depth convolution layer is updated by linear interpolation, with the following formulas:
\hat{\alpha}_t^{(l)} = (1 - \eta)\,\hat{\alpha}_{t-1}^{(l)} + \eta\,\hat{\alpha}_t^{(l)}

\hat{x}_t^{(l)} = (1 - \eta)\,\hat{x}_{t-1}^{(l)} + \eta\,\hat{x}_t^{(l)}
wherein \hat{x}_{t-1}^{(l)} and \hat{\alpha}_{t-1}^{(l)} respectively denote the feature map and correlation filter of the previous frame at the l-th layer, and η is the learning rate; the larger η, the faster the model updates; η takes the value 0.02.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710355456.3A CN107154024A (en) | 2017-05-19 | 2017-05-19 | Dimension self-adaption method for tracking target based on depth characteristic core correlation filter |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710355456.3A CN107154024A (en) | 2017-05-19 | 2017-05-19 | Dimension self-adaption method for tracking target based on depth characteristic core correlation filter |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107154024A true CN107154024A (en) | 2017-09-12 |
Family
ID=59794201
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710355456.3A Pending CN107154024A (en) | 2017-05-19 | 2017-05-19 | Dimension self-adaption method for tracking target based on depth characteristic core correlation filter |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107154024A (en) |
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107644217A (en) * | 2017-09-29 | 2018-01-30 | 中国科学技术大学 | Method for tracking target based on convolutional neural networks and correlation filter |
CN107730536A (en) * | 2017-09-15 | 2018-02-23 | 北京飞搜科技有限公司 | A kind of high speed correlation filtering object tracking method based on depth characteristic |
CN108053424A (en) * | 2017-12-15 | 2018-05-18 | 深圳云天励飞技术有限公司 | Method for tracking target, device, electronic equipment and storage medium |
CN108182388A (en) * | 2017-12-14 | 2018-06-19 | 哈尔滨工业大学(威海) | A kind of motion target tracking method based on image |
CN108280808A (en) * | 2017-12-15 | 2018-07-13 | 西安电子科技大学 | The method for tracking target of correlation filter is exported based on structuring |
CN108346159A (en) * | 2018-01-28 | 2018-07-31 | 北京工业大学 | A kind of visual target tracking method based on tracking-study-detection |
CN108345885A (en) * | 2018-01-18 | 2018-07-31 | 浙江大华技术股份有限公司 | A kind of method and device of target occlusion detection |
CN108550126A (en) * | 2018-04-18 | 2018-09-18 | 长沙理工大学 | A kind of adaptive correlation filter method for tracking target and system |
CN108573499A (en) * | 2018-03-16 | 2018-09-25 | 东华大学 | A kind of visual target tracking method based on dimension self-adaption and occlusion detection |
CN108665481A (en) * | 2018-03-27 | 2018-10-16 | 西安电子科技大学 | Multilayer depth characteristic fusion it is adaptive resist block infrared object tracking method |
CN108734151A (en) * | 2018-06-14 | 2018-11-02 | 厦门大学 | Robust long-range method for tracking target based on correlation filtering and the twin network of depth |
CN108830878A (en) * | 2018-04-13 | 2018-11-16 | 上海大学 | A kind of method for tracking target based on FPN neural network |
CN108846345A (en) * | 2018-06-06 | 2018-11-20 | 安徽大学 | Moving object scale estimation method in monitoring scene |
CN109035300A (en) * | 2018-07-05 | 2018-12-18 | 桂林电子科技大学 | A kind of method for tracking target based on depth characteristic Yu average peak correlation energy |
CN109035290A (en) * | 2018-07-16 | 2018-12-18 | 南京信息工程大学 | A kind of track algorithm updating accretion learning based on high confidence level |
CN109064491A (en) * | 2018-04-12 | 2018-12-21 | 江苏省基础地理信息中心 | A kind of nuclear phase pass filter tracking method of adaptive piecemeal |
CN109146928A (en) * | 2017-12-29 | 2019-01-04 | 西安电子科技大学 | A kind of method for tracking target that Grads threshold judgment models update |
CN109146917A (en) * | 2017-12-29 | 2019-01-04 | 西安电子科技大学 | A kind of method for tracking target of elasticity more new strategy |
CN109166106A (en) * | 2018-08-02 | 2019-01-08 | 山东大学 | A kind of target detection aligning method and apparatus based on sliding window |
CN109255304A (en) * | 2018-08-17 | 2019-01-22 | 西安电子科技大学 | Method for tracking target based on distribution field feature |
CN109410246A (en) * | 2018-09-25 | 2019-03-01 | 深圳市中科视讯智能系统技术有限公司 | The method and device of vision tracking based on correlation filtering |
CN109410251A (en) * | 2018-11-19 | 2019-03-01 | 南京邮电大学 | Method for tracking target based on dense connection convolutional network |
CN109410247A (en) * | 2018-10-16 | 2019-03-01 | 中国石油大学(华东) | A kind of video tracking algorithm of multi-template and adaptive features select |
CN109461172A (en) * | 2018-10-25 | 2019-03-12 | 南京理工大学 | Manually with the united correlation filtering video adaptive tracking method of depth characteristic |
CN109584271A (en) * | 2018-11-15 | 2019-04-05 | 西北工业大学 | High speed correlation filtering tracking based on high confidence level more new strategy |
CN109741366A (en) * | 2018-11-27 | 2019-05-10 | 昆明理工大学 | A kind of correlation filtering method for tracking target merging multilayer convolution feature |
CN109785360A (en) * | 2018-12-18 | 2019-05-21 | 南京理工大学 | A kind of adaptive method for real time tracking excavated based on online sample |
CN109801311A (en) * | 2019-01-31 | 2019-05-24 | 长安大学 | A kind of visual target tracking method based on depth residual error network characterization |
CN109858455A (en) * | 2019-02-18 | 2019-06-07 | 南京航空航天大学 | A kind of piecemeal detection scale adaptive tracking method for circular target |
CN109859244A (en) * | 2019-01-22 | 2019-06-07 | 西安微电子技术研究所 | A kind of visual tracking method based on convolution sparseness filtering |
CN109858454A (en) * | 2019-02-15 | 2019-06-07 | 东北大学 | One kind being based on dual model self-adaptive kernel correlation filtering method for tracing |
CN109858326A (en) * | 2018-12-11 | 2019-06-07 | 中国科学院自动化研究所 | Based on classification semantic Weakly supervised online visual tracking method and system |
CN109886996A (en) * | 2019-01-15 | 2019-06-14 | 东华大学 | A kind of visual pursuit optimization method |
CN109934098A (en) * | 2019-01-24 | 2019-06-25 | 西北工业大学 | A kind of video camera intelligence system and its implementation with secret protection |
CN110033006A (en) * | 2019-04-04 | 2019-07-19 | 中设设计集团股份有限公司 | Vehicle detecting and tracking method based on color characteristic Nonlinear Dimension Reduction |
CN110197126A (en) * | 2019-05-06 | 2019-09-03 | 深圳岚锋创视网络科技有限公司 | A kind of target tracking method, device and portable terminal |
CN110211149A (en) * | 2018-12-25 | 2019-09-06 | 湖州云通科技有限公司 | A kind of dimension self-adaption nuclear phase pass filter tracking method based on context-aware |
CN110414439A (en) * | 2019-07-30 | 2019-11-05 | 武汉理工大学 | Anti- based on multi-peak detection blocks pedestrian tracting method |
CN110544267A (en) * | 2019-07-24 | 2019-12-06 | 中国地质大学(武汉) | correlation filtering tracking method for self-adaptive selection characteristics |
CN110555870A (en) * | 2019-09-09 | 2019-12-10 | 北京理工大学 | DCF tracking confidence evaluation and classifier updating method based on neural network |
CN110633595A (en) * | 2018-06-21 | 2019-12-31 | 北京京东尚科信息技术有限公司 | Target detection method and device by utilizing bilinear interpolation |
CN110689559A (en) * | 2019-09-30 | 2020-01-14 | 长安大学 | Visual target tracking method based on dense convolutional network characteristics |
CN110807473A (en) * | 2019-10-12 | 2020-02-18 | 浙江大华技术股份有限公司 | Target detection method, device and computer storage medium |
CN110889863A (en) * | 2019-09-03 | 2020-03-17 | 河南理工大学 | Target tracking method based on target perception correlation filtering |
CN111161323A (en) * | 2019-12-31 | 2020-05-15 | 北京理工大学重庆创新中心 | Complex scene target tracking method and system based on correlation filtering |
CN111192288A (en) * | 2018-11-14 | 2020-05-22 | 天津大学青岛海洋技术研究院 | Target tracking algorithm based on deformation sample generation network |
CN111221770A (en) * | 2019-12-31 | 2020-06-02 | 中国船舶重工集团公司第七一七研究所 | Kernel correlation filtering target tracking method and system |
CN111428740A (en) * | 2020-02-28 | 2020-07-17 | 深圳壹账通智能科技有限公司 | Detection method and device for network-shot photo, computer equipment and storage medium |
CN111476819A (en) * | 2020-03-19 | 2020-07-31 | 重庆邮电大学 | Long-term target tracking method based on multi-correlation filtering model |
CN111507999A (en) * | 2019-01-30 | 2020-08-07 | 北京四维图新科技股份有限公司 | FDSST algorithm-based target tracking method and device |
CN111696132A (en) * | 2020-05-15 | 2020-09-22 | 深圳市优必选科技股份有限公司 | Target tracking method and device, computer readable storage medium and robot |
CN112053386A (en) * | 2020-08-31 | 2020-12-08 | 西安电子科技大学 | Target tracking method based on depth convolution characteristic self-adaptive integration |
CN112200833A (en) * | 2020-09-17 | 2021-01-08 | 天津城建大学 | Relevant filtering video tracking algorithm based on residual error network and short-term visual memory |
CN112560695A (en) * | 2020-12-17 | 2021-03-26 | 中国海洋大学 | Underwater target tracking method, system, storage medium, equipment, terminal and application |
CN113379804A (en) * | 2021-07-12 | 2021-09-10 | 闽南师范大学 | Unmanned aerial vehicle target tracking method, terminal equipment and storage medium |
CN114708300A (en) * | 2022-03-02 | 2022-07-05 | 北京理工大学 | Anti-blocking self-adaptive target tracking method and system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015163830A1 (en) * | 2014-04-22 | 2015-10-29 | Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi | Target localization and size estimation via multiple model learning in visual tracking |
CN106570486A (en) * | 2016-11-09 | 2017-04-19 | 华南理工大学 | Kernel correlation filtering target tracking method based on feature fusion and Bayesian classification |
- 2017-05-19: application CN201710355456.3A filed in China; published as CN107154024A; status: Pending
Non-Patent Citations (1)
Title |
---|
张雷: "复杂场景下实时目标跟踪算法及实现技术研究", 《中国博士学位论文全文数据库 信息科技辑》 * |
Cited By (92)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107730536A (en) * | 2017-09-15 | 2018-02-23 | 北京飞搜科技有限公司 | A kind of high speed correlation filtering object tracking method based on depth characteristic |
CN107730536B (en) * | 2017-09-15 | 2020-05-12 | 苏州飞搜科技有限公司 | High-speed correlation filtering object tracking method based on depth features |
CN107644217B (en) * | 2017-09-29 | 2020-06-26 | 中国科学技术大学 | Target tracking method based on convolutional neural network and related filter |
CN107644217A (en) * | 2017-09-29 | 2018-01-30 | 中国科学技术大学 | Method for tracking target based on convolutional neural networks and correlation filter |
CN108182388A (en) * | 2017-12-14 | 2018-06-19 | 哈尔滨工业大学(威海) | A kind of motion target tracking method based on image |
CN108053424A (en) * | 2017-12-15 | 2018-05-18 | 深圳云天励飞技术有限公司 | Method for tracking target, device, electronic equipment and storage medium |
CN108280808A (en) * | 2017-12-15 | 2018-07-13 | 西安电子科技大学 | The method for tracking target of correlation filter is exported based on structuring |
CN108053424B (en) * | 2017-12-15 | 2020-06-16 | 深圳云天励飞技术有限公司 | Target tracking method and device, electronic equipment and storage medium |
CN109146928A (en) * | 2017-12-29 | 2019-01-04 | 西安电子科技大学 | A kind of method for tracking target that Grads threshold judgment models update |
CN109146917A (en) * | 2017-12-29 | 2019-01-04 | 西安电子科技大学 | A kind of method for tracking target of elasticity more new strategy |
CN109146928B (en) * | 2017-12-29 | 2021-09-24 | 西安电子科技大学 | Target tracking method for updating gradient threshold judgment model |
CN109146917B (en) * | 2017-12-29 | 2020-07-28 | 西安电子科技大学 | Target tracking method for elastic updating strategy |
CN108345885A (en) * | 2018-01-18 | 2018-07-31 | 浙江大华技术股份有限公司 | A kind of method and device of target occlusion detection |
CN108346159A (en) * | 2018-01-28 | 2018-07-31 | 北京工业大学 | A kind of visual target tracking method based on tracking-study-detection |
CN108346159B (en) * | 2018-01-28 | 2021-10-15 | 北京工业大学 | Tracking-learning-detection-based visual target tracking method |
CN108573499A (en) * | 2018-03-16 | 2018-09-25 | 东华大学 | A kind of visual target tracking method based on dimension self-adaption and occlusion detection |
CN108573499B (en) * | 2018-03-16 | 2021-04-02 | 东华大学 | Visual target tracking method based on scale self-adaption and occlusion detection |
CN108665481A (en) * | 2018-03-27 | 2018-10-16 | 西安电子科技大学 | Multilayer depth characteristic fusion it is adaptive resist block infrared object tracking method |
CN108665481B (en) * | 2018-03-27 | 2022-05-31 | 西安电子科技大学 | Self-adaptive anti-blocking infrared target tracking method based on multi-layer depth feature fusion |
CN109064491A (en) * | 2018-04-12 | 2018-12-21 | 江苏省基础地理信息中心 | A kind of nuclear phase pass filter tracking method of adaptive piecemeal |
CN108830878A (en) * | 2018-04-13 | 2018-11-16 | 上海大学 | A kind of method for tracking target based on FPN neural network |
CN108830878B (en) * | 2018-04-13 | 2021-02-23 | 上海大学 | Target tracking method based on FPN neural network |
CN108550126A (en) * | 2018-04-18 | 2018-09-18 | 长沙理工大学 | A kind of adaptive correlation filter method for tracking target and system |
CN108846345B (en) * | 2018-06-06 | 2021-09-17 | 安徽大学 | Moving object scale estimation method in monitoring scene |
CN108846345A (en) * | 2018-06-06 | 2018-11-20 | 安徽大学 | Moving object scale estimation method in monitoring scene |
CN108734151B (en) * | 2018-06-14 | 2020-04-14 | 厦门大学 | Robust long-range target tracking method based on correlation filtering and depth twin network |
CN108734151A (en) * | 2018-06-14 | 2018-11-02 | 厦门大学 | Robust long-range method for tracking target based on correlation filtering and the twin network of depth |
CN110633595B (en) * | 2018-06-21 | 2022-12-02 | 北京京东尚科信息技术有限公司 | Target detection method and device by utilizing bilinear interpolation |
CN110633595A (en) * | 2018-06-21 | 2019-12-31 | 北京京东尚科信息技术有限公司 | Target detection method and device by utilizing bilinear interpolation |
CN109035300B (en) * | 2018-07-05 | 2021-03-26 | 桂林电子科技大学 | Target tracking method based on depth feature and average peak correlation energy |
CN109035300A (en) * | 2018-07-05 | 2018-12-18 | 桂林电子科技大学 | A kind of method for tracking target based on depth characteristic Yu average peak correlation energy |
CN109035290A (en) * | 2018-07-16 | 2018-12-18 | 南京信息工程大学 | A kind of track algorithm updating accretion learning based on high confidence level |
CN109166106A (en) * | 2018-08-02 | 2019-01-08 | 山东大学 | A kind of target detection aligning method and apparatus based on sliding window |
CN109255304A (en) * | 2018-08-17 | 2019-01-22 | 西安电子科技大学 | Method for tracking target based on distribution field feature |
CN109255304B (en) * | 2018-08-17 | 2021-07-27 | 西安电子科技大学 | Target tracking method based on distribution field characteristics |
CN109410246B (en) * | 2018-09-25 | 2021-06-11 | 杭州视语智能视觉系统技术有限公司 | Visual tracking method and device based on correlation filtering |
CN109410246A (en) * | 2018-09-25 | 2019-03-01 | 深圳市中科视讯智能系统技术有限公司 | The method and device of vision tracking based on correlation filtering |
CN109410247A (en) * | 2018-10-16 | 2019-03-01 | 中国石油大学(华东) | A kind of video tracking algorithm of multi-template and adaptive features select |
CN109461172A (en) * | 2018-10-25 | 2019-03-12 | 南京理工大学 | Manually with the united correlation filtering video adaptive tracking method of depth characteristic |
CN111192288B (en) * | 2018-11-14 | 2023-08-04 | 天津大学青岛海洋技术研究院 | Target tracking algorithm based on deformation sample generation network |
CN111192288A (en) * | 2018-11-14 | 2020-05-22 | 天津大学青岛海洋技术研究院 | Target tracking algorithm based on deformation sample generation network |
CN109584271A (en) * | 2018-11-15 | 2019-04-05 | 西北工业大学 | High speed correlation filtering tracking based on high confidence level more new strategy |
CN109410251A (en) * | 2018-11-19 | 2019-03-01 | 南京邮电大学 | Method for tracking target based on dense connection convolutional network |
CN109741366A (en) * | 2018-11-27 | 2019-05-10 | 昆明理工大学 | A kind of correlation filtering method for tracking target merging multilayer convolution feature |
CN109858326A (en) * | 2018-12-11 | 2019-06-07 | 中国科学院自动化研究所 | Based on classification semantic Weakly supervised online visual tracking method and system |
CN109785360A (en) * | 2018-12-18 | 2019-05-21 | 南京理工大学 | A kind of adaptive method for real time tracking excavated based on online sample |
CN110211149A (en) * | 2018-12-25 | 2019-09-06 | 湖州云通科技有限公司 | A kind of dimension self-adaption nuclear phase pass filter tracking method based on context-aware |
CN110211149B (en) * | 2018-12-25 | 2022-08-12 | 湖州云通科技有限公司 | Scale self-adaptive kernel correlation filtering tracking method based on background perception |
CN109886996B (en) * | 2019-01-15 | 2023-06-06 | 东华大学 | Visual tracking optimization method |
CN109886996A (en) * | 2019-01-15 | 2019-06-14 | 东华大学 | A kind of visual pursuit optimization method |
CN109859244A (en) * | 2019-01-22 | 2019-06-07 | 西安微电子技术研究所 | A kind of visual tracking method based on convolution sparseness filtering |
CN109859244B (en) * | 2019-01-22 | 2022-07-08 | 西安微电子技术研究所 | Visual tracking method based on convolution sparse filtering |
CN109934098A (en) * | 2019-01-24 | 2019-06-25 | 西北工业大学 | A kind of video camera intelligence system and its implementation with secret protection |
CN111507999B (en) * | 2019-01-30 | 2023-07-18 | 北京四维图新科技股份有限公司 | Target tracking method and device based on FDSST algorithm |
CN111507999A (en) * | 2019-01-30 | 2020-08-07 | 北京四维图新科技股份有限公司 | FDSST algorithm-based target tracking method and device |
CN109801311B (en) * | 2019-01-31 | 2021-07-16 | 长安大学 | Visual target tracking method based on depth residual error network characteristics |
CN109801311A (en) * | 2019-01-31 | 2019-05-24 | 长安大学 | A kind of visual target tracking method based on depth residual error network characterization |
CN109858454A (en) * | 2019-02-15 | 2019-06-07 | 东北大学 | One kind being based on dual model self-adaptive kernel correlation filtering method for tracing |
CN109858454B (en) * | 2019-02-15 | 2023-04-07 | 东北大学 | Adaptive kernel correlation filtering tracking method based on dual models |
CN109858455A (en) * | 2019-02-18 | 2019-06-07 | 南京航空航天大学 | Block-wise detection scale-adaptive tracking method for circular targets |
CN110033006A (en) * | 2019-04-04 | 2019-07-19 | 中设设计集团股份有限公司 | Vehicle detection and tracking method based on nonlinear dimensionality reduction of color features |
CN110197126A (en) * | 2019-05-06 | 2019-09-03 | 深圳岚锋创视网络科技有限公司 | Target tracking method, device and portable terminal |
CN110544267B (en) * | 2019-07-24 | 2022-03-15 | 中国地质大学(武汉) | Correlation filtering tracking method with adaptive feature selection |
CN110544267A (en) * | 2019-07-24 | 2019-12-06 | 中国地质大学(武汉) | Correlation filtering tracking method with adaptive feature selection |
CN110414439A (en) * | 2019-07-30 | 2019-11-05 | 武汉理工大学 | Anti-occlusion pedestrian tracking method based on multi-peak detection |
CN110414439B (en) * | 2019-07-30 | 2022-03-15 | 武汉理工大学 | Anti-occlusion pedestrian tracking method based on multi-peak detection |
CN110889863A (en) * | 2019-09-03 | 2020-03-17 | 河南理工大学 | Target tracking method based on target perception correlation filtering |
CN110889863B (en) * | 2019-09-03 | 2023-03-24 | 河南理工大学 | Target tracking method based on target perception correlation filtering |
CN110555870B (en) * | 2019-09-09 | 2021-07-27 | 北京理工大学 | DCF tracking confidence evaluation and classifier updating method based on neural network |
CN110555870A (en) * | 2019-09-09 | 2019-12-10 | 北京理工大学 | DCF tracking confidence evaluation and classifier updating method based on neural network |
CN110689559A (en) * | 2019-09-30 | 2020-01-14 | 长安大学 | Visual target tracking method based on dense convolutional network characteristics |
CN110689559B (en) * | 2019-09-30 | 2022-08-12 | 长安大学 | Visual target tracking method based on dense convolutional network characteristics |
CN110807473A (en) * | 2019-10-12 | 2020-02-18 | 浙江大华技术股份有限公司 | Target detection method, device and computer storage medium |
CN110807473B (en) * | 2019-10-12 | 2023-01-03 | 浙江大华技术股份有限公司 | Target detection method, device and computer storage medium |
CN111161323B (en) * | 2019-12-31 | 2023-11-28 | 北京理工大学重庆创新中心 | Complex scene target tracking method and system based on correlation filtering |
CN111221770A (en) * | 2019-12-31 | 2020-06-02 | 中国船舶重工集团公司第七一七研究所 | Kernel correlation filtering target tracking method and system |
CN111161323A (en) * | 2019-12-31 | 2020-05-15 | 北京理工大学重庆创新中心 | Complex scene target tracking method and system based on correlation filtering |
WO2021169625A1 (en) * | 2020-02-28 | 2021-09-02 | 深圳壹账通智能科技有限公司 | Method and apparatus for detecting reproduced network photograph, computer device, and storage medium |
CN111428740A (en) * | 2020-02-28 | 2020-07-17 | 深圳壹账通智能科技有限公司 | Detection method and device for network-shot photo, computer equipment and storage medium |
CN111476819A (en) * | 2020-03-19 | 2020-07-31 | 重庆邮电大学 | Long-term target tracking method based on multi-correlation filtering model |
WO2021227519A1 (en) * | 2020-05-15 | 2021-11-18 | 深圳市优必选科技股份有限公司 | Target tracking method and apparatus, and computer-readable storage medium and robot |
CN111696132B (en) * | 2020-05-15 | 2023-12-29 | 深圳市优必选科技股份有限公司 | Target tracking method, device, computer readable storage medium and robot |
CN111696132A (en) * | 2020-05-15 | 2020-09-22 | 深圳市优必选科技股份有限公司 | Target tracking method and device, computer readable storage medium and robot |
CN112053386B (en) * | 2020-08-31 | 2023-04-18 | 西安电子科技大学 | Target tracking method based on depth convolution characteristic self-adaptive integration |
CN112053386A (en) * | 2020-08-31 | 2020-12-08 | 西安电子科技大学 | Target tracking method based on depth convolution characteristic self-adaptive integration |
CN112200833A (en) * | 2020-09-17 | 2021-01-08 | 天津城建大学 | Relevant filtering video tracking algorithm based on residual error network and short-term visual memory |
CN112560695B (en) * | 2020-12-17 | 2023-03-24 | 中国海洋大学 | Underwater target tracking method, system, storage medium, equipment, terminal and application |
CN112560695A (en) * | 2020-12-17 | 2021-03-26 | 中国海洋大学 | Underwater target tracking method, system, storage medium, equipment, terminal and application |
CN113379804A (en) * | 2021-07-12 | 2021-09-10 | 闽南师范大学 | Unmanned aerial vehicle target tracking method, terminal equipment and storage medium |
CN113379804B (en) * | 2021-07-12 | 2023-05-09 | 闽南师范大学 | Unmanned aerial vehicle target tracking method, terminal equipment and storage medium |
CN114708300A (en) * | 2022-03-02 | 2022-07-05 | 北京理工大学 | Anti-occlusion adaptive target tracking method and system |
CN114708300B (en) * | 2022-03-02 | 2024-07-23 | 北京理工大学 | Anti-occlusion adaptive target tracking method and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107154024A (en) | Dimension self-adaption method for tracking target based on depth characteristic core correlation filter | |
CN108665481B (en) | Self-adaptive anti-blocking infrared target tracking method based on multi-layer depth feature fusion | |
CN111354017B (en) | Target tracking method based on twin neural network and parallel attention module | |
WO2022036777A1 (en) | Method and device for intelligent estimation of human body movement posture based on convolutional neural network | |
CN107316316A (en) | Target tracking method based on adaptive fusion of multiple features with kernel correlation filtering | |
CN112184752A (en) | Video target tracking method based on pyramid convolution | |
CN109741366B (en) | Related filtering target tracking method fusing multilayer convolution characteristics | |
CN111462191B (en) | Non-local filter unsupervised optical flow estimation method based on deep learning | |
CN110175649B (en) | Rapid multi-scale estimation target tracking method for re-detection | |
CN107369166A (en) | Target tracking method and system based on multi-resolution neural network | |
CN112364931B (en) | Few-sample target detection method and network system based on meta-feature and weight adjustment | |
CN108053419A (en) | Multi-scale target tracking based on background suppression and foreground anti-interference | |
CN111080675A (en) | Target tracking method based on space-time constraint correlation filtering | |
CN109461172A (en) | Adaptive correlation filtering video tracking method combining handcrafted and deep features | |
CN108427921A (en) | Face recognition method based on convolutional neural networks | |
CN104574445A (en) | Target tracking method and device | |
CN110120065B (en) | Target tracking method and system based on hierarchical convolution characteristics and scale self-adaptive kernel correlation filtering | |
CN114565655B (en) | Depth estimation method and device based on pyramid segmentation attention | |
CN106338733A (en) | Forward-looking sonar object tracking method based on frog-eye visual characteristic | |
CN107452022A (en) | Video target tracking method | |
CN112949493A (en) | Lane line detection method and system combining semantic segmentation and attention mechanism | |
CN116524062B (en) | Diffusion model-based 2D human body posture estimation method | |
CN110245587B (en) | Optical remote sensing image target detection method based on Bayesian transfer learning | |
CN107944354A (en) | Vehicle detection method based on deep learning | |
CN110660080A (en) | Multi-scale target tracking method based on learning rate adjustment and fusion of multilayer convolution features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20170912 |