CN107316316A - Target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering - Google Patents
- Publication number
- CN107316316A (application CN201710355503.4A)
- Authority
- CN
- China
- Prior art keywords
- target
- correlation
- response map
- feature map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/262—Analysis of motion using transform domain methods, e.g. Fourier domain methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/269—Analysis of motion using gradient-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20004—Adaptive image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20056—Discrete and fast Fourier transform, [DFT, FFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering. The steps of the method are: from the target position and scale tracked in the previous frame, obtain the candidate region of the target's motion; extract the histogram of oriented gradients (HOG) and color features of the candidate region, fuse the two features, apply the fast Fourier transform (FFT) to obtain the feature map, then compute the kernel cross-correlation; determine the target's position and scale in the current frame to obtain the target region; extract the HOG and color features of the target region, fuse the two features, apply the FFT to obtain the feature map, then compute the kernel auto-correlation; design the adaptive target response and train the position-filter and scale-filter models; update the feature map and the correlation filters by linear interpolation. The invention strengthens the discriminative power of the model, improves the robustness of tracking under complex scenes and appearance changes, reduces computational complexity, and improves the real-time performance of tracking.
Description
Technical field
The present invention relates to the field of computer vision, and in particular to a target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering.
Background art
Target tracking is an important research topic in computer vision: given the target's position in the first frame or first few frames of a video, it estimates the target's trajectory in the subsequent frames. Current target tracking techniques fall into two major classes:
(1) Generative methods: these describe the appearance of the target with a generative model and search the subsequent frames for the candidate most similar to that appearance, that is, they minimize the reconstruction error over candidate targets. Representative algorithms include sparse coding, online density estimation, and principal component analysis (PCA). Generative methods focus on modeling the target's appearance and ignore background information, so they drift easily and lose the target when its appearance changes sharply or it is occluded.
(2) Discriminative methods: these train a binary classifier with online machine learning techniques and use it to detect the target in the subsequent frames, thereby completing the tracking. In recent years many machine learning algorithms have been applied within this framework, notably multiple-instance learning, boosting, and structured SVMs. Because they explicitly separate foreground from background, discriminative methods have strong discriminative power and more robust performance, and have gradually come to dominate the target tracking field. Notably, most current deep-learning trackers also belong to the discriminative framework.
Traditional discriminative methods, however, have an important defect: strengthening their discriminative power generally requires a large number of training samples, which increases the computational burden, so these methods struggle to track in real time.
Summary of the invention
The present invention aims to provide a target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering that has strong discriminative power and high tracking robustness under complex scenes and appearance changes, processing in the frequency domain to reduce computational complexity and improve the real-time performance of target tracking.
The technical solution that realizes the object of the invention is a target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering, comprising the following steps:
Step 1, input frame t; if t = 1, go to step 6, otherwise go to the next step;
Step 2, from the target position p_{t−1} and scale s_{t−1} tracked in frame t−1, obtain the candidate region z_t of the target's motion;
Step 3, extract the histogram of oriented gradients (HOG) and color features of the candidate region z_t, fuse the two features, then apply the fast Fourier transform (FFT) to obtain the feature map ẑ_t, where ^ denotes the discrete Fourier transform (DFT);
Step 4, from the target's feature map x̂_{t−1} of the previous frame, compute the kernel cross-correlation k̂^{xz};
Step 5, locate the maxima of the output response maps of the position filter and the scale filter, respectively, to determine the target's position p_t and scale s_t in the current frame;
Step 6, from the target position p_t and scale s_t of frame t, obtain the target region x_t;
Step 7, extract the HOG and color features of the target region x_t, fuse the two features, then apply the FFT to obtain the feature map x̂_t;
Step 8, from the feature map x̂_t, compute the kernel auto-correlation k̂^{xx};
Step 9, design the adaptive target response map ŷ and train the position-filter and scale-filter models;
Step 10, if t = 1, go to step 12, otherwise go to step 11;
Step 11, update the feature map x̂ and the correlation filter α̂ by linear interpolation, then go to step 12;
Step 12, output the tracking result, set t = t + 1, and return to step 1 to track the next frame.
Further, the fusion of the two features in steps 3 and 7 is as follows:
(3.1) From an image region I of size 4M×4N, extract the histogram of oriented gradients over 9 orientations with a cell size of 4×4; after principal component analysis (PCA) dimensionality reduction, this yields a 31-dimensional feature map of size M×N;
(3.2) Scale the image region I of size 4M×4N down to M×N and extract an 11-dimensional color feature;
(3.3) Fuse the features extracted in (3.1) and (3.2) to obtain a 42-dimensional feature map of size M×N.
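The fusion in (3.1)–(3.3) amounts to concatenating the two per-pixel feature maps along the channel axis and transforming each channel with the DFT. A minimal NumPy sketch, assuming the 31-channel HOG map and the 11-channel color map have already been extracted (the `fuse_and_transform` helper and the random stand-in inputs are illustrative, not from the patent):

```python
import numpy as np

def fuse_and_transform(hog_map, color_map):
    """Fuse a 31-channel HOG map with an 11-channel color map by
    concatenating along the channel axis, then take the per-channel
    2-D DFT to obtain the feature map used by the correlation filter.

    hog_map:   (M, N, 31) array -- HOG after PCA reduction, step (3.1)
    color_map: (M, N, 11) array -- color features, step (3.2)
    returns:   (M, N, 42) complex array -- the DFT'd fused feature map
    """
    fused = np.concatenate([hog_map, color_map], axis=2)  # (M, N, 42)
    return np.fft.fft2(fused, axes=(0, 1))                # DFT per channel

# Random stand-ins for a 4M x 4N = 64 x 64 image region (M = N = 16)
hog = np.random.rand(16, 16, 31)
cn = np.random.rand(16, 16, 11)
x_hat = fuse_and_transform(hog, cn)
print(x_hat.shape)  # (16, 16, 42)
```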
Further, the kernel cross-correlation of step 4 and the kernel auto-correlation of step 8 are computed as follows:
(4.1) A Gaussian kernel is used:
k(x, x′) = exp(−‖x − x′‖² / σ²)
where k(x, x′) is the Gaussian kernel evaluated on two feature maps x and x′, exp(·) is the exponential function, σ is the standard deviation of the Gaussian, taken as 0.5, and ‖·‖ is the 2-norm of a vector or matrix;
(4.2) The kernel correlation is computed as:
k^{xx′} = exp(−(1/σ²)(‖x‖² + ‖x′‖² − 2F⁻¹(Σ_c x̂_c* ⊙ x̂′_c)))
where k^{xx′} is the kernel correlation of the feature maps x and x′, c indexes the feature channels, F⁻¹(·) is the inverse DFT, * denotes the complex conjugate, ^ denotes the DFT, and ⊙ denotes element-wise multiplication of two matrices.
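The kernel correlation in (4.2) can be evaluated for all cyclic shifts at once in the Fourier domain. A NumPy sketch; dividing the squared distance by the number of elements follows common kernelized-correlation-filter practice and is an assumption here, as is the helper name:

```python
import numpy as np

def gaussian_kernel_correlation(x, xp, sigma=0.5):
    """Gaussian kernel correlation k^{xx'} of two multi-channel feature
    maps. The cross term F^{-1}(sum_c conj(x_hat_c) * xp_hat_c) gives
    the inner product of x with every cyclic shift of x' in one pass.

    x, xp: (M, N, C) real feature maps
    returns: (M, N) map of kernel values, one per cyclic shift
    """
    x_hat = np.fft.fft2(x, axes=(0, 1))
    xp_hat = np.fft.fft2(xp, axes=(0, 1))
    cross = np.real(np.fft.ifft2(np.sum(np.conj(x_hat) * xp_hat, axis=2)))
    dist = np.sum(x ** 2) + np.sum(xp ** 2) - 2.0 * cross
    # normalize by element count (common KCF practice, an assumption)
    return np.exp(-np.maximum(dist, 0.0) / (sigma ** 2 * x.size))

x = np.random.rand(8, 8, 3)
k = gaussian_kernel_correlation(x, x)  # kernel auto-correlation, step 8
print(k.shape)  # (8, 8)
```

At zero shift the auto-correlation distance vanishes, so k[0, 0] = 1 and every other entry is at most 1.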
Further, locating the maxima of the output response maps of the position filter and the scale filter in step 5 to determine the target's position p_t and scale s_t in the current frame proceeds as follows:
(5.1) In frame t, extract the candidate region z_{t,trans} for position estimation at position p_{t−1} and scale s_{t−1};
(5.2) Extract the feature map ẑ_{t,trans} of the candidate region z_{t,trans};
(5.3) Compute the position filter's output response map f_{t,trans} with the formula
f_{t,trans} = F⁻¹(k̂^{xz} ⊙ α̂)
where f_t is the output response map of the position filter, k̂^{xz} is the kernel cross-correlation of the feature maps x̂_{t−1} and ẑ_{t,trans}, α̂ is the position filter trained and updated in the previous frame, F⁻¹(·) is the inverse DFT, ^ denotes the DFT, and ⊙ denotes element-wise multiplication of two matrices;
(5.4) The target position p_t detected in frame t is the position of the maximum of the response map f_{t,trans};
(5.5) In frame t, extract the candidate region z_{t,scale} for scale estimation at position p_t and scale s_{t−1}, and build a scale pyramid;
(5.6) Compute the scale filter's output response map f_{t,scale};
(5.7) The target scale s_t detected in frame t is the scale corresponding to the maximum of the response map f_{t,scale}.
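Steps (5.3)–(5.4) reduce to one inverse FFT and an argmax over the response map. A sketch; the toy filter below is illustrative only:

```python
import numpy as np

def detect_position(k_xz_hat, alpha_hat):
    """Steps (5.3)-(5.4): response map f = F^{-1}(k_hat^{xz} . alpha_hat)
    and the (row, col) shift at which it peaks.

    k_xz_hat:  (M, N) DFT of the kernel cross-correlation between the
               model feature map and the candidate-region feature map
    alpha_hat: (M, N) DFT of the trained position-filter coefficients
    """
    response = np.real(np.fft.ifft2(k_xz_hat * alpha_hat))
    dy, dx = np.unravel_index(np.argmax(response), response.shape)
    return response, (int(dy), int(dx))

# Toy check: with an all-ones filter spectrum the response equals the
# kernel correlation itself, so the peak lands on its maximum.
k = np.zeros((8, 8))
k[2, 5] = 1.0
resp, peak = detect_position(np.fft.fft2(k), np.ones((8, 8)))
print(peak)  # (2, 5)
```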
Further, designing the adaptive target response map ŷ and training the position-filter and scale-filter models in step 9 proceeds as follows:
(9.1) In frame t, sample m positions within a set range of the previous frame's target position;
(9.2) Compute the correlation-filter response map at each of the m positions and take the maximum of each map;
(9.3) Fill the m corresponding positions of the target response map ŷ with these maxima, and fill the remaining positions by Gaussian interpolation;
(9.4) Train the model with the following formula:
where α̂ is the resulting correlation-filter model, k̂^{xx} is the kernel auto-correlation of the feature map x̂, ^ denotes the DFT, ⊙ denotes element-wise multiplication of two matrices, and ξ and λ are regularization parameters, taken as 0.01 and 0.001 respectively.
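Steps (9.1)–(9.3) build the adaptive response map by planting the measured response maxima at the m sampled positions and filling everywhere else with a Gaussian shape. A sketch under the assumption that the Gaussian fill is centred on the strongest sample (the patent only says "Gaussian interpolation"; the helper name and the centring rule are illustrative):

```python
import numpy as np

def adaptive_response(shape, peaks, s=2.0):
    """Steps (9.1)-(9.3): plant the measured response maxima at the m
    sampled positions, fill every other position with a Gaussian shape
    centred on the strongest sample (centring rule is an assumption).

    shape: (M, N); peaks: list of ((row, col), max_value) pairs
    """
    yy, xx = np.ogrid[: shape[0], : shape[1]]
    (r0, c0), _ = max(peaks, key=lambda p: p[1])       # strongest sample
    y = np.exp(-((yy - r0) ** 2 + (xx - c0) ** 2) / (2.0 * s ** 2))
    for (r, c), v in peaks:                            # the m samples
        y[r, c] = v
    return y

y_map = adaptive_response((16, 16),
                          [((8, 8), 1.0), ((8, 10), 0.4), ((6, 8), 0.3)])
print(y_map.shape)  # (16, 16)
```

The resulting ŷ then replaces the fixed Gaussian response when training the filter in (9.4).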
Further, the linear-interpolation update of the feature map x̂ and the correlation filter α̂ in step 11 uses the formulas
x̂_t = (1 − η) x̂_{t−1} + η x̂′_t,  α̂_t = (1 − η) α̂_{t−1} + η α̂′_t
where x̂_{t−1} and α̂_{t−1} are the feature map and correlation filter of the previous frame, x̂′_t and α̂′_t are the ones newly computed in the current frame, and η is the learning rate, taken as 0.02.
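The update is a one-line blend applied identically to x̂ and α̂:

```python
import numpy as np

def linear_update(old, new, eta=0.02):
    """Step 11: blend the previous model (feature map x_hat or filter
    alpha_hat) with the newly computed one; learning rate eta = 0.02."""
    return (1.0 - eta) * old + eta * new

updated = linear_update(np.zeros(4), np.ones(4))
print(updated)  # [0.02 0.02 0.02 0.02]
```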
Compared with the prior art, the present invention has notable advantages: (1) it combines the histogram of oriented gradients, which reflects the structural information of the target, with color features, which capture its appearance; fusing these two complementary features effectively strengthens the discriminative power of the model and improves the stability of tracking; (2) it uses an adaptive scale estimation method that is fast and accurate and can be incorporated into any discriminative tracking framework; (3) it uses an adaptive target-response design that combines the appearance and motion information of the target to produce a more realistic target response, so that the trained correlation filter effectively avoids detection errors.
Brief description of the drawings
Fig. 1 is the flowchart of the target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering according to the present invention.
Fig. 2 is a schematic diagram of the fusion of the histogram of oriented gradients with color features.
Fig. 3 is a schematic diagram of the adaptive scale estimation method.
Fig. 4 is a schematic diagram of the adaptive target-response design technique.
Fig. 5 shows the evaluation results of the present invention on standard visual tracking benchmarks, where (a) is the precision plot on the OTB50 dataset, (b) is the success plot on OTB50, (c) is the precision plot on OTB100, and (d) is the success plot on OTB100.
Fig. 6 shows target tracking results of the present invention on real videos, where (a) is the result on the Human test video of the OTB100 dataset, (b) on the CarScale test video of OTB100, (c) on the Jogging test video of OTB50, and (d) on the Jogging test video of OTB50.
Embodiments
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments. The target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering is broadly divided into four parts: first, multi-feature extraction and fusion; second, target detection, including position estimation and scale estimation; third, training the model from the currently detected target position and scale; fourth, updating the model by simple linear interpolation. With reference to Fig. 1, the specific steps are as follows:
Step 1, input frame t; if t = 1, go to step 6, otherwise go to the next step;
Step 2, from the target position p_{t−1} and scale s_{t−1} tracked in frame t−1, obtain the candidate region z_t of the target's motion;
Step 3, extract the histogram of oriented gradients (HOG) and color features of the candidate region z_t, fuse the two features, then apply the fast Fourier transform (FFT) to obtain the feature map ẑ_t, where ^ denotes the discrete Fourier transform (DFT);
Step 4, from the target's feature map x̂_{t−1} of the previous frame, compute the kernel cross-correlation k̂^{xz};
Step 5, locate the maxima of the output response maps of the position filter and the scale filter, respectively, to determine the target's position p_t and scale s_t in the current frame;
Step 6, from the target position p_t and scale s_t of frame t, obtain the target region x_t;
Step 7, extract the HOG and color features of the target region x_t, fuse the two features, then apply the FFT to obtain the feature map x̂_t;
Step 8, from the feature map x̂_t, compute the kernel auto-correlation k̂^{xx};
Step 9, design the adaptive target response map ŷ and train the position-filter and scale-filter models;
Step 10, if t = 1, go to step 12, otherwise go to step 11;
Step 11, update the feature map x̂ and the correlation filter α̂ by linear interpolation, then go to step 12;
Step 12, output the tracking result, set t = t + 1, and return to step 1 to track the next frame.
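The twelve steps above form a simple per-frame control loop: frame 1 only initializes the model, and every later frame detects with the previous model, retrains, and blends old and new models. A minimal control-flow sketch with caller-supplied stand-in helpers (all names and the toy numeric instantiation are illustrative, not from the patent):

```python
def track_sequence(frames, detect, train, update, init_state):
    """Per-frame control flow of steps 1-12: frame 1 skips detection and
    trains an initial model; later frames detect with the previous model
    (steps 2-5), retrain (steps 6-9), and blend models (steps 10-11)."""
    model, results = None, []
    for t, frame in enumerate(frames, start=1):
        state = init_state if t == 1 else detect(frame, model)
        new_model = train(frame, state)
        model = new_model if t == 1 else update(model, new_model)
        results.append(state)                      # step 12: output
    return results

# Toy stand-ins: the "state" is a single number and the "model" just
# remembers it; the update blends with learning rate eta = 0.02.
states = track_sequence(
    frames=[10, 11, 13],
    detect=lambda frame, model: model,
    train=lambda frame, state: state,
    update=lambda old, new: (1 - 0.02) * old + 0.02 * new,
    init_state=10,
)
print(states)
```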
As shown in Fig. 2, the multi-feature fusion mechanism of the target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering is illustrated. The method is characterized by its multi-feature fusion. In recent years the histogram of oriented gradients has performed well in target detection and tracking; it mainly reflects the structural information of the target. Color features stand out in fields such as image retrieval, target detection, and target recognition; combining them with the gradient histogram strengthens the discriminative power of the method. The fusion of the two features described in steps 3 and 7 is as follows:
(3.1) From an image region I of size 4M×4N, extract the histogram of oriented gradients over 9 orientations with a cell size of 4×4; after principal component analysis dimensionality reduction, this yields a 31-dimensional feature map of size M×N;
(3.2) Scale the image region I of size 4M×4N down to M×N and extract an 11-dimensional color feature;
(3.3) Fuse the features extracted in (3.1) and (3.2) to obtain a 42-dimensional feature map of size M×N.
The kernel cross-correlation described in step 4 and the kernel auto-correlation described in step 8 are computed as follows:
(4.1) A Gaussian kernel is used:
k(x, x′) = exp(−‖x − x′‖² / σ²)
where k(x, x′) is the Gaussian kernel evaluated on two feature maps x and x′, exp(·) is the exponential function, σ is the standard deviation of the Gaussian, taken as 0.5, and ‖·‖ is the 2-norm of a vector or matrix;
(4.2) The kernel correlation is computed as:
k^{xx′} = exp(−(1/σ²)(‖x‖² + ‖x′‖² − 2F⁻¹(Σ_c x̂_c* ⊙ x̂′_c)))
where k^{xx′} is the kernel correlation of the feature maps x and x′, c indexes the feature channels, F⁻¹(·) is the inverse DFT, * denotes the complex conjugate, ^ denotes the DFT, and ⊙ denotes element-wise multiplication of two matrices.
As shown in Fig. 3, the adaptive scale estimation method of the target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering is illustrated. The method is characterized by adaptive scale estimation. Traditional kernelized correlation filtering keeps the target's template size fixed and cannot handle changes of the target's scale, which easily leads to tracking failure. The present invention proposes an adaptive scale estimation method: an independent scale filter is trained, and the scale is estimated as the one at which the scale filter's correlation response is maximal. Implemented with the FFT, the method is simple and efficient and can be integrated into traditional discriminative tracking methods. Locating the maxima of the output response maps of the position filter and the scale filter in step 5 to determine the target's position p_t and scale s_t in the current frame proceeds as follows:
(5.1) In frame t, extract the candidate region z_{t,trans} for position estimation at position p_{t−1} and scale s_{t−1};
(5.2) Extract the feature map ẑ_{t,trans} of the candidate region z_{t,trans};
(5.3) Compute the position filter's output response map f_{t,trans} with the formula
f_{t,trans} = F⁻¹(k̂^{xz} ⊙ α̂)
where f_t is the output response map of the position filter, k̂^{xz} is the kernel cross-correlation of the feature maps x̂_{t−1} and ẑ_{t,trans}, α̂ is the position filter trained and updated in the previous frame, F⁻¹(·) is the inverse DFT, ^ denotes the DFT, and ⊙ denotes element-wise multiplication of two matrices;
(5.4) The target position p_t detected in frame t is the position of the maximum of the response map f_{t,trans};
(5.5) In frame t, extract the candidate region z_{t,scale} for scale estimation at position p_t and scale s_{t−1}, and build a scale pyramid;
(5.6) Compute the scale filter's output response map f_{t,scale};
(5.7) The target scale s_t detected in frame t is the scale corresponding to the maximum of the response map f_{t,scale}.
As shown in Fig. 4, the adaptive target-response design method of the target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering is illustrated. The method is characterized by adaptive target-response design. In traditional kernelized correlation filtering the target response is fixed: it is generated by a Gaussian function centered on the target position of the first frame, so once the detection stage makes an error, the model-update mechanism propagates that error until tracking fails. To solve this problem, the present invention adopts an adaptive target-response design in which the target response changes in every frame, combining the appearance and motion information of the target and improving the robustness of tracking. Designing the adaptive target response map ŷ and training the position-filter and scale-filter models in step 9 proceeds as follows:
(9.1) In frame t, sample m positions within a set range of the previous frame's target position;
(9.2) Compute the correlation-filter response map at each of the m positions and take the maximum of each map;
(9.3) Fill the m corresponding positions of the target response map ŷ with these maxima, and fill the remaining positions by Gaussian interpolation;
(9.4) Train the model with the following formula:
where α̂ is the resulting correlation-filter model, k̂^{xx} is the kernel auto-correlation of the feature map x̂, ^ denotes the DFT, ⊙ denotes element-wise multiplication of two matrices, and ξ and λ are regularization parameters; to prevent the trained model from overfitting, ξ and λ are taken as 0.01 and 0.001 respectively in the present invention.
As shown in Fig. 5, the evaluation results of the present invention on the standard visual tracking datasets OTB50 and OTB100 are illustrated, where (a) is the precision plot on the OTB50 dataset, (b) is the success plot on OTB50, (c) is the precision plot on OTB100, and (d) is the success plot on OTB100. The OTB50 dataset has 50 video sequences with 29,000 frames in total; OTB100 has 100 video sequences with 58,897 frames in total; every frame is annotated with the target. Two evaluation metrics are used: precision and success rate. In the precision plots (a) and (c), precision is defined as the percentage of frames, out of all evaluated frames, in which the distance between the detected position and the annotated position is no more than 20 pixels. In the success plots (b) and (d), the overlap ratio is the overlapping area (intersection) of the detected and annotated bounding boxes divided by the total area (union), and the success rate is the percentage of frames, out of all evaluated frames, whose overlap ratio exceeds 50%. The evaluation results show that the present invention performs well on the target tracking task.
As shown in Fig. 6, the target tracking results of the present invention are compared with several outstanding recent algorithms on real videos, where (a) is the result on the Human test video of the OTB100 dataset, (b) on the CarScale test video of OTB100, (c) on the Jogging test video of OTB50, and (d) on the Jogging test video of OTB50. All in all, the present invention tracks well: owing to the fused color and HOG features, the scale estimation mechanism, and the adaptive target-response mechanism, it can track the target accurately under adverse conditions such as occlusion, scale change, target deformation, and fast target motion.
Claims (6)
1. A target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering, characterized by comprising the following steps:
Step 1, input frame t; if t = 1, go to step 6, otherwise go to the next step;
Step 2, from the target position p_{t−1} and scale s_{t−1} tracked in frame t−1, obtain the candidate region z_t of the target's motion;
Step 3, extract the histogram of oriented gradients (HOG) and color features of the candidate region z_t, fuse the two features, then apply the fast Fourier transform (FFT) to obtain the feature map ẑ_t, where ^ denotes the discrete Fourier transform (DFT);
Step 4, from the target's feature map x̂_{t−1} of the previous frame, compute the kernel cross-correlation k̂^{xz};
Step 5, locate the maxima of the output response maps of the position filter and the scale filter, respectively, to determine the target's position p_t and scale s_t in the current frame;
Step 6, from the target position p_t and scale s_t of frame t, obtain the target region x_t;
Step 7, extract the HOG and color features of the target region x_t, fuse the two features, then apply the FFT to obtain the feature map x̂_t;
Step 8, from the feature map x̂_t, compute the kernel auto-correlation k̂^{xx};
Step 9, design the adaptive target response map ŷ and train the position-filter and scale-filter models;
Step 10, if t = 1, go to step 12, otherwise go to step 11;
Step 11, update the feature map x̂ and the correlation filter α̂ by linear interpolation, then go to step 12;
Step 12, output the tracking result, set t = t + 1, and return to step 1 to track the next frame.
2. The target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering according to claim 1, characterized in that the fusion of the two features in steps 3 and 7 is as follows:
(3.1) From an image region I of size 4M×4N, extract the histogram of oriented gradients over 9 orientations with a cell size of 4×4; after principal component analysis dimensionality reduction, this yields a 31-dimensional feature map of size M×N;
(3.2) Scale the image region I of size 4M×4N down to M×N and extract an 11-dimensional color feature;
(3.3) Fuse the features extracted in (3.1) and (3.2) to obtain a 42-dimensional feature map of size M×N.
3. The target tracking method based on adaptive multi-feature fusion and kernelized correlation filtering according to claim 1, characterized in that the kernel cross-correlation of step 4 and the kernel auto-correlation of step 8 are computed as follows:
(4.1) A Gaussian kernel is used:
k(x, x′) = exp(−‖x − x′‖² / σ²)
where k(x, x′) is the Gaussian kernel evaluated on two feature maps x and x′, exp(·) is the exponential function, σ is the standard deviation of the Gaussian, taken as 0.5, and ‖·‖ is the 2-norm of a vector or matrix;
(4.2) The kernel correlation is computed as:
k^{xx′} = exp(−(1/σ²)(‖x‖² + ‖x′‖² − 2F⁻¹(Σ_c x̂_c* ⊙ x̂′_c)))
where k^{xx′} is the kernel correlation of the feature maps x and x′, c indexes the feature channels, F⁻¹(·) is the inverse DFT, * denotes the complex conjugate, ^ denotes the DFT, and ⊙ denotes element-wise multiplication of two matrices.
4. The target tracking method based on adaptive multi-feature fusion and kernel correlation filtering according to claim 1, characterized in that, in step 5, the correlation response maps of the position filter and the scale filter are computed, the locations of their maxima are found, and the target position p_t and scale s_t in the current frame are determined as follows:
(5.1) in frame t, extract the candidate region z_{t,trans} for position estimation at position p_{t−1} and scale s_{t−1};
(5.2) extract the feature map ẑ_{t,trans} of the candidate region z_{t,trans};
(5.3) compute the correlation response map f_{t,trans} of the position filter:
f_{t,trans} = 𝓕⁻¹(k̂^{xz} ⊙ α̂_{t−1})
where f_{t,trans} is the response map output by the position filter, k^{xz} is the kernel cross-correlation of the template feature map and ẑ_{t,trans}, α_{t−1} is the position filter trained and updated in the previous frame, 𝓕⁻¹(·) is the inverse discrete Fourier transform, ^ denotes the discrete Fourier transform, and ⊙ denotes the element-wise product of two matrices;
(5.4) the target position p_t detected in frame t is the location of the maximum of the response map f_{t,trans};
(5.5) in frame t, extract the candidate region z_{t,scale} for scale estimation at position p_t and scale s_{t−1}, and build a scale pyramid;
(5.6) compute the correlation response map f_{t,scale} of the scale filter;
(5.7) the target scale s_t detected in frame t is the scale corresponding to the maximum of the response map f_{t,scale}.
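Steps (5.3)–(5.4) — applying the trained filter in the Fourier domain and taking the argmax of the response map — can be sketched as follows (a minimal single-channel illustration; the names are assumptions, not the patent's):

```python
import numpy as np

def detect(kxz, alpha_hat):
    """Steps 5.3-5.4: response map f = F^{-1}(k_hat ⊙ alpha_hat);
    the detected target position is the location of the maximum of f."""
    f = np.real(np.fft.ifft2(np.fft.fft2(kxz) * alpha_hat))
    row, col = np.unravel_index(np.argmax(f), f.shape)
    return (int(row), int(col)), f
```

The same detect-by-argmax step recurs in (5.6)–(5.7), applied over a scale pyramid so that the maximum is taken across scales rather than positions.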
5. The target tracking method based on adaptive multi-feature fusion and kernel correlation filtering according to claim 1, characterized in that, in step 9, the adaptive target response map y is designed and the position filter and scale filter models are trained as follows:
(9.1) in frame t, sample m positions within a set range around the target position of the previous frame;
(9.2) compute the correlation-filter response maps of the m positions, and take the maximum of each map;
(9.3) fill the m corresponding positions of the target response map y with these maxima, and fill the remaining positions by Gaussian interpolation;
(9.4) the model is trained with the following formula:
where α̂ is the resulting correlation filter model, k^{xx} is the kernel auto-correlation of the feature map x, ^ denotes the discrete Fourier transform, ⊙ denotes the element-wise product of two matrices, and ξ and λ are regularization parameters, with values 0.01 and 0.001 respectively.
6. The target tracking method based on adaptive multi-feature fusion and kernel correlation filtering according to claim 1, characterized in that, in step 11, the feature map x̂ and the correlation filter α̂ are updated by linear interpolation:
x̂_t ← (1 − η) x̂_{t−1} + η x̂_t
α̂_t ← (1 − η) α̂_{t−1} + η α̂_t
where x̂_{t−1} and α̂_{t−1} are the feature map and the correlation filter of the previous frame, respectively, and η is the learning rate, with value 0.02.
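The linear-interpolation update of claim 6 is a single blending step applied identically to the feature map and the correlation filter; a minimal sketch (names assumed):

```python
import numpy as np

def update(prev, curr, eta=0.02):
    """Claim 6: model update by linear interpolation with learning rate eta,
    keeping (1 - eta) of the previous model and blending in eta of the
    current-frame estimate."""
    return (1.0 - eta) * prev + eta * curr
```

A small η makes the model change slowly, which stabilises tracking against occlusion and noise at the cost of slower adaptation to appearance change.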
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710355503.4A CN107316316A (en) | 2017-05-19 | 2017-05-19 | The method for tracking target that filtering technique is closed with nuclear phase is adaptively merged based on multiple features |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107316316A true CN107316316A (en) | 2017-11-03 |
Family
ID=60181486
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710355503.4A Pending CN107316316A (en) | 2017-05-19 | 2017-05-19 | The method for tracking target that filtering technique is closed with nuclear phase is adaptively merged based on multiple features |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107316316A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015163830A1 (en) * | 2014-04-22 | 2015-10-29 | Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi | Target localization and size estimation via multiple model learning in visual tracking |
CN106570486A (en) * | 2016-11-09 | 2017-04-19 | 华南理工大学 | Kernel correlation filtering target tracking method based on feature fusion and Bayesian classification |
CN106651913A (en) * | 2016-11-29 | 2017-05-10 | 开易(北京)科技有限公司 | Target tracking method based on correlation filtering and color histogram statistics and ADAS (Advanced Driving Assistance System) |
Non-Patent Citations (1)
Title |
---|
Zhang Lei: "Research on Real-Time Target Tracking Algorithms and Implementation Technology in Complex Scenes", China Doctoral Dissertations Full-text Database, Information Science and Technology Series *
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108053425A (en) * | 2017-12-25 | 2018-05-18 | 北京航空航天大学 | A kind of high speed correlation filtering method for tracking target based on multi-channel feature |
CN108257153A (en) * | 2017-12-29 | 2018-07-06 | 中国电子科技集团公司第二十七研究所 | A kind of method for tracking target based on direction gradient statistical nature |
CN108288062A (en) * | 2017-12-29 | 2018-07-17 | 中国电子科技集团公司第二十七研究所 | A kind of method for tracking target based on core correlation filtering |
CN108257153B (en) * | 2017-12-29 | 2021-09-07 | 中国电子科技集团公司第二十七研究所 | Target tracking method based on direction gradient statistical characteristics |
CN108364305A (en) * | 2018-02-07 | 2018-08-03 | 福州大学 | Vehicle-mounted pick-up video target tracking method based on modified DSST |
CN108364305B (en) * | 2018-02-07 | 2021-05-18 | 福州大学 | Vehicle-mounted camera video target tracking method based on improved DSST |
CN108647694B (en) * | 2018-04-24 | 2021-04-16 | 武汉大学 | Context-aware and adaptive response-based related filtering target tracking method |
CN108647694A (en) * | 2018-04-24 | 2018-10-12 | 武汉大学 | Correlation filtering method for tracking target based on context-aware and automated response |
CN108846851B (en) * | 2018-04-25 | 2020-07-28 | 河北工业职业技术学院 | Moving target tracking method and terminal equipment |
CN108876818A (en) * | 2018-06-05 | 2018-11-23 | 国网辽宁省电力有限公司信息通信分公司 | A kind of method for tracking target based on like physical property and correlation filtering |
CN109034193A (en) * | 2018-06-20 | 2018-12-18 | 上海理工大学 | Multiple features fusion and dimension self-adaption nuclear phase close filter tracking method |
CN109064497B (en) * | 2018-07-16 | 2021-11-23 | 南京信息工程大学 | Video tracking method based on color clustering supplementary learning |
CN109064497A (en) * | 2018-07-16 | 2018-12-21 | 南京信息工程大学 | A kind of video tracing method based on color cluster accretion learning |
CN109035290A (en) * | 2018-07-16 | 2018-12-18 | 南京信息工程大学 | A kind of track algorithm updating accretion learning based on high confidence level |
CN110751670B (en) * | 2018-07-23 | 2022-10-25 | 中国科学院长春光学精密机械与物理研究所 | Target tracking method based on fusion |
CN110751670A (en) * | 2018-07-23 | 2020-02-04 | 中国科学院长春光学精密机械与物理研究所 | Target tracking method based on fusion |
CN109285179A (en) * | 2018-07-26 | 2019-01-29 | 昆明理工大学 | A kind of motion target tracking method based on multi-feature fusion |
CN109035302A (en) * | 2018-07-26 | 2018-12-18 | 中国人民解放军陆军工程大学 | Target tracking algorism based on the perceptually relevant filtering of space-time |
CN109285179B (en) * | 2018-07-26 | 2021-05-14 | 昆明理工大学 | Moving target tracking method based on multi-feature fusion |
CN109410246A (en) * | 2018-09-25 | 2019-03-01 | 深圳市中科视讯智能系统技术有限公司 | The method and device of vision tracking based on correlation filtering |
CN109410246B (en) * | 2018-09-25 | 2021-06-11 | 杭州视语智能视觉系统技术有限公司 | Visual tracking method and device based on correlation filtering |
CN109461172A (en) * | 2018-10-25 | 2019-03-12 | 南京理工大学 | Manually with the united correlation filtering video adaptive tracking method of depth characteristic |
CN109670410A (en) * | 2018-11-29 | 2019-04-23 | 昆明理工大学 | A kind of fusion based on multiple features it is long when motion target tracking method |
CN110211149A (en) * | 2018-12-25 | 2019-09-06 | 湖州云通科技有限公司 | A kind of dimension self-adaption nuclear phase pass filter tracking method based on context-aware |
CN110211149B (en) * | 2018-12-25 | 2022-08-12 | 湖州云通科技有限公司 | Scale self-adaptive kernel correlation filtering tracking method based on background perception |
CN109886996A (en) * | 2019-01-15 | 2019-06-14 | 东华大学 | A kind of visual pursuit optimization method |
CN109886996B (en) * | 2019-01-15 | 2023-06-06 | 东华大学 | Visual tracking optimization method |
CN109858415A (en) * | 2019-01-21 | 2019-06-07 | 东南大学 | The nuclear phase followed suitable for mobile robot pedestrian closes filtered target tracking |
CN109949342A (en) * | 2019-03-15 | 2019-06-28 | 中国科学院福建物质结构研究所 | The complementary study method for real time tracking of adaptive fusion based on destination probability model |
CN109949342B (en) * | 2019-03-15 | 2022-07-15 | 中国科学院福建物质结构研究所 | Self-adaptive fusion complementary learning real-time tracking method based on target probability model |
CN110033006A (en) * | 2019-04-04 | 2019-07-19 | 中设设计集团股份有限公司 | Vehicle detecting and tracking method based on color characteristic Nonlinear Dimension Reduction |
CN110472607A (en) * | 2019-08-21 | 2019-11-19 | 上海海事大学 | A kind of ship tracking method and system |
CN112598711A (en) * | 2020-12-25 | 2021-04-02 | 南京信息工程大学滨江学院 | Hyperspectral target tracking method based on joint spectrum dimensionality reduction and feature fusion |
CN112598711B (en) * | 2020-12-25 | 2022-12-20 | 南京信息工程大学滨江学院 | Hyperspectral target tracking method based on joint spectrum dimensionality reduction and feature fusion |
CN113327273A (en) * | 2021-06-15 | 2021-08-31 | 中国人民解放军火箭军工程大学 | Infrared target tracking method based on variable window function correlation filtering |
CN113327273B (en) * | 2021-06-15 | 2023-12-19 | 中国人民解放军火箭军工程大学 | Infrared target tracking method based on variable window function correlation filtering |
CN113298851A (en) * | 2021-07-07 | 2021-08-24 | 沈阳航空航天大学 | Target image tracking method based on multi-scale and multi-feature |
CN113298851B (en) * | 2021-07-07 | 2023-09-26 | 沈阳航空航天大学 | Target image tracking method based on multi-scale multi-feature |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107316316A (en) | The method for tracking target that filtering technique is closed with nuclear phase is adaptively merged based on multiple features | |
CN107154024A (en) | Dimension self-adaption method for tracking target based on depth characteristic core correlation filter | |
Li et al. | Density map guided object detection in aerial images | |
CN110009679B (en) | Target positioning method based on multi-scale feature convolutional neural network | |
Uhrig et al. | Sparsity invariant cnns | |
CN106952288B (en) | Based on convolution feature and global search detect it is long when block robust tracking method | |
CN108665481A (en) | Multilayer depth characteristic fusion it is adaptive resist block infrared object tracking method | |
CN107330357A (en) | Vision SLAM closed loop detection methods based on deep neural network | |
CN109461172A (en) | Manually with the united correlation filtering video adaptive tracking method of depth characteristic | |
CN105741316A (en) | Robust target tracking method based on deep learning and multi-scale correlation filtering | |
CN112597985B (en) | Crowd counting method based on multi-scale feature fusion | |
CN106570893A (en) | Rapid stable visual tracking method based on correlation filtering | |
CN108961308B (en) | Residual error depth characteristic target tracking method for drift detection | |
Ji et al. | Parallel fully convolutional network for semantic segmentation | |
CN109993095A (en) | A kind of other characteristic aggregation method of frame level towards video object detection | |
CN110188708A (en) | A kind of facial expression recognizing method based on convolutional neural networks | |
CN110503081A (en) | Act of violence detection method, system, equipment and medium based on inter-frame difference | |
CN110457515A (en) | The method for searching three-dimension model of the multi-angle of view neural network of polymerization is captured based on global characteristics | |
CN102034267A (en) | Three-dimensional reconstruction method of target based on attention | |
CN110826389A (en) | Gait recognition method based on attention 3D frequency convolution neural network | |
Lan et al. | Coherence-aware context aggregator for fast video object segmentation | |
Zhao et al. | Self-generated defocus blur detection via dual adversarial discriminators | |
Xiao et al. | MeMu: Metric correlation Siamese network and multi-class negative sampling for visual tracking | |
CN110348492A (en) | A kind of correlation filtering method for tracking target based on contextual information and multiple features fusion | |
CN109146925A (en) | Conspicuousness object detection method under a kind of dynamic scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20171103 |