CN109785366A - Correlation-filtering target tracking method for occlusion - Google Patents
Correlation-filtering target tracking method for occlusion
- Publication number: CN109785366A (application CN201910052347.3A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
The present invention relates to a correlation-filtering target tracking method for occlusion. Step 1: for a video sequence to be tracked, given the target position and size in frame t, determine the search region, extract features, and compute the weight map of frame t. Step 2: train the correlation filter of frame t based on the obtained weight map. Step 3: use the trained correlation filter to compute the target response map of frame t+1 and locate the target in frame t+1. Step 4: based on the frame t+1 target position, compute the high-confidence APSR score and decide whether the correlation filter of frame t is updated.
Description
Technical field
The present invention relates to a correlation-filtering target tracking method for occlusion, and belongs to the fields of pattern recognition and computer vision.
Background art
With the rapid development of computer vision, visual tracking has been widely applied in many computer vision tasks, such as video surveillance, human-computer interaction, and perception systems for autonomous driving. Given the true position of the target in the first frame, a tracker localizes the target of interest throughout the video sequence. Although visual tracking methods have made great progress, many challenges remain, such as deformation, occlusion, out-of-view motion, scale variation, and in-plane rotation [1].
In recent years, discriminative tracking methods have attracted great attention. They treat target tracking as binary classification between the target and the background in the video. Many discriminative methods are based on machine learning; among them, kernelized correlation filtering (KCF) [2] is the most popular thanks to its computational efficiency and outstanding tracking performance. However, the standard correlation filter suffers from boundary effects: it generates unrealistic training negatives and may learn an overfitted filter, so it cannot cope well with challenges such as deformation and occlusion, which increases the risk of tracking failure. Much work aims to alleviate the boundary effects of correlation filtering. SRDCF [3] (spatially regularized discriminative correlation filter) introduces a spatial regularization window over a region five times the target size, which penalizes filter values outside the target bounding box; this suppresses many background samples, so SRDCF tracks better than KCF. However, its regularization is fixed throughout tracking, so the method cannot adapt well to shape changes of the target. In addition, CSR-DCF [4] constructs a binary segmentation matrix from a color-histogram model, assigning larger weights to the true target region while suppressing background pixels, so the trained correlation filter focuses increasingly on the true target region. However, the binary segmentation matrix obtained from color histograms is not always accurate; especially under occlusion and illumination variation, a low-confidence segmentation matrix severely interferes with the tracker and causes tracking failure.
Article [2] proposes the classic KCF tracking pipeline, following the popular tracking-by-detection paradigm [5]. The general idea of KCF is: given one training positive sample, exploit the property of circulant matrices to generate a large number of virtual negative samples and use them to train the correlation filter. By the property of circulant matrices, the DCF method converts time-consuming spatial correlation into fast element-wise operations in the Fourier domain.
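The circulant-shift trick behind KCF can be illustrated in a few lines (NumPy assumed; this sketch shows only the Fourier-domain correlation identity, not the full kernelized ridge regression of [2]):

```python
import numpy as np

def circular_correlation(f, z):
    """Correlate filter f with every cyclic shift of signal z at once:
    element-wise multiplication in the Fourier domain, O(n log n)
    instead of the O(n^2) spatial correlation."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(f)) * np.fft.fft(z)))

f = np.array([1.0, 0.0, 0.0, 0.0])   # toy filter
z = np.array([4.0, 3.0, 2.0, 1.0])   # toy signal
resp = circular_correlation(f, z)

# naive reference: response at shift k is sum_i f[i] * z[(i + k) % n]
n = len(z)
naive = [sum(f[i] * z[(i + k) % n] for i in range(n)) for k in range(n)]
```

Both computations agree, which is exactly why DCF training and detection over all cyclic shifts cost only a few FFTs.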
Article [6] proposes the HOG (Histogram of Oriented Gradients) descriptor. HOG builds features by computing and accumulating histograms of gradient orientations over local image regions. The gradient or edge orientation distribution describes the appearance and shape of the target well, so HOG features are widely used in target detection and tracking.
Article [7] proposes the CN (Color Names) descriptor. CN divides the colors a target may present into 11 classes: black, blue, brown, grey, green, orange, pink, purple, red, white, and yellow. By an adaptive algorithm based on the idea of PCA (principal component analysis), the more salient colors of each pixel in the target region are selected, and the 11-dimensional color feature is reduced to 2 dimensions.
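The 11-to-2 reduction can be sketched with plain PCA (a generic sketch: the adaptive color-selection algorithm of [7] is not reproduced here, and the random matrix merely stands in for real color-name scores):

```python
import numpy as np

def pca_reduce(X, k=2):
    """Project the rows of X (n samples x 11 color-name scores)
    onto the k principal components with the largest variance."""
    Xc = X - X.mean(axis=0)                    # center the data
    cov = Xc.T @ Xc / (len(X) - 1)             # 11 x 11 covariance
    vals, vecs = np.linalg.eigh(cov)           # eigh: ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # top-k components, descending
    return Xc @ top

rng = np.random.default_rng(1)
X = rng.random((50, 11))        # stand-in for 50 pixels' CN scores
Z = pca_reduce(X, k=2)
```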
Article [4] proposes a spatially aware correlation-filtering tracking algorithm. It uses color histograms to generate a weight matrix that classifies pixels inside the tracked region as target or background. The algorithm first extracts features from the previous frame's tracking result (usually a rectangle) and computes color histograms, then incorporates the generated weight matrix into the traditional KCF tracking algorithm to obtain the trained filter, which localizes the best target position within the current frame's search region.
In summary, designing a tracking algorithm that runs in real time, copes with various external disturbances, and delivers tracking quality that meets practical demands remains very difficult, and no relevant solution has been reported so far.
[1] Wang Shifeng, Dai Xiang, Xu Nin, and Zhang Pengfei, "A survey of environment perception technology for driverless cars," Journal of Changchun University of Science and Technology (Natural Science Edition), vol. 40, no. 1, pp. 1-6, 2017.
[2] J. F. Henriques, R. Caseiro, P. Martins, and J. Batista, "High-speed tracking with kernelized correlation filters," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, no. 3, pp. 583-596, 2015.
[3] M. Danelljan, G. Hager, F. Shahbaz Khan, and M. Felsberg, "Learning spatially regularized correlation filters for visual tracking," in Proceedings of the IEEE International Conference on Computer Vision, 2015, pp. 4310-4318.
[4] A. Lukezic, T. Vojir, L. C. Zajc, J. Matas, and M. Kristan, "Discriminative correlation filter with channel and spatial reliability," in CVPR, 2017, vol. 1, no. 2, p. 3.
[5] Z. Kalal, K. Mikolajczyk, and J. Matas, "Tracking-learning-detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 7, p. 1409, 2012.
[6] N. Dalal and B. Triggs, "Histograms of oriented gradients for human detection," in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 2005, vol. 1, pp. 886-893.
[7] J. Van De Weijer, C. Schmid, J. Verbeek, and D. Larlus, "Learning color names for real-world applications," IEEE Transactions on Image Processing, vol. 18, no. 7, pp. 1512-1523, 2009.
[8] S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein, "Distributed optimization and statistical learning via the alternating direction method of multipliers," Foundations and Trends in Machine Learning, vol. 3, no. 1, pp. 1-122, 2011.
[9] Y. Wu, J. Lim, and M.-H. Yang, "Object tracking benchmark," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, no. 9, pp. 1834-1848, 2015.
Summary of the invention
The technical problem addressed by the present invention: to overcome the deficiencies of the prior art, a correlation-filtering target tracking method for occlusion is proposed. It achieves high tracking accuracy and good robustness, its tracking speed meets real-time requirements, and it can handle problems such as target occlusion and deformation.
The principle of the present invention: the basic idea of the proposed correlation-filtering tracker weighted by color histograms is as follows.
On the one hand, pixels with high weight values should be regarded as the target; on the other hand, pixels with low weight values are more likely background and should be suppressed, so that they do not interfere with training the correlation filter (KCF).
Compared with the CSR-DCF and SRDCF mentioned above, the invention proposes a novel spatially aware correlation filter with an adaptive weight map. The adaptive weight map of the invention combines a spatial weight map with a target likelihood map (obtained from color histograms), and reflects how likely each pixel in the search region belongs to the target.
Moreover, under occlusion the target region is polluted by background pixels; if the tracking model keeps being updated at that time, the tracker becomes contaminated, and once the target reappears in the field of view the tracker can no longer re-lock it. The invention therefore proposes a high-confidence adaptive update strategy that judges the tracking quality and decides whether the tracking model trained in the current frame is updated.
The steps of the correlation-filtering target tracking method for occlusion of the invention are as follows:
Step 1: for a video sequence to be tracked, given the target position and size in frame t, determine the search region, extract features, and compute the weight map of frame t;
Step 2: train the correlation filter of frame t based on the obtained weight map;
Step 3: use the trained correlation filter to compute the target response map of frame t+1 and locate the target in frame t+1;
Step 4: based on the frame t+1 target position, compute the high-confidence APSR score and decide whether the correlation filter of frame t is updated.
Step 1 is implemented as follows:
The weight map of frame t mentioned in step 1 consists of the target likelihood map T and the spatial weight map P.
Target likelihood map T:
Given the target position and size of the frame t image, construct the target and background color histograms H_O^t and H_B^t as follows:
H_O^t = (1 - γ)·H_O^{1:t-1} + γ·H̃_O^t,  H_B^t = (1 - γ)·H_B^{1:t-1} + γ·H̃_B^t
where γ is a fixed learning rate, H̃_O^t and H̃_B^t denote the target and background color histograms of frame t, and H_O^{1:t-1} and H_B^{1:t-1} are those of the historical frames, i.e. frames 1 to t-1. The color-histogram-based target likelihood map T is then obtained as:
T(p) = ρ(O)·H_O^t(p) / (ρ(O)·H_O^t(p) + ρ(B)·H_B^t(p))
where the priors ρ(O) and ρ(B) are the ratios of the sizes of the frame t target region and background region to the whole search region.
Spatial weight map P: the weight value decays with distance from the target center. For any pixel p_i inside the target box, the spatial weight is denoted P(p_i); computing P(p_i) for every pixel in the target box generates the final P.
The steps above yield the target likelihood map T and the spatial weight map P; the final weight map W_t of frame t is then computed as:
W_t = T + P.
Step 2 is implemented as follows:
Features are extracted from the target region of frame t, denoted x; y is a label map that follows a Gaussian distribution. The correlation filter f_t is trained with the following objective function:
ε(f) = ||f_t * x - y||² + λ·||f_t||²
where f_t = f_t ⊙ W_t and ⊙ denotes element-wise multiplication; when ε(f) reaches its minimum, the correlation filter f_t of frame t is obtained.
Step 3 is implemented as follows:
The frame t+1 image is input, and the target position must be found within the frame t+1 search region. The search region is cropped centered at the previous frame's target position and its features are extracted, denoted z_{t+1}. Then, with the correlation filter f_t of frame t obtained in step 2, the final response map S_{t+1} of frame t+1 is obtained:
S_{t+1} = F^{-1}(F(f_t) ⊙ F(z_{t+1}))
where F and F^{-1} denote the Fourier transform and its inverse, and S_{t+1} is the frame t+1 target response map.
The target position in frame t+1 is computed from the response map S_{t+1}.
Step 4 is implemented as follows:
For the frame t+1 target response map S_{t+1} obtained in step 3, the tracking quality is judged with the following APSR criterion, in which S_max and S_min denote the maximum and minimum of S_{t+1}, μ₁ and σ₁ are the mean and standard deviation of the region Ω₁ around the peak, the remaining region of S_{t+1} outside Ω₁ is denoted Ω₂, w and h are the horizontal and vertical pixel coordinates in S_{t+1}, S_{w,h} is the value of S_{t+1} at (w, h), and mean(·) is the averaging function.
By computing the APSR value, the tracking quality is assessed and it is decided whether the correlation filter of frame t is updated.
The advantages and beneficial effects of the present invention over the prior art:
(1) The invention effectively handles tracking under complex scenes such as target occlusion and deformation.
For target tracking in real scenes, the correlation filter is trained with the spatially aware adaptive weight map. The resulting filter recognizes true target pixels effectively while reducing the interference of background pixels. The filter learned in this way also has memory: when the target briefly disappears from view, the tracker judges that the target has left the search region and stops updating the model (which would otherwise be polluted by background pixels), so that when the target reappears in sight the tracker can still lock onto it. On the OTB2015 tracking benchmark [9] the method achieves 84.7% precision, improving over the KCF [2], SRDCF [3], and CSR-DCF [4] trackers by 14.8%, 5.3%, and 5% respectively.
(2) The tracking algorithm of the invention is computationally cheap.
The speed benefits on the one hand from the advantages of the KCF framework, and on the other hand from discarding complex optimization procedures in favor of training a good filter by iterative updates. Experiments show that the method processes 30 frames per second, fully meeting the requirement of real-time tracking.
Brief description of the drawings
Fig. 1 is the implementation flow chart of the method;
Fig. 2 is a schematic diagram of the frame t weight map;
Fig. 3 is an explanatory diagram of the high-confidence update strategy;
Fig. 4 shows the experimental validation.
Specific embodiment
The present invention is described in detail below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, the invention is implemented as follows:
First, given a video sequence, the target position and size in frame t are provided, and the target search region is determined and its features extracted;
then, within the target search region, the spatial weight map P and the color-histogram-based target likelihood map T are computed separately, giving the weight map W_t of frame t;
based on the obtained weight map W_t, the correlation filter of frame t is trained;
with the trained frame t correlation filter, the target response map of frame t+1 is computed and the frame t+1 target position is located;
finally, the high-confidence APSR score is computed to decide whether the correlation filter of frame t is updated.
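The flow above can be sketched as a tracking loop; the tracker interface and method names here are hypothetical stand-ins for the operations detailed in the following sections:

```python
class DummyTracker:
    """Minimal stand-in so the loop runs; each stub corresponds to one
    step of the method (hypothetical interface, not the patent's API)."""
    tau = 2.0                                            # APSR threshold
    def weight_map(self, frame, box):      return None   # step 1: W_t
    def train_filter(self, frame, box, w): return None   # step 2: f_t
    def localize(self, frame, f, box):     return box, [[3.0]]  # step 3
    def confidence(self, S):               return S[0][0]       # step 4
    def update_model(self, f):             pass

def track_sequence(frames, init_box, tracker):
    """High-level loop of steps 1-4: per frame, build the weight map,
    train the filter, localize, and gate the model update on confidence."""
    box, boxes = init_box, [init_box]
    for frame in frames[1:]:
        Wt = tracker.weight_map(frame, box)
        f_t = tracker.train_filter(frame, box, Wt)
        box, S = tracker.localize(frame, f_t, box)
        if tracker.confidence(S) > tracker.tau:   # high-confidence update
            tracker.update_model(f_t)
        boxes.append(box)
    return boxes

boxes = track_sequence(["f0", "f1", "f2"], (10, 10, 40, 40), DummyTracker())
```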
Each step of the procedure is described in detail below.
1. The color-histogram-based target likelihood map T
The frame t target state is the tracking result already obtained from the previous frame; the target position and size of frame t determine the search region. As described above, the color-histogram-based target likelihood map T is generated first. Define the target region of frame t as O_t and the surrounding background as B_t, and extract a color histogram for each of the two regions, denoted H̃_O^t and H̃_B^t for the target and background of frame t respectively. To improve the reliability of the color histograms, the target and background histograms of the historical frames (frames 1 to t-1), H_O^{1:t-1} and H_B^{1:t-1}, are also taken into account, giving H_O^t and H_B^t as follows:
H_O^t = (1 - γ)·H_O^{1:t-1} + γ·H̃_O^t,  H_B^t = (1 - γ)·H_B^{1:t-1} + γ·H̃_B^t
γ is a fixed learning rate; based on extensive repeated experiments, the invention takes γ = 0.04.
The color-histogram-based target likelihood map T is then obtained:
T(p) = ρ(O)·H_O^t(p) / (ρ(O)·H_O^t(p) + ρ(B)·H_B^t(p))
where the priors ρ(O) and ρ(B) are the ratios of the sizes of the frame t target region O_t and background region B_t to the whole search region.
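A minimal sketch of the histogram update and the likelihood map, assuming NumPy arrays and the Bayes form described above (the 2-bin toy histograms are illustrative only):

```python
import numpy as np

def update_histogram(hist_hist, hist_cur, gamma=0.04):
    """Blend the historical histogram with the current frame's
    histogram at the fixed learning rate gamma."""
    return (1.0 - gamma) * hist_hist + gamma * hist_cur

def likelihood_map(patch_bins, h_obj, h_bg, rho_obj, rho_bg, eps=1e-8):
    """Per-pixel probability that a pixel belongs to the target, by
    Bayes' rule over the target/background color histograms.
    patch_bins holds each pixel's integer color-bin index."""
    p_obj = rho_obj * h_obj[patch_bins]
    p_bg = rho_bg * h_bg[patch_bins]
    return p_obj / (p_obj + p_bg + eps)

# toy example: bin 0 is mostly target-colored, bin 1 mostly background
h_obj = np.array([0.9, 0.1])
h_bg = np.array([0.1, 0.9])
bins = np.array([[0, 0, 1],
                 [0, 1, 1]])
T = likelihood_map(bins, h_obj, h_bg, rho_obj=0.5, rho_bg=0.5)
```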
2. Generating the weight map W_t of frame t
Some prior knowledge is available: pixels inside the target region are more likely to be target; pixels at the edge of the target rectangle are more easily disturbed by background; and pixels outside the target region are generally background. The invention therefore proposes a spatial weight map P that assigns higher weights to pixels near the center of the target region; the weights of the remaining pixels inside the target rectangle decay gradually with distance from the center, and pixels outside the target region are assigned the value 0.5, leaving both possibilities open so that they may be classified as either target or background. For any pixel p_i inside the target box, the spatial weight is denoted P(p_i); computing P(p_i) for every pixel in the target box generates the final P. Here the tracking box is a rectangle, C_t is the pixel coordinate of its center, C_x is the coordinate of any other pixel in the rectangle, and d(C_t, C_x) is the distance from C_t to C_x.
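A sketch of the spatial weight map; the exact decay function is not given in this text, so a Gaussian fall-off from the box center is assumed, with the 0.5 value outside the target box as described:

```python
import numpy as np

def spatial_weight_map(H, W, box, sigma):
    """Spatial prior P over an H x W search window: weights decay with
    distance from the target-box center (Gaussian decay is an assumed
    choice), and pixels outside the box are fixed at 0.5.
    box = (x0, y0, w, h) in pixels."""
    x0, y0, w, h = box
    cy, cx = y0 + h / 2.0, x0 + w / 2.0          # box center C_t
    ys, xs = np.mgrid[0:H, 0:W]
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2         # d(C_t, C_x)^2
    P = np.exp(-d2 / (2.0 * sigma ** 2))
    inside = (xs >= x0) & (xs < x0 + w) & (ys >= y0) & (ys < y0 + h)
    return np.where(inside, P, 0.5)

P = spatial_weight_map(8, 8, box=(2, 2, 4, 4), sigma=2.0)
```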
The steps above yield the target likelihood map T and the spatial weight map P; the final weight map W_t of frame t is then computed as:
W_t = T + P.
The final weight map W_t of frame t is generated from both pixel positions and color information, so it separates target and background regions well. The effect is shown in Fig. 2: (a) is the frame t search image, with the rectangle giving the previous frame's tracking result; (b) is the color-histogram-based target likelihood map T; (c) is the spatial weight map P; (d) is the weight map W_t of frame t.
As Fig. 2 shows, in the final W_t the weights of the target region are high while those of the background region are low.
3. Training the filter
Features are extracted from the frame t target region, denoted x; the feature descriptors are the HOG and CN features introduced in the background section, and y is a label map that follows a Gaussian distribution. The invention trains the correlation filter f_t of frame t with the following objective function:
ε(f) = ||f_t * x - y||² + λ·||f_t||²
where f_t = f_t ⊙ W_t, ⊙ denotes element-wise multiplication, * denotes convolution, λ is a regularization parameter set to 0.05, ε(f) is the loss function, and W_t is the weight map of frame t obtained in step 2. ε(f) is minimized with the ADMM iterative method [8], yielding the correlation filter f_t of frame t. Under the intervention of W_t, the trained filter f_t responds only to target pixels, which greatly improves the tracking accuracy.
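As a simplified stand-in for the multi-channel ADMM solver, a single-channel closed-form ridge regression in the Fourier domain shows the role of the weight map in training (a sketch under that simplification, not the patent's full method):

```python
import numpy as np

def train_filter(x, y, Wt, lam=0.05):
    """Minimize ||f * x - y||^2 + lam * ||f||^2 per Fourier frequency
    (convolution convention). The weight map Wt masks the feature
    before training, suppressing background pixels."""
    X = np.fft.fft2(x * Wt)
    Y = np.fft.fft2(y)
    F = np.conj(X) * Y / (np.conj(X) * X + lam)   # closed-form optimum
    return np.real(np.fft.ifft2(F))

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))      # stand-in feature channel
y = np.zeros((16, 16)); y[8, 8] = 1.0  # label with a single central peak
Wt = np.ones((16, 16))                 # uniform weight map for the demo
f = train_filter(x, y, Wt)
```

Convolving the learned filter with the weighted training feature reproduces the label peak, which is the behavior the weight map steers toward target pixels.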
4. Tracking the target
Given the frame t+1 input image, the target must be found in frame t+1. The search region is cropped centered at the previous frame's target position and its features are extracted, denoted z_{t+1}. Then, with the correlation filter f_t obtained in step 3, the response map of frame t+1 is:
S_{t+1} = F^{-1}(F(f_t) ⊙ F(z_{t+1}))
where F and F^{-1} denote the Fourier transform and its inverse, and ⊙ denotes element-wise multiplication. The position of the maximum of the response map S_{t+1} is the target position in frame t+1.
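The detection step follows the formula S_{t+1} = F^{-1}(F(f_t) ⊙ F(z_{t+1})) directly (NumPy, single channel; the identity filter and toy frame are illustrative only):

```python
import numpy as np

def detect(f_t, z):
    """Step 3: compute the response map in the Fourier domain and take
    its argmax as the new target position."""
    S = np.real(np.fft.ifft2(np.fft.fft2(f_t) * np.fft.fft2(z)))
    pos = np.unravel_index(np.argmax(S), S.shape)
    return S, pos

z = np.zeros((8, 8)); z[5, 3] = 1.0      # search feature with one bright spot
f_t = np.zeros((8, 8)); f_t[0, 0] = 1.0  # identity filter: response equals z
S, pos = detect(f_t, z)
```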
5. The high-confidence update strategy
Most trackers update the filter with a fixed learning rate. But once the target is severely occluded, or even disappears from view, continuing to update the correlation filter may cause tracking failure. The invention therefore introduces a high-confidence score evaluation strategy: a confidence is computed from the response map S_{t+1} to decide whether the correlation filter should be updated. The confidence score derives mainly from the sharpness of the response peak and the smoothness of the surrounding lobes. A normal response map has one sharp peak and otherwise flat responses, indicating a reliably detected tracking target; conversely, a response map with multiple peaks indicates that the target is occluded.
In the APSR definition, S_max and S_min denote the maximum and minimum of S_{t+1}; μ₁ and σ₁ are the mean and standard deviation of the region Ω₁ around the peak; the remaining region of S_{t+1} outside Ω₁ is denoted Ω₂; w and h are the horizontal and vertical pixel coordinates in S_{t+1}, S_{w,h} is the value of S_{t+1} at (w, h), and mean(·) is the averaging function.
The APSR value assesses the tracking quality and thereby decides whether the correlation filter of frame t is updated.
As Fig. 3 shows, tracking rectangle A is the tracker with the APSR strategy of the invention, and tracking rectangle B is the tracker without the APSR update strategy. At frame 90, with no occlusion, both boxes A and B follow the target accurately. When the tracked target is occluded at frame 113, the APSR value drops from 7.92 to 1.34; the APSR strategy of the invention judges that the target is occluded and stops updating the contaminated correlation filter. At frame 135 the target reappears in view: box A (with the APSR strategy) successfully re-finds the previously lost target, while rectangle B (without the APSR strategy) has lost it completely. The APSR strategy adopted by the invention therefore copes well with the occlusion problem.
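The exact APSR formula is not reproduced in this text, so a classic peak-to-sidelobe ratio serves here as a hedged stand-in for the confidence gate (the peak-window size is an assumption):

```python
import numpy as np

def psr_confidence(S, peak_win=2):
    """Peak-to-sidelobe ratio: how sharp the response peak is relative
    to the rest of the map. High for one sharp peak (reliable target),
    low for flat or multi-modal maps (occlusion or target lost)."""
    py, px = np.unravel_index(np.argmax(S), S.shape)
    mask = np.ones_like(S, dtype=bool)
    mask[max(0, py - peak_win):py + peak_win + 1,
         max(0, px - peak_win):px + peak_win + 1] = False  # exclude peak region
    sidelobe = S[mask]
    return (S.max() - sidelobe.mean()) / (sidelobe.std() + 1e-8)

sharp = np.zeros((15, 15)); sharp[7, 7] = 1.0   # confident detection
flat = np.full((15, 15), 0.5)                   # occluded / lost target
update_ok = psr_confidence(sharp) > psr_confidence(flat)
```

A tracker would update its filter only when this score exceeds a threshold, mirroring the APSR gate of step 4.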
6. Experimental validation
The tracking performance of the invention is tested on the video sequences Girl2 and Human3. In Fig. 4, tracking box A is the tracker of the present invention, and B, C, and D are other existing tracking algorithms (the methods proposed in articles [2-4] cited in the background section). As Fig. 4 shows, in the Girl2 and Human3 sequences the target is occluded by obstacles; when the target reappears in view, only the method of the invention successfully re-detects the true target, while the other trackers all fail. This demonstrates to some extent that the tracking method of the invention copes well with the challenge of occlusion.
Although specific implementations of the invention are described above, those skilled in the art will appreciate that these are merely illustrative; numerous variations or modifications may be made to these embodiments without departing from the principle and essence of the invention, and the protection scope of the invention is therefore defined by the appended claims.
Claims (5)
1. A correlation-filtering target tracking method for occlusion, characterized in that the steps are as follows:
Step 1: for a video sequence to be tracked, given the target position and size in frame t, determine the search region, extract features, and compute the weight map of frame t;
Step 2: train the correlation filter of frame t based on the obtained weight map;
Step 3: use the trained correlation filter to compute the target response map of frame t+1 and locate the target in frame t+1;
Step 4: based on the frame t+1 target position, compute the high-confidence APSR score and decide whether the correlation filter of frame t is updated.
2. The correlation-filtering target tracking method for occlusion according to claim 1, characterized in that step 1 is implemented as follows:
the weight map of frame t mentioned in step 1 consists of the target likelihood map T and the spatial weight map P;
target likelihood map T: given the target position and size of the frame t image, construct the target and background color histograms H_O^t and H_B^t as follows:
H_O^t = (1 - γ)·H_O^{1:t-1} + γ·H̃_O^t,  H_B^t = (1 - γ)·H_B^{1:t-1} + γ·H̃_B^t
where γ is a fixed learning rate, H̃_O^t and H̃_B^t denote the target and background color histograms of frame t, and H_O^{1:t-1} and H_B^{1:t-1} are those of the historical frames, i.e. frames 1 to t-1; the color-histogram-based target likelihood map T is then obtained as:
T(p) = ρ(O)·H_O^t(p) / (ρ(O)·H_O^t(p) + ρ(B)·H_B^t(p))
where the priors ρ(O) and ρ(B) are the ratios of the sizes of the frame t target region and background region to the whole search region;
spatial weight map P: the weight value decays with distance from the target center; for any pixel p_i inside the target box, the spatial weight is denoted P(p_i), and computing P(p_i) for every pixel in the target box generates the final P;
with the target likelihood map T and spatial weight map P obtained above, the final weight map W_t of frame t is computed as:
W_t = T + P.
3. The correlation-filtering target tracking method for occlusion according to claim 1, characterized in that step 2 is implemented as follows:
features are extracted from the target region of frame t, denoted x; y is a label map that follows a Gaussian distribution; the correlation filter f_t is trained with the following objective function:
ε(f) = ||f_t * x - y||² + λ·||f_t||²
where f_t = f_t ⊙ W_t and ⊙ denotes element-wise multiplication; when ε(f) reaches its minimum, the correlation filter f_t of frame t is obtained.
4. The correlation-filtering target tracking method for occlusion according to claim 1, characterized in that step 3 is implemented as follows:
the frame t+1 image is input and the target position must be found within the frame t+1 search region; the search region is cropped centered at the previous frame's target position and its features are extracted, denoted z_{t+1}; then, with the correlation filter f_t of frame t obtained in step 2, the final response map S_{t+1} of frame t+1 is obtained:
S_{t+1} = F^{-1}(F(f_t) ⊙ F(z_{t+1}))
where F and F^{-1} denote the Fourier transform and its inverse, and S_{t+1} is the frame t+1 target response map;
the target position in frame t+1 is computed from the response map S_{t+1}.
5. The correlation-filtering target tracking method for occlusion according to claim 1, characterized in that step 4 is implemented as follows:
for the frame t+1 target response map S_{t+1} obtained in step 3, the tracking quality is judged with the following APSR criterion, in which S_max and S_min denote the maximum and minimum of S_{t+1}, μ₁ and σ₁ are the mean and standard deviation of the region Ω₁ around the peak, the remaining region of S_{t+1} outside Ω₁ is denoted Ω₂, w and h are the horizontal and vertical pixel coordinates in S_{t+1}, S_{w,h} is the value of S_{t+1} at (w, h), and mean(·) is the averaging function;
by computing the APSR value, the tracking quality is assessed and it is decided whether the correlation filter of frame t is updated.
Priority application (1)
CN201910052347.3A, filed 2019-01-21, granted as CN109785366B
Publications (2)
CN109785366A, published 2019-05-21
CN109785366B, granted 2020-12-25
Family
ID=66500899
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910052347.3A Active CN109785366B (en) | 2019-01-21 | 2019-01-21 | Related filtering target tracking method for shielding |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109785366B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110276383A (en) * | 2019-05-31 | 2019-09-24 | 北京理工大学 | A kind of nuclear phase pass filtered target localization method based on multichannel memory models |
CN110599519A (en) * | 2019-08-27 | 2019-12-20 | 上海交通大学 | Anti-occlusion related filtering tracking method based on domain search strategy |
CN110765970A (en) * | 2019-10-31 | 2020-02-07 | 北京地平线机器人技术研发有限公司 | Method and device for determining nearest obstacle, storage medium and electronic equipment |
CN111091583A (en) * | 2019-11-22 | 2020-05-01 | 中国科学技术大学 | Long-term target tracking method |
CN111260689A (en) * | 2020-01-16 | 2020-06-09 | 东华大学 | Effective confidence enhancement correlation filtering visual tracking algorithm |
CN111583306A (en) * | 2020-05-12 | 2020-08-25 | 重庆邮电大学 | Anti-occlusion visual target tracking method |
CN112598710A (en) * | 2020-12-25 | 2021-04-02 | 杭州电子科技大学 | Space-time correlation filtering target tracking method based on feature online selection |
CN117011340A (en) * | 2023-08-09 | 2023-11-07 | 北京航空航天大学 | Reconfigurable relevant filtering target tracking algorithm based on statistical color characteristics |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101694723A (en) * | 2009-09-29 | 2010-04-14 | Beihang University | Real-time moving target tracking method based on a global matching similarity function |
KR101517538B1 (en) * | 2013-12-31 | 2015-05-15 | Industry Foundation of Chonnam National University | Apparatus and method for detecting an importance region using a centroid weight mask map, and storage medium recording a program therefor |
CN106097383A (en) * | 2016-05-30 | 2016-11-09 | Hisense Group Co., Ltd. | Target tracking method and device for the occlusion problem |
JP6079076B2 (en) * | 2012-09-14 | 2017-02-15 | Oki Electric Industry Co., Ltd. | Object tracking device and object tracking method |
CN106570887A (en) * | 2016-11-04 | 2017-04-19 | Tianjin University | Adaptive Mean Shift target tracking method based on LBP features |
CN106651913A (en) * | 2016-11-29 | 2017-05-10 | Kaiyi (Beijing) Technology Co., Ltd. | Target tracking method based on correlation filtering and color histogram statistics, and ADAS (Advanced Driving Assistance System) |
CN108053419A (en) * | 2017-12-27 | 2018-05-18 | Wuhan Danwan Technology Co., Ltd. | Multi-scale target tracking based on background suppression and foreground interference resistance |
CN108734723A (en) * | 2018-05-11 | 2018-11-02 | Jiangnan University | Correlation filtering target tracking method based on adaptive weighted joint learning |
CN108764064A (en) * | 2018-05-07 | 2018-11-06 | Northwestern Polytechnical University | SAR target recognition algorithm based on steerable filters and autoencoders |
CN108776975A (en) * | 2018-05-29 | 2018-11-09 | Anhui University | Visual tracking method based on joint learning of semi-supervised features and filters |
CN109146912A (en) * | 2018-07-26 | 2019-01-04 | Hunan University of Humanities, Science and Technology | Visual target tracking method based on target analysis |
- 2019-01-21: Application CN201910052347.3A filed in China; granted as CN109785366B (status: Active)
Non-Patent Citations (7)
Title |
---|
A. Lukezic et al.: "Discriminative Correlation Filter with Channel and Spatial Reliability", 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) * |
H. Z. Wang et al.: "Efficient Visual Tracking by Probabilistic Fusion of Multiple Cues", Department of Computer Science * |
J. F. Henriques et al.: "High-speed tracking with kernelized correlation filters", IEEE Transactions on Pattern Analysis and Machine Intelligence * |
L. Bertinetto et al.: "Staple: Complementary Learners for Real-Time Tracking", 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) * |
M. Wang et al.: "Large margin object tracking with circulant feature maps", IEEE Conference on Computer Vision and Pattern Recognition (CVPR) * |
Xiong Changzhen et al.: "Kernelized correlation filter tracking algorithm with adaptive feature fusion", Journal of Computer-Aided Design & Computer Graphics * |
Zhong Guochong: "Research on single-target tracking algorithms based on correlation filtering", China Masters' Theses Full-text Database, Information Science and Technology * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110276383A (en) * | 2019-05-31 | 2019-09-24 | Beijing Institute of Technology | Kernelized correlation filter target localization method based on multi-channel memory models |
CN110599519A (en) * | 2019-08-27 | 2019-12-20 | Shanghai Jiao Tong University | Anti-occlusion correlation filtering tracking method based on a domain search strategy |
CN110599519B (en) * | 2019-08-27 | 2022-11-08 | Shanghai Jiao Tong University | Anti-occlusion correlation filtering tracking method based on a domain search strategy |
CN110765970B (en) * | 2019-10-31 | 2022-08-09 | Beijing Horizon Robotics Technology R&D Co., Ltd. | Method and device for determining the nearest obstacle, storage medium and electronic device |
CN110765970A (en) * | 2019-10-31 | 2020-02-07 | Beijing Horizon Robotics Technology R&D Co., Ltd. | Method and device for determining the nearest obstacle, storage medium and electronic device |
CN111091583A (en) * | 2019-11-22 | 2020-05-01 | University of Science and Technology of China | Long-term target tracking method |
CN111091583B (en) * | 2019-11-22 | 2022-09-06 | University of Science and Technology of China | Long-term target tracking method |
CN111260689B (en) * | 2020-01-16 | 2022-10-11 | Donghua University | Correlation filtering visual tracking method based on confidence enhancement |
CN111260689A (en) * | 2020-01-16 | 2020-06-09 | Donghua University | Correlation filtering visual tracking algorithm with effective confidence enhancement |
CN111583306A (en) * | 2020-05-12 | 2020-08-25 | Chongqing University of Posts and Telecommunications | Anti-occlusion visual target tracking method |
CN112598710A (en) * | 2020-12-25 | 2021-04-02 | Hangzhou Dianzi University | Spatio-temporal correlation filtering target tracking method based on online feature selection |
CN112598710B (en) * | 2020-12-25 | 2024-03-12 | Hangzhou Dianzi University | Spatio-temporal correlation filtering target tracking method based on online feature selection |
CN117011340A (en) * | 2023-08-09 | 2023-11-07 | Beihang University | Reconfigurable correlation filtering target tracking algorithm based on statistical color features |
Also Published As
Publication number | Publication date |
---|---|
CN109785366B (en) | 2020-12-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109785366A (en) | A correlation filtering target tracking method for occlusion | |
Mahadevan et al. | Saliency-based discriminant tracking | |
CN107016357B (en) | Video pedestrian detection method based on time domain convolutional neural network | |
CN104835178B (en) | Method for tracking and recognizing small moving targets with low signal-to-noise ratio | |
CN107748873A (en) | Multimodal target tracking method fusing background information | |
CN109344725A (en) | Online multi-pedestrian tracking method based on a spatio-temporal attention mechanism | |
CN103886325B (en) | Cyclic matrix video tracking method with partition | |
CN108268859A (en) | Facial expression recognition method based on deep learning | |
Haghbayan et al. | An efficient multi-sensor fusion approach for object detection in maritime environments | |
CN102598057A (en) | Method and system for automatic object detection and subsequent object tracking in accordance with the object shape | |
CN111754519B (en) | Adversarial method based on class activation mapping | |
CN109766823A (en) | High-resolution remote sensing ship detection method based on deep convolutional neural networks | |
CN110334703B (en) | Ship detection and identification method in day and night images | |
CN109886079A (en) | Moving vehicle detection and tracking method | |
CN105913455A (en) | Local image enhancement-based object tracking method | |
Warsi et al. | Automatic handgun and knife detection algorithms: a review | |
Liu et al. | Multi-type road marking recognition using adaboost detection and extreme learning machine classification | |
CN113129336A (en) | End-to-end multi-vehicle tracking method, system and computer readable medium | |
CN104537363A (en) | Full-automatic adjustable cupboard leg assembly control method based on visual inspection system | |
Der et al. | Probe-based automatic target recognition in infrared imagery | |
CN113205494B (en) | Infrared small target detection method and system based on adaptive scale image block weighting difference measurement | |
CN109887004A (en) | TLD-based target tracking method for unmanned surface vehicles in sea areas | |
CN110111358B (en) | Target tracking method based on multilayer time sequence filtering | |
CN112766145A (en) | Method and device for dynamic facial expression recognition with artificial neural networks | |
CN116665097A (en) | Adaptive target tracking method incorporating context awareness |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
Effective date of registration: 2022-06-08
Patentee after: SNEGRID ELECTRIC TECHNOLOGY Co.,Ltd.
Address after: Room 1701, Block C, Building 1, Zone J, Phase II, Hefei Innovation Industrial Park, No. 2800 Innovation Avenue, High-tech Zone, Hefei, Anhui, 230093
Patentee before: University of Science and Technology of China
Address before: No. 96 Jinzhai Road, Baohe District, Hefei, Anhui, 230026