CN107492112A - A target tracking method based on an unmanned aerial vehicle platform - Google Patents
A target tracking method based on an unmanned aerial vehicle platform
- Publication number: CN107492112A (application CN201710558763.1A; granted publication CN107492112B)
- Authority
- CN
- China
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20056—Discrete and fast Fourier transform, [DFT, FFT]
Abstract
The present invention relates to a target tracking method based on an unmanned aerial vehicle (UAV) platform. Positive-sample and negative-sample templates are extracted from the target region and the background region respectively, yielding a correlation filter that suppresses background interference; this effectively removes background clutter and improves tracking accuracy. To improve tracking speed, the invention corrects the correlation-filtering result with integral-feedback and feedforward strategies, which allows a smaller tracking window and avoids multi-step iterative computation.
Description
Technical field
The invention belongs to the field of image processing and computer vision, and relates to a target tracking method based on an unmanned aerial vehicle (UAV) platform.
Background art
With the development of modern science and technology, the performance of UAV systems is improving steadily. UAVs have broad application prospects in fields such as surveying and mapping, aerial photography, traffic monitoring, and security patrol. Fast tracking of moving targets with a UAV vision system is therefore of great practical significance.
Correlation tracking is a commonly used fast tracking method: a target template is chosen, and the correlation coefficient between the image and the template yields the position of the target in the image. Such methods generally use only the target template as a positive sample, ignoring information, such as the background, that could serve as negative samples. When the target moves too fast and partly leaves the tracking window, iterative computation is required, and several iterations per frame are needed for the process to converge. Since computing resources on a UAV platform are limited and real-time requirements are strict, improving and simplifying this multi-step iterative computation is a demanding problem.
The paper "Spatio-temporal context tracking based on locality-sensitive histograms" (Sensors and Microsystems, 2017, Vol. 36(1), pp. 149-156) discloses a target tracking method based on spatio-temporal context. The method performs feature extraction and correlation filtering on the image of the target region only. Because only the target region is analyzed as a positive sample, the information provided by the background image is not exploited, so tracking easily fails against complex backgrounds.
The paper "Long-term visual target tracking based on correlation filters" (Computer Applications, 2017, Vol. 7(5), pp. 1466-1470) discloses a target tracking method based on correlation filters. The method requires a large tracking window to prevent a fast-moving target from leaving the window and causing tracking failure.
Content of the invention
Technical problem to be solved
To avoid the shortcomings of the prior art, the present invention proposes a target tracking method based on a UAV platform, adding background suppression and a control strategy to the correlation tracking algorithm and improving the speed of the target tracking algorithm.
Technical scheme
A target tracking method based on an unmanned aerial vehicle platform, characterized by the following steps:

Step 1, tracker initialization: in the first frame, select a rectangular region I_0 that encloses the target, with center (x_0, y_0); within radius r_1 of the target center, uniformly sample n rectangular regions of the same shape and size as I_0 to form the positive sample set A^+:

$$A^{+} = \{I_1^{+}, I_2^{+}, \ldots, I_n^{+}\}$$

The center coordinates (x_n^+, y_n^+) of each rectangular region satisfy the probability density function:

$$f(x_n^{+}, y_n^{+}) = \begin{cases} \dfrac{1}{2\pi r_1^{2}}, & (x_n^{+}-x_0)^2 + (y_n^{+}-y_0)^2 < r_1^{2} \\ 0, & (x_n^{+}-x_0)^2 + (y_n^{+}-y_0)^2 \ge r_1^{2} \end{cases}$$

Use the mean of all positive samples as the positive sample template:

$$T^{+} = \frac{1}{n}\sum_{k=1}^{n} I_k^{+}$$

Between radii r_1 and r_2 of the target center, uniformly sample m rectangular regions of the same shape and size as I_0 to form the negative sample set A^-:

$$A^{-} = \{I_1^{-}, I_2^{-}, \ldots, I_m^{-}\}$$

The center coordinates (x_n^-, y_n^-) of each rectangular region satisfy the probability density function:

$$f(x_n^{-}, y_n^{-}) = \begin{cases} \dfrac{1}{2\pi(r_2^{2}-r_1^{2})}, & r_1^{2} < (x_n^{-}-x_0)^2 + (y_n^{-}-y_0)^2 < r_2^{2} \\ 0, & \text{otherwise} \end{cases}$$

Use the mean of all negative samples as the negative sample template:

$$T^{-} = \frac{1}{m}\sum_{k=1}^{m} I_k^{-}$$

Generate the target correlation filter from the positive and negative samples:

$$H_o(u,v) = \frac{\overline{F}_o(u,v)}{|F_o(u,v)|^{2} + \dfrac{\lambda_1}{1-\lambda_1}|F_b(u,v)|^{2} + \lambda_1}$$

and the background correlation filter:

$$H_b(u,v) = \frac{\overline{F}_b(u,v)}{|F_b(u,v)|^{2} + \dfrac{1-\lambda_1}{\lambda_1}|F_o(u,v)|^{2} + 1-\lambda_1}$$

where F_o(u,v) is the Fourier transform of the positive sample template T^+, F_b(u,v) is the Fourier transform of the negative sample template T^-, and 0 < λ_1 < 1 is a tunable parameter.
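As an illustration only (not the patentees' code), the filter construction of step 1 can be sketched in NumPy. The function name `make_filters` and the toy templates are hypothetical; the two fractions follow the filter definitions stated above:

```python
import numpy as np

def make_filters(T_pos, T_neg, lam1=0.5):
    """Build target and background correlation filters from the
    positive/negative sample templates (step 1 of the method).
    lam1 is the tunable parameter with 0 < lam1 < 1."""
    Fo = np.fft.fft2(T_pos)  # Fourier transform of T+
    Fb = np.fft.fft2(T_neg)  # Fourier transform of T-
    # Target filter: conj(Fo) / (|Fo|^2 + lam/(1-lam)*|Fb|^2 + lam)
    Ho = np.conj(Fo) / (np.abs(Fo)**2
                        + lam1 / (1.0 - lam1) * np.abs(Fb)**2 + lam1)
    # Background filter: conj(Fb) / (|Fb|^2 + (1-lam)/lam*|Fo|^2 + 1-lam)
    Hb = np.conj(Fb) / (np.abs(Fb)**2
                        + (1.0 - lam1) / lam1 * np.abs(Fo)**2 + (1.0 - lam1))
    return Ho, Hb
```

Using the complex conjugate in the numerator and the regularization term λ_1 in the denominator keeps the division well defined even where a spectrum is zero.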
Step 2, background-suppressed correlation tracking of each frame obtained during tracking:

Select the tracking window I_T centered on the previous target position p_n, and apply the following windowing to reduce the effect of truncation at the window border:

$$I_n(x,y) = I_T(x,y)\,\left|\sin\!\left(\frac{\pi(x-x_n)}{h}\right)\sin\!\left(\frac{\pi(y-y_n)}{w}\right)\right|$$

where (x_n, y_n) is the center of the tracking window, h is the height of the tracking window, and w is its width.
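A minimal sketch of the windowing above, assuming the tracking window is handed over as a NumPy array and the centre is given in the window's own pixel coordinates; `taper_window` is a hypothetical helper name:

```python
import numpy as np

def taper_window(I_T, xc, yc):
    """Apply the |sin| taper of step 2 to the tracking window I_T
    to reduce truncation effects at the window border.
    (xc, yc) is the window centre; h and w follow the patent's
    convention (h = height divides the x term, w = width the y term)."""
    h, w = I_T.shape
    x = np.arange(h)[:, None]   # row index
    y = np.arange(w)[None, :]   # column index
    mask = np.abs(np.sin(np.pi * (x - xc) / h)
                  * np.sin(np.pi * (y - yc) / w))
    return I_T * mask
```

The taper is zero at the centre row and column offsets that are multiples of h and w, so it fades the image toward the border pattern the formula prescribes.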
Filter the tracking window with the target and background correlation filters to obtain:

K_o(u, v) = H_o(u, v) F_n(u, v)
K_b(u, v) = H_b(u, v) F_n(u, v)

where F_n(u, v) is the Fourier transform of the windowed image I_n(x, y);

Apply the inverse Fourier transform to K_o(u, v) to obtain I_o(x, y) and to K_b(u, v) to obtain I_b(x, y); the target position in the current frame is:

$$f(p_n) = \arg\max_{x,y}\,\bigl(|I_o(x,y)| - |I_b(x,y)|\bigr)$$
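The filtering and arg-max localization of step 2 can be sketched as follows (illustrative only; `localize` is a hypothetical helper, and `Ho`/`Hb` are filters built as in step 1):

```python
import numpy as np

def localize(I_n, Ho, Hb):
    """Step 2: filter the windowed image with the target and background
    correlation filters and take the arg-max of |I_o| - |I_b| as the
    new target position (row, col)."""
    Fn = np.fft.fft2(I_n)
    Io = np.fft.ifft2(Ho * Fn)   # target response
    Ib = np.fft.ifft2(Hb * Fn)   # background response
    resp = np.abs(Io) - np.abs(Ib)
    return np.unravel_index(np.argmax(resp), resp.shape)
```

Subtracting the background response before the arg-max is what realizes the background suppression: peaks explained by the background template no longer win.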
Step 3, correct the target position in the current frame obtained in step 2:

Apply the integral feedback strategy to obtain the feedback estimate p_fb.

Apply the feedforward strategy to obtain: p_ff = f(p_n) + p_n − p_{n−1}

Combine integral feedback with feedforward to obtain the corrected target position:

p_{n+1} = f(p_n) + λ_2 p_ff + (1 − λ_2) p_fb, where 0 ≤ λ_2 ≤ 1.
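A sketch of the step 3 correction. The formula for the integral-feedback estimate p_fb does not survive in this text, so the sketch takes p_fb as a given input; the function name and the tuple arithmetic are illustrative assumptions:

```python
def correct_position(f_pn, p_n, p_prev, p_fb, lam2=0.5):
    """Step 3 correction: combine the correlation-filter estimate
    f(p_n) with feedforward and integral-feedback terms.
    p_fb is the integral-feedback estimate (its formula is not given
    in this text, so it is supplied by the caller). 0 <= lam2 <= 1."""
    # Feedforward: extrapolate using the previous inter-frame displacement
    p_ff = tuple(f + a - b for f, a, b in zip(f_pn, p_n, p_prev))
    # p_{n+1} = f(p_n) + lam2 * p_ff + (1 - lam2) * p_fb
    return tuple(f + lam2 * ff + (1.0 - lam2) * fb
                 for f, ff, fb in zip(f_pn, p_ff, p_fb))
```

With λ_2 = 1 the correction is purely feedforward (motion extrapolation); with λ_2 = 0 it is purely integral feedback, the two cases evaluated in the embodiment.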
Beneficial effects

In the proposed UAV-platform target tracking method, positive-sample and negative-sample templates are extracted from the target region and the background region respectively, yielding a correlation filter that suppresses background interference; this effectively removes background clutter and improves tracking accuracy. To improve tracking speed, the correlation-filtering result is corrected with integral-feedback and feedforward strategies, which allows a smaller tracking window and avoids multi-step iterative computation.

The beneficial effects of the method mainly include:
(1) Step 2 of the technical scheme realizes correlation filtering that suppresses background interference, effectively removing background clutter and improving tracking accuracy.
(2) Step 3 of the technical scheme corrects the correlation-filtering result, reducing the tracking window and avoiding multi-step iteration, thereby improving tracking speed.
Brief description of the drawings
Fig. 1: tracking flow chart of the method of the invention;
Fig. 2: correlation tracking result; (a) frame 1; (b) frame 3; (c) frame 9; (d) true target position curve and tracking result curve;
Fig. 3: background-suppressed tracking result with λ_2 = 0; (a) frame 1; (b) frame 3; (c) frame 9; (d) true target position curve and tracking result curve;
Fig. 4: background-suppressed tracking result with λ_2 = 1; (a) frame 1; (b) frame 3; (c) frame 9; (d) true target position curve and tracking result curve;
Embodiment
The invention is further described below with reference to the embodiment and the drawings.
Referring to Fig. 1, the basic flow of the UAV-platform target tracking method of the invention comprises the following steps:
1) In the first frame, select a rectangular region I_0 that encloses the target, with center (x_0, y_0). In this embodiment the region is the minimum rectangle enclosing the target.
2) Within radius r_1 of the target center, uniformly sample n rectangular regions of the same shape and size as I_0 to form the positive sample set A^+; the center coordinates (x_n^+, y_n^+) of each region are uniformly distributed over the disc of radius r_1 around (x_0, y_0).
3) Use the mean of all positive samples as the positive sample template T^+. In this embodiment the radius r_1 is set to 0.9 times the diagonal length of the target rectangle, and the number of positive samples is n = 5.
4) Between radii r_1 and r_2 of the target center, uniformly sample m rectangular regions of the same shape and size as I_0 to form the negative sample set A^-; the center coordinates (x_n^-, y_n^-) of each region are uniformly distributed over the annulus between r_1 and r_2.
5) Use the mean of all negative samples as the negative sample template T^-. In this embodiment r_2 is set to the diagonal length of the target rectangle, r_3 is set to twice the diagonal length, and m = 5.
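The uniform sampling of region centres in steps 2) and 4) can be sketched with simple rejection sampling (illustrative; `sample_centers` is a hypothetical helper, and r_in = 0 recovers the disc used for positive samples):

```python
import random

def sample_centers(x0, y0, r_in, r_out, count, seed=0):
    """Uniformly sample `count` region centres over the annulus
    r_in <= d < r_out around (x0, y0) by rejection sampling from the
    bounding square; r_in = 0 gives the disc used for positive samples."""
    rng = random.Random(seed)
    centers = []
    while len(centers) < count:
        x = x0 + rng.uniform(-r_out, r_out)
        y = y0 + rng.uniform(-r_out, r_out)
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        if r_in ** 2 <= d2 < r_out ** 2:
            centers.append((x, y))
    return centers
```

Rejection sampling from the bounding square gives exactly the uniform densities over the disc and annulus stated in the probability density functions of step 1.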
6) Generate the target correlation filter H_o(u, v) and the background correlation filter H_b(u, v) from the positive and negative samples, where:
F_o(u, v) is the Fourier transform of the positive sample template T^+;
F_b(u, v) is the Fourier transform of the negative sample template T^-;
0 < λ_1 < 1 is a tunable parameter.
In this embodiment the parameter is λ_1 = 0.5.
7) Select the tracking window I_T centered on the previous target position p_n, and apply the following windowing to reduce the effect of truncation at the window border:

$$I_n(x,y) = I_T(x,y)\,\left|\sin\!\left(\frac{\pi(x-x_n)}{h}\right)\sin\!\left(\frac{\pi(y-y_n)}{w}\right)\right|$$

where:
(x_n, y_n) is the center of the tracking window;
h is the height of the tracking window;
w is the width of the tracking window.
In this embodiment the tracking window is about twice the area of the target rectangle, with:
w = 1.1 w_0
h = 1.9 h_0
where:
w_0 is the width of the target rectangle;
h_0 is the height of the target rectangle.
8) Filter the tracking window with the target and background correlation filters:
K_o(u, v) = H_o(u, v) F_n(u, v)
K_b(u, v) = H_b(u, v) F_n(u, v)
where F_n(u, v) is the Fourier transform of the windowed image I_n(x, y).
9) Apply the inverse Fourier transform to K_o(u, v) to obtain I_o(x, y) and to K_b(u, v) to obtain I_b(x, y). The target position in the current frame is:

$$f(p_n) = \arg\max_{x,y}\,\bigl(|I_o(x,y)| - |I_b(x,y)|\bigr)$$
10) During tracking, the correlation tracker computes the current-frame target position f(p_n) from the position p_n tracked in the previous frame.
Apply the integral feedback strategy to obtain the feedback estimate p_fb.
Apply the feedforward strategy to obtain:
p_ff = f(p_n) + p_n − p_{n−1}
Combine integral feedback with feedforward to obtain:
p_{n+1} = f(p_n) + λ_2 p_ff + (1 − λ_2) p_fb, 0 ≤ λ_2 ≤ 1
This embodiment evaluates the tracking results for both λ_2 = 0 and λ_2 = 1.
Referring to Figs. 2, 3 and 4, the invention realizes correlation filtering that suppresses background interference; compared with correlation tracking without background suppression, it effectively removes background clutter and improves tracking accuracy. Correcting the correlation-filtering result with the integral-feedback and feedforward strategies allows a smaller tracking window without multi-step iterative computation, improving tracking speed.
Claims (1)
1. A target tracking method based on an unmanned aerial vehicle platform, characterized by the following steps:
Step 1, tracker initialization: in the first frame, select a rectangular region I_0 that encloses the target, with center (x_0, y_0); within radius r_1 of the target center, uniformly sample n rectangular regions of the same shape and size as I_0 to form the positive sample set A^+:
$$A^{+} = \{I_1^{+}, I_2^{+}, \ldots, I_n^{+}\}$$
The center coordinates (x_n^+, y_n^+) of each rectangular region satisfy the probability density function:
$$f(x_n^{+}, y_n^{+}) = \begin{cases} \dfrac{1}{2\pi r_1^{2}}, & (x_n^{+}-x_0)^2 + (y_n^{+}-y_0)^2 < r_1^{2} \\[4pt] 0, & (x_n^{+}-x_0)^2 + (y_n^{+}-y_0)^2 \ge r_1^{2} \end{cases}$$
Use the mean of all positive samples as the positive sample template:
$$T^{+} = \frac{1}{n}\sum_{k=1}^{n} I_k^{+}$$
Between radii r_1 and r_2 of the target center, uniformly sample m rectangular regions of the same shape and size as I_0 to form the negative sample set A^-:
$$A^{-} = \{I_1^{-}, I_2^{-}, \ldots, I_m^{-}\}$$
The center coordinates (x_n^-, y_n^-) of each rectangular region satisfy the probability density function:
$$f(x_n^{-}, y_n^{-}) = \begin{cases} \dfrac{1}{2\pi(r_2^{2}-r_1^{2})}, & r_1^{2} < (x_n^{-}-x_0)^2 + (y_n^{-}-y_0)^2 < r_2^{2} \\[4pt] 0, & (x_n^{-}-x_0)^2 + (y_n^{-}-y_0)^2 \ge r_2^{2} \\[4pt] 0, & (x_n^{-}-x_0)^2 + (y_n^{-}-y_0)^2 \le r_1^{2} \end{cases}$$
Use the mean of all negative samples as the negative sample template:
$$T^{-} = \frac{1}{m}\sum_{k=1}^{m} I_k^{-}$$
Generate the target correlation filter from the positive and negative samples:
$$H_o(u,v) = \frac{\overline{F}_o(u,v)}{|F_o(u,v)|^{2} + \dfrac{\lambda_1}{1-\lambda_1}|F_b(u,v)|^{2} + \lambda_1}$$
and the background correlation filter:
$$H_b(u,v) = \frac{\overline{F}_b(u,v)}{|F_b(u,v)|^{2} + \dfrac{1-\lambda_1}{\lambda_1}|F_o(u,v)|^{2} + 1-\lambda_1}$$
where: F_o(u, v) is the Fourier transform of the positive sample template T^+; F_b(u, v) is the Fourier transform of the negative sample template T^-; the overbar denotes the complex conjugate; and 0 < λ_1 < 1 is a tunable parameter.
Step 2, background-suppressed correlation tracking of each frame obtained during tracking:
Select the tracking window I_T centered on the previous target position p_n, and apply the following windowing to reduce the effect of truncation at the window border:
$$I_n(x,y) = I_T(x,y)\,\left|\sin\!\left(\frac{\pi(x-x_n)}{h}\right)\sin\!\left(\frac{\pi(y-y_n)}{w}\right)\right|$$
where (x_n, y_n) is the center of the tracking window, h is the height of the tracking window, and w is its width;
Filter the tracking window with the target and background correlation filters to obtain:
K_o(u, v) = H_o(u, v) F_n(u, v)
K_b(u, v) = H_b(u, v) F_n(u, v)
where F_n(u, v) is the Fourier transform of the windowed image I_n(x, y);
Apply the inverse Fourier transform to K_o(u, v) to obtain I_o(x, y) and to K_b(u, v) to obtain I_b(x, y); the target position in the current frame is:
$$f(p_n) = \arg\max_{x,y}\,\bigl(|I_o(x,y)| - |I_b(x,y)|\bigr)$$
Step 3, correct the target position in the current frame obtained in step 2:
Apply the integral feedback strategy to obtain the feedback estimate p_fb.
Apply the feedforward strategy to obtain: p_ff = f(p_n) + p_n − p_{n−1}
Combine integral feedback with feedforward to obtain the corrected target position:
p_{n+1} = f(p_n) + λ_2 p_ff + (1 − λ_2) p_fb, where 0 ≤ λ_2 ≤ 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710558763.1A CN107492112B (en) | 2017-07-11 | 2017-07-11 | A kind of method for tracking target based on unmanned aerial vehicle platform |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107492112A true CN107492112A (en) | 2017-12-19 |
CN107492112B CN107492112B (en) | 2019-11-22 |
Family
ID=60644483
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710558763.1A Active CN107492112B (en) | 2017-07-11 | 2017-07-11 | A kind of method for tracking target based on unmanned aerial vehicle platform |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107492112B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109166139A (en) * | 2018-07-18 | 2019-01-08 | 天津大学 | A kind of dimension self-adaption method for tracking target that combination fast background inhibits |
CN109360223A (en) * | 2018-09-14 | 2019-02-19 | 天津大学 | A kind of method for tracking target of quick spatial regularization |
CN111161323A (en) * | 2019-12-31 | 2020-05-15 | 北京理工大学重庆创新中心 | Complex scene target tracking method and system based on correlation filtering |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106023257A (en) * | 2016-05-26 | 2016-10-12 | 南京航空航天大学 | Target tracking method based on rotor UAV platform |
CN106097393A (en) * | 2016-06-17 | 2016-11-09 | 浙江工业大学 | A kind of based on multiple dimensioned and adaptive updates method for tracking target |
Also Published As
Publication number | Publication date |
---|---|
CN107492112B (en) | 2019-11-22 |
Legal Events

Date | Code | Title
---|---|---
| PB01 | Publication
| SE01 | Entry into force of request for substantive examination
| GR01 | Patent grant
2020-09-25 | TR01 | Transfer of patent right

Transfer of patent right, effective date of registration 2020-09-25. Patentee after: Beijing Guodian Ruiyuan Technology Development Co., Ltd, 2303, 23/F, Building 1, Yard 6, Dacheng Road, Fengtai District, Beijing. Patentee before: Northwestern Polytechnical University, No. 127 Friendship West Road, Xi'an, Shaanxi 710072.