CN110322473A - Anti-occlusion target tracking method based on salient parts - Google Patents

Anti-occlusion target tracking method based on salient parts Download PDF

Info

Publication number
CN110322473A
CN110322473A CN201910612958.9A CN201910612958A CN110322473A CN 110322473 A CN110322473 A CN 110322473A CN 201910612958 A CN201910612958 A CN 201910612958A CN 110322473 A CN110322473 A CN 110322473A
Authority
CN
China
Prior art keywords
block
target
scale
tracking
saliency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910612958.9A
Other languages
Chinese (zh)
Inventor
詹昭焕
韩松臣
李炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan University
Original Assignee
Sichuan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan University filed Critical Sichuan University
Priority to CN201910612958.9A priority Critical patent/CN110322473A/en
Publication of CN110322473A publication Critical patent/CN110322473A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/215Motion-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/223Analysis of motion using block-matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an anti-occlusion target tracking method based on salient parts, characterized in that the co-saliency of the target is computed to obtain a co-saliency feature map, the target is partitioned into blocks guided by this map, and new blocks that overlap too much with robust old blocks are rejected; by setting scale gradients, the displacement change of the blocks is computed, the Euclidean distance between the block displacement change and each scale gradient is compared, and the candidate scale with the smallest distance is selected as the scale change of the target. Aimed at the problem of tracking failure under occlusion, the method uses an attention mechanism to focus tracking on the most salient blocks of the target, so that the algorithm completes the tracking task well and has strong anti-occlusion capability.

Description

Anti-occlusion target tracking method based on salient parts
Technical field
The present invention relates to the field of computer vision, and specifically to a target tracking method.
Background art
Target tracking is an indispensable part of video surveillance and plays the key role of linking the target detection task with higher-order tasks such as behavior analysis.
Target tracking continuously tracks the target given by a rectangular box in the first frame and outputs the position and size of the target in subsequent frames.
Although current target tracking algorithms have been applied in fields such as security and surveillance, difficulties such as weak target features and susceptibility to occlusion in surveillance video mean that target tracking still faces many challenges.
Target tracking algorithms are broadly divided into generative models and discriminative models. Generative methods build a model of the target region from the current frame and search the next frame for the region closest to that model. Discriminative methods train a target classifier and separate target from background in a way similar to classification. With the continuous progress of machine learning, discriminative models represented by correlation filtering and deep learning algorithms have become the mainstream of target tracking.
Most current tracking methods are based on the global appearance of the target, and such algorithms may perform poorly when the target is occluded or deformed. To solve this problem, block-based target tracking algorithms have been proposed. Block tracking relies on the principle that when the target is occluded or deformed the features of some blocks still retain target information: the blocks that keep local target information continue to be tracked, and all blocks are evaluated jointly to determine the position and size of the target.
Existing block-based algorithms have shortcomings such as coarse partitioning, block trackers that are not advanced enough, and inaccurate scale estimation.
Summary of the invention
A more targeted partitioning method is proposed: the co-saliency of the target is used to guide the partitioning. The method borrows the attention mechanism of the human eye and guarantees the feature saliency and completeness of the blocks.
A completely new block update strategy is proposed: the target is partitioned based on co-saliency, and new blocks that overlap too much with robust old blocks are rejected. This strategy balances the saliency and the partiality of the blocks.
On the one hand, a filter with stronger feature extraction ability is introduced, which improves the accuracy of block tracking; on the other hand, interference caused by saliency detection errors is suppressed, which guarantees higher accuracy and robustness with a smaller number of trackers.
Borrowing the ideas of particle filtering and correlation filtering for handling target scale change, a novel scale-change method is proposed. The method reflects the size change of the target while avoiding the overly abrupt scale changes caused by earlier partition methods.
Description of the drawings
Fig. 1 is the implementation flow chart of the present invention.
Specific embodiment
ECO-HC is used as the block tracker. Compared with the basic correlation filter, the ECO-HC algorithm reduces the dimensionality of the extracted features, merges similar samples with a Gaussian mixture model, and adjusts the filter update strategy, so that the correlation filter is more accurate. The final detection score of the target is:
S = Pf * J
Here S denotes the detection score of the target, J denotes the interpolated feature map, f denotes the filter, and P is a feature dimensionality-reduction matrix. The frequency-domain form of the sum-of-squared-errors objective after the feature mapping is given below:
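(The formula itself is not reproduced in this text; the expression below is a hedged reconstruction following the standard ECO factorized-convolution objective, where $\hat{J}$ is the Fourier-domain interpolated feature map, $\hat{y}$ the desired output, $\hat{w}$ the spatial weight, and $q'$ the filter index; these symbol choices are assumptions.)

E(f, P) = \left\| \sum_{q'} \hat{f}^{q'} \, \widehat{(P^{T} J)}^{q'} - \hat{y} \right\|_{\ell^{2}}^{2} + \sum_{q'} \left\| \hat{w} \cdot \hat{f}^{q'} \right\|_{\ell^{2}}^{2} + \beta \left\| P \right\|_{F}^{2}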
In this objective function, the first term measures the squared 2-norm error between the filter response on the interpolated, projected feature map and the desired output, the second term is the spatial constraint on the filters, and the Frobenius norm of P weighted by the regularization coefficient β constrains P; q' denotes the filter index. The objective can be solved with the Gauss-Newton and conjugate gradient methods. In traditional correlation filtering, every sample is added to the sample space, which may cause the filter to overfit. ECO-HC merges similar samples with a Gaussian mixture model, which improves the robustness of the filter.
In current block-based tracking algorithms the division into blocks is often rather casual, which leads to loss of block semantics and excessive background interference. To solve this problem, an intuitive idea is to let the most attractive parts of the target guide the partitioning, so as to improve the representativeness of the blocks. Based on research on human visual attention in computer vision, blocks are preferentially placed on the parts of the target that most attract the human eye, i.e., the salient regions of the target. Target saliency follows the principle of visual stimulation within a single image, which can identify a salient target against the background. For an input image I, the pixels in the image are first divided into K' clusters using the k-means clustering method, then the contrast cue and the spatial cue of each cluster are computed, and finally the two cues are fused into a saliency feature map. The contrast cue, as a distinctive conventional way of measuring visual features, is widely applied in single-image saliency detection. It measures the difference between features and can be expressed by the following formula:
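(The formula is missing here; the expression below is a hedged reconstruction based on the standard cluster-based contrast cue, with the symbols defined in the next sentence.)

\omega_{c}(k) = \sum_{i=1,\, i \neq k}^{K'} \frac{n_{i}}{N'} \left\| u_{k} - u_{i} \right\|_{2}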
Here u_k denotes the center of the k-th cluster, the L2 norm is used to compute the distance between features in feature space, n_i denotes the number of pixels of the i-th cluster, and N' denotes the total number of pixels in the image. Larger clusters have a larger influence on the weight.
The advantage of the contrast cue is that it assigns higher saliency scores to sparse clusters, but it works poorly when it encounters complex background interference, so the spatial cue is used to handle the interference of complex backgrounds. Similar to the cosine-window operation in correlation filtering, the spatial cue is based on the assumption that the image-center region is more salient than other regions, and suppresses the saliency of regions far from the image center. The spatial cue can be expressed by the following formula:
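(The formula is again missing; the expression below is a hedged reconstruction of the standard spatial cue, consistent with the description that follows.)

\omega_{s}(k) = \frac{1}{n_{k}} \sum_{i=1}^{N'} G\!\left( \left\| x'_{i} - o' \right\|_{2}^{2} \mid 0, \sigma^{2} \right) \delta\!\left[ b'(t_{i}) = C'_{k} \right]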
Here the normalization coefficient n_k denotes the number of pixels of the k-th cluster, G(·) denotes a Gaussian kernel, x'_i denotes a pixel, o' denotes the image center, and the variance σ² denotes the normalized radius of the image. δ(·) denotes the Kronecker delta, b'(t_i) denotes the cluster index of pixel t_i, and C'_k denotes the k-th cluster.
The two cues are fused by a multiplication operation. The saliency probability of each cluster is first computed with the following formula, which gives the cluster-level saliency value:
pr(C'_{k}) = \omega_{c}(k) \cdot \omega_{s}(k)
To obtain pixel-level saliency values, the pixels must be connected with the clusters. For the pixels in each cluster, the saliency likelihood follows a Gaussian distribution:
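(Hedged reconstruction of that likelihood, assuming a Gaussian on the distance between the pixel feature and the cluster center.)

p\!\left( x' \mid C'_{k} \right) \propto \exp\!\left( - \frac{ \left\| v'_{x'} - u_{k} \right\|_{2}^{2} }{ 2 \sigma_{k}^{2} } \right)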
Here v'_{x'} represents the feature vector of pixel x', and σ_k represents the variance of cluster C'_k. The final pixel-level feature map can be expressed as the sum of the saliency values over all clusters:
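(Hedged reconstruction, combining the cluster-level saliency with the pixel likelihood above.)

p(x') = \sum_{k=1}^{K'} pr\!\left( C'_{k} \right) \, p\!\left( x' \mid C'_{k} \right)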
Computing the above formula for each pixel in the target region yields the saliency distribution of the target region, i.e., the saliency feature map. The cluster-based saliency detection method not only computes pixel-level saliency values, but also performs, to a certain extent, a segmentation of the different regions of the target, which makes its guidance of the partitioning more reasonable.
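The following Python sketch illustrates the cluster-based saliency pipeline just described (k-means clustering, contrast cue, spatial cue, multiplicative fusion, pixel-level map). It is a minimal sketch assuming an RGB target patch as input; the number of clusters, the Gaussian radius and the normalizations are illustrative assumptions, not the patented settings.

```python
import numpy as np
from sklearn.cluster import KMeans

def saliency_map(patch, n_clusters=6):
    """Cluster-based saliency of a target patch (H x W x 3 array)."""
    h, w, _ = patch.shape
    feats = patch.reshape(-1, 3).astype(np.float64)            # per-pixel color features
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(feats)
    labels, centers = km.labels_, km.cluster_centers_
    n_k = np.bincount(labels, minlength=n_clusters)            # pixels per cluster
    n_total = feats.shape[0]

    # Contrast cue: distance of each cluster center to the others, weighted by cluster size.
    w_c = np.array([sum(n_k[j] / n_total * np.linalg.norm(centers[k] - centers[j])
                        for j in range(n_clusters) if j != k)
                    for k in range(n_clusters)])

    # Spatial cue: clusters concentrated near the patch center score higher (Gaussian falloff).
    ys, xs = np.mgrid[0:h, 0:w]
    d2 = ((xs - w / 2.0) ** 2 + (ys - h / 2.0) ** 2).reshape(-1)
    sigma2 = (0.5 * max(h, w)) ** 2
    g = np.exp(-d2 / (2.0 * sigma2))
    w_s = np.array([g[labels == k].mean() for k in range(n_clusters)])

    # Cluster-level saliency: multiplicative fusion of the two cues.
    pr = w_c * w_s
    pr = pr / (pr.max() + 1e-12)

    # Pixel-level saliency: Gaussian likelihood of each pixel under each cluster,
    # weighted by the cluster-level saliency and summed over clusters.
    sal = np.zeros(n_total)
    for k in range(n_clusters):
        var_k = feats[labels == k].var() + 1e-6
        lik = np.exp(-np.sum((feats - centers[k]) ** 2, axis=1) / (2.0 * var_k))
        sal += pr[k] * lik
    sal = sal / (sal.max() + 1e-12)
    return sal.reshape(h, w)
```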
After the saliency distribution of the target is obtained, the partitioning operation is carried out on the target; here a block refers to a rectangular box. The specific block size is determined by the target, and is set here to 0.7 times the size of the target. Following the principle of traversing the whole target region with a sliding window, rectangular boxes cover the entire target region at a fixed step size. The i-th rectangle in the target region is described by the pixels it contains, where m ∈ {1, 2, ..., w_i × h_i} indexes the pixels, w_i and h_i denote the width and height of the rectangle, and s'_i denotes the saliency score of the rectangle, which is defined by the following formula:
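(Hedged reconstruction of the block score, writing $x_{i}^{m}$ for the m-th pixel of the i-th rectangle and assuming the score is the average pixel saliency inside the rectangle; whether it is averaged or summed is not stated here.)

s'_{i} = \frac{1}{w_{i} \times h_{i}} \sum_{m=1}^{w_{i} \times h_{i}} p\!\left( x_{i}^{m} \right)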
Then, similar to the candidate-box elimination mechanism in Faster R-CNN, non-maximum suppression is used to select among the rectangles: the rectangles are first sorted by saliency score, then the intersection-over-union between rectangles is computed according to the sorted order, and rectangles whose intersection-over-union exceeds a certain threshold are rejected. The intersection-over-union θ can be expressed by the following formula:
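(The intersection-over-union of two rectangles A and B is the standard ratio of overlap area to union area.)

\theta = \frac{\left| A \cap B \right|}{\left| A \cup B \right|}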
Although the rectangles are initialized over the whole image patch, only the target region is partitioned, so the cost of computing the saliency scores of the rectangles in the target region is small. Meanwhile, the non-maximum suppression operation eliminates many redundant rectangles, so that the blocks preferentially cover regions of strong saliency while still covering the whole target region broadly, striking a balance between emphasis and uniformity. This flexible partition strategy breaks the traditional convention of fixed block positions and fixed block counts, saves computation, and makes the partitioning more principled.
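The following Python sketch illustrates the sliding-window partitioning and non-maximum suppression just described, assuming a saliency map of the target region as input; the step size, the IoU threshold and the cap on the number of kept blocks are illustrative assumptions.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    iw = max(0.0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    ih = max(0.0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def salient_blocks(sal_map, block_scale=0.7, step=4, iou_thresh=0.5, max_blocks=5):
    """Slide a block-sized window over the saliency map, score each window by its
    mean saliency, and keep high-scoring, weakly overlapping windows."""
    h, w = sal_map.shape
    bw, bh = max(1, int(w * block_scale)), max(1, int(h * block_scale))
    boxes, scores = [], []
    for y in range(0, h - bh + 1, step):
        for x in range(0, w - bw + 1, step):
            boxes.append((x, y, bw, bh))
            scores.append(sal_map[y:y + bh, x:x + bw].mean())   # saliency score s'_i
    keep = []
    for idx in np.argsort(scores)[::-1]:                        # highest score first
        if all(iou(boxes[idx], boxes[k]) <= iou_thresh for k in keep):
            keep.append(idx)
        if len(keep) == max_blocks:
            break
    return [boxes[k] for k in keep]
```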
Let the center coordinate of the target in frame t be μ_c^t, the center coordinate of the i-th block be μ_i^t, and the total number of blocks be l'. The objective function of the scale change is:
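(The objective is not reproduced in this text; the expression below is a hedged reconstruction of the verbal description: among the candidate scale factors $\lambda^{v_{r}}$, choose the one whose scaling of the previous block-to-center offsets is closest, in Euclidean distance, to the current offsets.)

\hat{v}_{r} = \arg\min_{v_{r}} \sum_{i=1}^{l'} \left\| \left( \mu_{i}^{t} - \mu_{c}^{t} \right) - \lambda^{v_{r}} \left( \mu_{i}^{t-1} - \mu_{c}^{t-1} \right) \right\|_{2}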
Here the constant λ denotes the base of the gradient and is set to 1.02; v_r ∈ {-16, -15, ..., 16} denotes the gradient index. Minimizing the objective function yields the candidate scale. To suppress the influence of block tracking errors on the scale change, a judgment factor η ∈ {0, 1} is introduced:
When the intersection-over-union θ between the total area of the blocks and the target area is less than the threshold 0.3, η = 0; otherwise η = 1. Let S_t denote the scale of frame t; then the final scale change is given by the following formula:
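(Hedged reconstruction of the update: the selected candidate scale factor is applied only when the judgment factor allows it.)

S_{t+1} = (1 - \eta)\, S_{t} + \eta\, \lambda^{\hat{v}_{r}}\, S_{t}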
The scale-change method of the invention lies between particle filtering and correlation filtering: on the one hand, the particle-filter idea of scale change reflects the relationship between the scale change of the blocks and that of the whole target; on the other hand, the correlation-filter style of scale change suppresses abrupt scale variation. Considering that when the blocks are unevenly distributed over the target their motion trend cannot accurately reflect the overall scale-change trend of the target, the overlap between the blocks and the target is introduced into the scale change: when the overlap is too low, the block tracking error is considered large and no scale change is performed, which avoids the accumulation of errors.
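The following Python sketch illustrates this scale-change step under the reconstruction given above: among the candidate scales λ^v, the one that best explains the change of the block-to-center offsets is chosen, and the update is skipped when the block coverage of the target is too low. The function name and the coverage_iou input are illustrative assumptions.

```python
import numpy as np

def update_scale(prev_block_centers, cur_block_centers, prev_target_center,
                 cur_target_center, prev_scale, coverage_iou,
                 lam=1.02, v_range=range(-16, 17), coverage_thresh=0.3):
    """Pick the candidate scale lam**v whose scaling of last frame's block-to-center
    offsets best matches the current offsets; skip the update when the blocks no
    longer cover the target well (judgment factor eta)."""
    prev_off = np.asarray(prev_block_centers, float) - np.asarray(prev_target_center, float)
    cur_off = np.asarray(cur_block_centers, float) - np.asarray(cur_target_center, float)
    costs = {v: np.linalg.norm(cur_off - (lam ** v) * prev_off, axis=1).sum()
             for v in v_range}
    v_best = min(costs, key=costs.get)                 # candidate scale index
    eta = 1 if coverage_iou >= coverage_thresh else 0  # suppress update if blocks drifted
    return prev_scale * (lam ** v_best) if eta else prev_scale

# Example: two blocks that moved about 2% farther from the center suggest a scale-up.
prev_c = [(10.0, 0.0), (0.0, 10.0)]
cur_c = [(10.2, 0.0), (0.0, 10.2)]
print(update_scale(prev_c, cur_c, (0.0, 0.0), (0.0, 0.0), 1.0, coverage_iou=0.6))  # ~1.02
```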
On the basis of the partition strategy and the scale-change method presented above, an algorithm flow is proposed to complete the automatic iteration of the whole tracking process. The tracker model M is composed of multiple block tracking models H_i. The center coordinate of the target is determined by Hough voting, but the blocks are not classified as positive or negative: because the blocks are not generated randomly under a Monte Carlo framework but are generated in the salient regions of the target, all blocks can be regarded as positive samples. The center coordinate of the target is determined from the block center coordinates by Hough voting:
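(Hedged reconstruction of the vote: the target center is the confidence-weighted average of the block centers, with the weights $\omega'_{i}$ normalized over the $l'$ blocks.)

\mu_{c}^{t} = \sum_{i=1}^{l'} \omega'_{i}\, \mu_{i}^{t}, \qquad \omega'_{i} = \frac{\mathrm{PSR}_{i}}{\sum_{j=1}^{l'} \mathrm{PSR}_{j}}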
Here the weight ω' is determined by the confidence of each block; specifically, the normalized peak-to-sidelobe ratio (PSR) of each block is used as its voting weight. It can be seen from the formula that blocks with large responses have a larger influence on the assessment of the target state, corresponding to the principle in block tracking that different parts have different importance. Finally, the state of the target at the current moment, i.e., its center position together with its scale, is obtained.
As the blocks iterate, the tracking error accumulates; to reduce the error, some of the blocks need to be resampled. Unlike the traditional methods that resample all blocks or resample a random subset of blocks, the present invention resamples blocks based on the saliency of the target. The influence of the overlap between blocks is further considered, to prevent duplicated blocks from shifting the target center. The criteria for judging which blocks need resampling can be summarized in the following three points (a sketch of this decision logic follows the list):
(α) Low confidence. Blocks whose confidence is lower than a certain threshold are deleted and a new block is resampled. The block confidence of the invention uses PSR as the evaluation criterion, so the confidence also represents the response of the tracker. A low response means that the current tracking result of the tracker has low confidence and a tracking error has most likely occurred, so low-confidence blocks are reset first;
(β) Far from the target center. Over time, some blocks with large tracking errors gradually move away from the target center; for example, a block containing more background information may stay in the background region of the previous frame. If blocks far from the target center are not resampled, some high-confidence blocks far from the target center may seriously affect the determination of the target center. A block whose center coordinate lies outside 1.5 times the region of the target size is regarded as a block far from the target center;
(γ) New blocks with excessive overlap with existing blocks. Since the partition strategy of the invention partitions based on the saliency of the target, the resampling strategy still follows this strategy. However, during resampling, if a newly added block overlaps too much with existing blocks, the blocks may cluster together, which affects the accuracy of tracking. When a new block overlaps an existing block by more than 50%, the new block is not added.
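The following Python sketch illustrates the three resampling criteria, assuming each block's confidence is its PSR. The 1.5x distance factor and the 50% overlap threshold follow the text; the PSR threshold and the per-axis distance check are illustrative assumptions.

```python
import numpy as np

def _overlap(a, b):
    """Intersection-over-union of boxes (x, y, w, h), as in the block-selection sketch."""
    iw = max(0.0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    ih = max(0.0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def needs_resampling(block_center, block_psr, target_center, target_size,
                     psr_thresh=5.0, dist_factor=1.5):
    # (alpha) low confidence: PSR below a threshold.
    if block_psr < psr_thresh:
        return True
    # (beta) far from the target center: interpreted here as lying outside
    # 1.5x the target half-extent along either axis (an assumption).
    offset = np.abs(np.asarray(block_center, float) - np.asarray(target_center, float))
    limit = dist_factor * np.asarray(target_size, float) / 2.0
    return bool(np.any(offset > limit))

def accept_new_block(new_box, existing_boxes, overlap_thresh=0.5):
    # (gamma) reject a resampled block that overlaps any existing block by more than 50%.
    return all(_overlap(new_box, b) <= overlap_thresh for b in existing_boxes)
```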

Claims (4)

1. An anti-occlusion target tracking method based on salient parts, characterized in that the detailed process is:
(1) computing the co-saliency of the target to guide the partitioning of the target;
(2) partitioning the target into blocks based on the co-saliency, and rejecting new blocks that overlap too much with robust old blocks;
(3) setting scale gradients, computing the displacement change of the blocks, comparing the Euclidean distance between the block displacement change and the scale gradients, and selecting the candidate scale with the smallest distance as the scale change of the target.
2. The anti-occlusion target tracking method based on salient parts according to claim 1, characterized in that the detailed process of step (1) is:
For the input image, the pixels in the image are first clustered using the k-means clustering method, then the contrast cue and the spatial cue of each cluster are computed, and finally the two cues are multiplied to form the saliency feature map.
3. The anti-occlusion target tracking method based on salient parts according to claim 1, characterized in that the detailed process of step (2) is:
The block size is determined as a local size of the target and represented as a rectangular box; the saliency score of each rectangle is computed, and non-maximum suppression is used to select rectangles: the rectangles are first sorted by saliency score, then the intersection-over-union between rectangles is computed according to the sorted order, and rectangles whose intersection-over-union exceeds a certain threshold are rejected.
4. The anti-occlusion target tracking method based on salient parts according to claim 1, characterized in that the detailed process of step (3) is:
A candidate scale is obtained by minimizing the scale-change objective function; to suppress the influence of block tracking errors on the scale change, a judgment factor is introduced: when the intersection-over-union between the total area of the blocks and the target area is smaller than a certain threshold, the scale is not changed; otherwise the scale is adjusted. Considering that when the blocks are unevenly distributed over the target their motion trend cannot accurately reflect the overall scale-change trend of the target, the overlap between the blocks and the target is introduced into the scale change: when the overlap is below a certain threshold the block tracking error is considered large and no scale change is performed.
CN201910612958.9A 2019-07-09 2019-07-09 Anti-occlusion target tracking method based on salient parts Pending CN110322473A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910612958.9A CN110322473A (en) 2019-07-09 2019-07-09 Anti-occlusion target tracking method based on salient parts

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910612958.9A CN110322473A (en) 2019-07-09 2019-07-09 Anti-occlusion target tracking method based on salient parts

Publications (1)

Publication Number Publication Date
CN110322473A true CN110322473A (en) 2019-10-11

Family

ID=68123065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910612958.9A Pending CN110322473A (en) 2019-07-09 2019-07-09 Anti-occlusion target tracking method based on salient parts

Country Status (1)

Country Link
CN (1) CN110322473A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130287248A1 (en) * 2012-04-26 2013-10-31 General Electric Company Real-time video tracking system
CN106898015A (en) * 2017-01-17 2017-06-27 华中科技大学 A kind of multi thread visual tracking method based on the screening of self adaptation sub-block
CN109002750A (en) * 2017-12-11 2018-12-14 罗普特(厦门)科技集团有限公司 A kind of correlation filtering tracking based on conspicuousness detection and image segmentation
CN109658440A (en) * 2018-11-30 2019-04-19 华南理工大学 A kind of method for tracking target based on target significant characteristics

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SONG-CHEN HAN ET AL.: "Recurrently exploiting co-saliency of target for part-based visual tracking", EURASIP Journal on Advances in Signal Processing *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112528966A (en) * 2021-02-05 2021-03-19 华东交通大学 Intelligent monitoring and identifying method, device and medium for peripheral environment of payee
CN114167468A (en) * 2021-12-14 2022-03-11 四川大学 Target space positioning method based on image and GNSS

Similar Documents

Publication Publication Date Title
CN112926410B (en) Target tracking method, device, storage medium and intelligent video system
JP7208480B2 (en) Learning program, detection program, learning device, detection device, learning method and detection method
CN111696128B (en) High-speed multi-target detection tracking and target image optimization method and storage medium
CN108416250B (en) People counting method and device
CN108470354A (en) Video target tracking method, device and realization device
Shen et al. Probabilistic multiple cue integration for particle filter based tracking
JP4849464B2 (en) Computerized method of tracking objects in a frame sequence
CN109816701A (en) A kind of method for tracking target and device, storage medium
CN109886998A (en) Multi-object tracking method, device, computer installation and computer storage medium
CN107784663A (en) Correlation filtering tracking and device based on depth information
CN111582062B (en) Re-detection method in target tracking based on YOLOv3
CN108921873A (en) The online multi-object tracking method of Markovian decision of filtering optimization is closed based on nuclear phase
CN106408594A (en) Video multi-target tracking method based on multi-Bernoulli characteristic covariance
CN101814149A (en) Self-adaptive cascade classifier training method based on online learning
US20110243398A1 (en) Pattern recognition apparatus and pattern recognition method that reduce effects on recognition accuracy, and storage medium
CN110532921A (en) The more Bernoulli Jacob's video multi-target trackings of broad sense label are detected based on SSD
CN110322473A (en) Anti-occlusion target tracking method based on salient parts
JP2009187397A (en) Image processor and image processing program
CN110349188A (en) Multi-object tracking method, device and storage medium based on TSK fuzzy model
CN116645396A (en) Track determination method, track determination device, computer-readable storage medium and electronic device
CN110147768B (en) Target tracking method and device
US20220398400A1 (en) Methods and apparatuses for determining object classification
CN103971362A (en) Synthetic aperture radar (SAR) imagine change detection based on histogram and elite genetic clustering algorithm
Karavasilis et al. Visual tracking using spatially weighted likelihood of Gaussian mixtures
CN113096157A (en) Reliable local target tracking method and tracker

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20191011