CN110717934A - Anti-occlusion target tracking method based on STRCF - Google Patents

Anti-occlusion target tracking method based on STRCF

Info

Publication number
CN110717934A
CN110717934A (application CN201910988383.0A)
Authority
CN
China
Prior art keywords
target
occlusion
area
strcf
correlation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910988383.0A
Other languages
Chinese (zh)
Other versions
CN110717934B (en)
Inventor
张汗灵
谢悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University filed Critical Hunan University
Priority to CN201910988383.0A priority Critical patent/CN110717934B/en
Publication of CN110717934A publication Critical patent/CN110717934A/en
Application granted granted Critical
Publication of CN110717934B publication Critical patent/CN110717934B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/277Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/223Analysis of motion using block-matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details
    • G06T2207/20032Median filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an anti-occlusion target tracking method based on STRCF (spatial-temporal regularized correlation filters). According to the constructed quality evaluation function and occlusion decision function, whether an object is occluded can be accurately judged; when the object is not occluded, the STRCF is used to locate the target and update it normally. When occlusion occurs, block matching is used to judge whether the object is partially or completely occluded. The STRCF achieves good localization under slight partial occlusion, so in that case the STRCF is used as the base filter for target localization, but filter updating and scale updating are suspended, avoiding template pollution. When the object is completely occluded, position estimation is performed with a Kalman filter based on historical position information, a secondary detection area is delimited, and accurate localization is performed in the secondary detection area through feature matching.

Description

Anti-occlusion target tracking method based on STRCF
Technical Field
The invention relates to the technical field of target tracking in computer vision, and in particular to an anti-occlusion target tracking method based on STRCF (spatial-temporal regularized correlation filters).
Background
Visual target tracking is an important research branch in the field of computer vision. Although target tracking has been widely studied in recent years, its research intensity still trails basic visual tasks such as object detection and semantic segmentation, owing to its high difficulty and the scarcity of high-quality data. Nevertheless, with the development of deep learning and correlation filtering and the growth of computing power, the performance of visual algorithms has improved dramatically, making target tracking highly useful in video surveillance, virtual reality, human-computer interaction, medical diagnosis, autonomous driving, and other fields. In general, target tracking faces several difficulties: object deformation, illumination change, fast motion, background clutter, occlusion, and other irregular changes are the hard problems of target tracking and must be solved before the technology can be applied in industry.
Occlusion is a typical problem in target tracking: a visual target may be occluded by a stationary object or by several moving objects, or may become self-occluded through deformation. By extent, occlusion can be divided into partial occlusion and complete occlusion. The occlusion process falls mainly into three stages: first, the visual target enters the occluded state, partial occlusion occurs, and key information is gradually lost; second, the target is covered by the background or the occluder, complete occlusion occurs, and the target information is entirely lost; third, the object gradually leaves the occlusion, and the target information is gradually recovered.
To address tracking failure caused by occlusion, the literature (Visual object tracking using adaptive correlation filters [C]. CVPR 2010, San Francisco, CA, USA, 13-18 June 2010. IEEE, 2010) proposed the tracking-failure detection criterion PSR for judging whether an object is occluded, but its practical effect is poor. The tracking-failure criterion proposed in the literature (A Fully-Correlational Long-Term Tracker [J]. 2017) is PSR multiplied by the maximum response value. The literature (Switching Kalman Filter-Based Approach for Tracking and Event Detection at Traffic Intersections. IEEE, 2005) uses a Kalman filter to predict the position of an occluded object, but this method works poorly when the object moves irregularly or the occlusion lasts too long. A correlation-based layered matching and target tracking method (2006, 27(4): 670-) has also been proposed. The literature (Real-time part-based visual tracking via adaptive correlation filters, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015, pp. 4902-4912) divides the target into blocks and uses the unoccluded blocks for effective tracking; for the case where the target is completely occluded, no existing method fully solves the problem. The literature (Occlusion-aware fragment-based tracking with spatial-temporal consistency, J. IEEE Trans. Image Process. 25(8) (2016) 3814-3825) proposes filters based on CSR-DCF, using a quality assessment function to decide whether the object is occluded and applying different filter update strategies, with an occlusion tracker used during occlusion; however, it still drifts to the background when the target is completely occluded. The literature (Effective Occlusion Handling for Fast Correlation Filter-based trackers [J]. 2018) uses an additional correlation filter to locate the occluded object; the method suits partial occlusion but applies poorly to severe and long-term occlusion and adds computational cost. The correlation-filter-based tracking proposed in the literature (Learning Spatial-Temporal Regularized Correlation Filters for Visual Tracking [J]. 2018) and the tracking algorithm combining CNNs and CF proposed in the literature (Robust visual tracking based on spatial network and kernelized correlation filters [J]. Journal of Electronic Imaging, 2018, 27(2):1) achieve good performance, but neither solves the occlusion problem well, and they restrict scale change instead.
Target tracking based on DCF has attracted great attention for its high computational efficiency and good robustness. The STRCF proposed in the literature (Learning spatial-temporal regularized correlation filters for visual tracking. 2018: 4904-4913) introduces spatial and temporal regularization terms on top of DCF, giving better robustness, higher accuracy and higher speed under large appearance changes; however, under long-term occlusion, STRCF still drifts to the background.
Disclosure of Invention
The technical problem the invention aims to solve is that existing correlation filters either cannot handle target occlusion well, or only handle partial and short-term occlusion effectively. The invention provides an anti-occlusion target tracking method based on STRCF: on top of the STRCF tracker, an occlusion handling strategy is introduced; a quality evaluation function based on PSR (peak-to-sidelobe ratio) judges whether the target is occluded; when the target is not occluded, the target is located with the STRCF; when occlusion occurs, the occlusion state of the target is judged with the idea of block matching, and different tracking and updating strategies are adopted for partial occlusion and complete occlusion. For video target tracking with long occlusion, the method attains higher tracking speed, higher tracking accuracy and lower computational complexity.
The invention is realized by the following steps: an anti-occlusion target tracking method based on STRCF comprises the following steps:
step 1, reading an image sequence, framing a target to be tracked in an initial frame image, determining the center coordinate and the size of the target, initializing a tracker, forming an initial target template according to a target area, and initializing a Kalman filter;
Step 2, calculating an occlusion decision function value s_t;
Step 3, comparing the occlusion decision function value s_t obtained in step 2 with a preset threshold δ; if it is greater than δ, judging the degree of occlusion; otherwise, acquiring the maximum correlation response value with the STRCF tracker, outputting the target center coordinate and scale corresponding to the maximum correlation response value, and performing steps 8 and 9;
Step 4, if the occlusion decision function value obtained in step 3 is greater than the threshold, determining the degree of occlusion with a block matching algorithm and calculating the block matching number Num;
Step 5, if the block matching number Num calculated in step 4 is greater than a threshold ξ, judging that the target to be tracked is in a partial occlusion state; otherwise, it is in a complete occlusion state;
Step 6, if step 5 judges the target to be tracked to be partially occluded, acquiring the maximum correlation response value with the STRCF tracker, outputting the target center coordinate and scale corresponding to the maximum correlation response value, and performing steps 8 and 9; otherwise performing step 7;
Step 7, if step 5 judges the target to be tracked to be completely occluded, estimating the position of the target with a Kalman filter, delimiting a secondary re-detection area around the estimated position, and performing accurate localization within that area through feature matching;
Step 8, updating the target scale;
Step 9, updating the correlation filter.
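The per-frame control flow of steps 2 to 9 can be sketched as follows (a minimal sketch in Python; the `state` object and all of its helper methods are hypothetical placeholders standing in for the STRCF tracker, block matcher, and Kalman filter, while the thresholds δ = 1.55 and ξ = 3 are the values given in the disclosure; following Example 1, filter and scale updates are suspended under partial occlusion):

```python
DELTA = 1.55  # occlusion decision threshold (step 3)
XI = 3        # block matching threshold (step 5)

def track_frame(frame, state):
    """One iteration of the per-frame decision flow (steps 2-9).
    `state` bundles the STRCF tracker, Kalman filter, and last target box;
    every helper method is an illustrative placeholder."""
    s_t = state.occlusion_score(frame)          # step 2
    if s_t <= DELTA:                            # step 3: not occluded
        box = state.strcf_locate(frame)
        state.update_scale(frame, box)          # step 8
        state.update_filter(frame, box)         # step 9
        return box
    num = state.block_match_count(frame)        # step 4
    if num > XI:                                # step 5: partial occlusion
        return state.strcf_locate(frame)        # step 6: locate, no updates
    center = state.kalman_predict()             # step 7: complete occlusion
    return state.redetect_around(frame, center)
```

Only the branch structure is meant to carry over from the text; each helper corresponds to one of the numbered steps above.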
As a further aspect of the present invention, in step 2, the occlusion decision function value s_t is calculated by formulas (1)-(3) (formula images not reproduced), wherein formula (2) is a quality evaluation function, CRM is the correlation filter response map (obtained by correlating the correlation filter with the features of the candidate image), μ and σ are respectively the mean and the standard deviation of the sidelobe region, the sidelobe region being the 11×11 region centered on the maximum peak response point (excluding the maximum peak response point itself); z is an offset coefficient, set to 0.1.
As a further aspect of the present invention, in step 3, the threshold δ is set to 1.55.
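The sidelobe statistics described above can be sketched as follows (a minimal sketch: it computes a conventional peak-to-sidelobe ratio over the 11×11 window with the peak point excluded, as the text specifies; the exact closed forms of formulas (1)-(3), which are image-rendered in the original, are not reproduced here):

```python
import numpy as np

def psr(crm, win=11):
    """Peak-to-sidelobe ratio of a correlation response map.
    Sidelobe = win x win region around the peak, excluding the peak itself."""
    py, px = np.unravel_index(np.argmax(crm), crm.shape)
    h = win // 2
    y0, y1 = max(0, py - h), min(crm.shape[0], py + h + 1)
    x0, x1 = max(0, px - h), min(crm.shape[1], px + h + 1)
    side = crm[y0:y1, x0:x1].astype(float).copy()
    side[py - y0, px - x0] = np.nan        # exclude the maximum peak response point
    mu = np.nanmean(side)
    sigma = np.nanstd(side)
    return (crm[py, px] - mu) / (sigma + 1e-12)
```

A sharp, isolated response peak yields a high PSR; a flat or multi-modal response (as under occlusion) yields a low one, which is the quantity the decision function thresholds.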
As a further scheme of the present invention, in step 4, the block matching number Num is calculated as follows (formula image not reproduced): Num represents the number of candidate target blocks in the current frame with a high matching degree to the target of the previous frame; hist is a function comparing the cosine similarity of two blocks using color histogram information; the image blocks are uniformly partitioned into four; B_t is the set of four candidate image blocks in the current frame, and B_{t-1} is the set of four target blocks in the previous frame.
As a further aspect of the present invention, in step 5, the threshold ξ is set to 3.
As a further scheme of the present invention, in step 7, the secondary review area is centered on the position estimated by Kalman filtering, with length equal to n_1+n_2+1 times the height of the target in the previous frame and width equal to m_1+m_2+m_3 times the width of the target in the previous frame. The calculation formulas for m_1, m_2, n_1, n_2 are rendered as images in the original and are not reproduced; therein, HEIGHT and WIDE are the size of the input image and (t_w, t_h) are the width and height of the target in the previous frame.
In summary, compared with the prior art, the invention has the following advantages:
After evaluation on the OTB-2015, VOT-2016 and Temple-Color-128 datasets, the method shows lower computational complexity and higher accuracy under long-term occlusion; the mean overlap precision and expected average overlap are improved by 4% and 4.5% respectively over the STRCF tracker. In addition, the tracking speed is improved by 8.8%, and the method also performs well under out-of-view motion, in-plane rotation and out-of-plane rotation.
Drawings
FIG. 1 is a flow chart of the implementation of the method of the present invention;
FIG. 2 is a response diagram of an embodiment with the target in the unoccluded, partially occluded, and completely occluded states.
FIG. 3 is a diagram of an occlusion determination function in an embodiment.
Fig. 4 is a schematic diagram of secondary detection in an embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, rather than all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Example 1
As shown in fig. 1, an anti-occlusion target tracking method based on STRCF includes the following steps:
1) Input picture frame: acquire a picture frame from the original video and preprocess it.
2) Select the target to be tracked: select the target to be tracked in the first frame image and acquire its position and scale information.
3) Occlusion decision: judge whether the target is occluded through the occlusion decision function.
4) Occlusion degree decision: judge the target occlusion state (partial occlusion or complete occlusion) through block matching.
5) Occlusion handling: under partial occlusion, the STRCF tracker is used, but scale updating and STRCF tracker updating are not performed; under complete occlusion, a Kalman filter performs position estimation, a secondary re-detection area is delimited, and secondary accurate localization is performed.
Further, in step 1), the preprocessing includes obtaining pictures by frame extraction from the video and performing filtering and denoising on the pictures.
In the filtering and denoising process, the pixel value of the central point is replaced by the median value of its neighborhood, removing isolated points with pixel value 0 or 255. The principle is:
F(x, y) = Med(P(x, y))
where P(x, y) is the set of pixel points in the N×N template neighborhood of the point (x, y), and Med() is the median function: the gray values of the pixel points in the neighborhood centered on (x, y) are sorted, and the gray value at the middle position is output as F(x, y), the new pixel value after median filtering at (x, y).
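The median filtering principle F(x, y) = Med(P(x, y)) can be sketched as follows (plain NumPy; the 3×3 default window and edge replication at the borders are assumptions, since the text leaves N and the border handling unspecified):

```python
import numpy as np

def median_filter(img, n=3):
    """Replace each pixel by the median of its n x n neighborhood,
    suppressing isolated salt-and-pepper pixels (values 0 or 255)."""
    h = n // 2
    padded = np.pad(img, h, mode="edge")       # replicate border pixels
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(padded[y:y + n, x:x + n])
    return out
```

An isolated outlier pixel is outvoted by its neighbors, so it is replaced by the local median while uniform regions pass through unchanged.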
Further, in step 2), selecting the target area to be tracked includes delimiting the target to be tracked and acquiring its coordinate information (X, Y, W, H), where (X, Y) are the center coordinates of the target and (W, H) are its width and height.
Further, in step 3), occlusion determination is performed using the occlusion decision function proposed by the invention.
A search area of the current frame is established, centered at the same position as the target frame of the previous frame, with an area 1.8 times that of the target frame. PSR_t is computed (formula image not reproduced); the feature vector of the target is extracted and weighted by a cosine window, and the motion correlation filter response map CRM_t is obtained (formula image not reproduced). μ and σ are the mean and standard deviation of the sidelobe region, the sidelobe region being the 11×11 region centered on the maximum peak response point; z is an offset coefficient, set to 0.1. The quality value q_t and the occlusion decision function value s_t are then calculated (formula images not reproduced).
If s_t is less than the threshold δ (set to 1.55), no occlusion has occurred, and normal tracking and filter updating are performed with the STRCF tracker.
Further, the occlusion determination includes determining the degree of occlusion of the target using block matching: if s_t is greater than the threshold δ (set to 1.55), the target is occluded, and the occlusion state of the object is determined using block matching.
Further, in step 4), the block matching algorithm determines the degree of occlusion as follows:
In the current frame t, the target is occluded, and the target information and filter template of frame t-1 are saved. The target region of frame t-1 and the candidate region of the current frame are each uniformly divided into four blocks, which are then compared using color histogram information (formula image not reproduced).
The matching degree of each block with the corresponding target block of the previous frame is calculated, where the hist function compares image similarity using color histogram information; B_t is the set of four candidate blocks in the current frame, and B_{t-1} is the set of four target blocks in the previous frame. The number of matched blocks Num is then counted.
Further, in step 5), the threshold ξ is set to 3; when Num is greater than 3, the target is partially occluded; otherwise, complete occlusion has occurred.
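The four-block histogram comparison can be sketched as follows (a minimal sketch; the cosine-similarity threshold `tau` is an assumption, since the exact counting formula is image-rendered in the original):

```python
import numpy as np

def color_hist(block, bins=16):
    """Per-channel color histogram, flattened into one feature vector."""
    return np.concatenate(
        [np.histogram(block[..., c], bins=bins, range=(0, 256))[0]
         for c in range(block.shape[-1])]).astype(float)

def quarter(img):
    """Uniform four-partition of an image block (2 x 2 grid)."""
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return [img[:h, :w], img[:h, w:], img[h:, :w], img[h:, w:]]

def block_match_num(prev_target, candidate, tau=0.8):
    """Count sub-blocks whose color histograms have cosine similarity > tau."""
    num = 0
    for b_prev, b_cur in zip(quarter(prev_target), quarter(candidate)):
        h1, h2 = color_hist(b_prev), color_hist(b_cur)
        cos = h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2) + 1e-12)
        num += int(cos > tau)
    return num
```

Num > 3 (all or nearly all blocks still matching) indicates partial occlusion at most; a low Num indicates the target appearance is largely gone, i.e. complete occlusion.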
Further, in step 6), for partial occlusion handling, target tracking is performed with the STRCF tracker, but the template update and the scale update of the STRCF tracker are not performed, preventing the template from being polluted by background information.
Further, in step 7), the complete occlusion algorithm includes the following details:
The historical position information is combined with a Kalman filter for position estimation. The prediction and correction formulas are:
x_k^- = A x_{k-1} + B u_{k-1}
P_k^- = A P_{k-1} A^T + Q
K_k = P_k^- H^T (H P_k^- H^T + R)^{-1}
x_k = x_k^- + K_k (y_k - H x_k^-)
P_k = (I - K_k H) P_k^-
where x_k is the system state at time k, A is the state transition matrix, B is the state control matrix, u_{k-1} is the state control vector, P_k is the error covariance matrix, Q is the system state error, R is the measurement noise covariance, y_k is the observation vector, and H is the parameter matrix of the measurement system. The superscript "-" indicates a state inferred from the previous state, and K_k is the Kalman gain. The specific parameters are A and H (matrix images not reproduced), B = [0.125 0.125 0.5 0.5]^T, u_{k-1} = 0.004, and Q (matrix image not reproduced).
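A minimal sketch of the prediction and correction steps follows. The constant-velocity transition matrix A (state [x, y, vx, vy], Δt = 0.5, which is consistent with B = [0.125, 0.125, 0.5, 0.5]^T), as well as H, Q and the measurement noise R, are assumptions, since the original renders A, H and Q as images:

```python
import numpy as np

dt = 0.5
A = np.array([[1, 0, dt, 0],     # assumed constant-velocity transition
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], float)
B = np.array([[0.125], [0.125], [0.5], [0.5]])  # from the disclosure
H = np.array([[1, 0, 0, 0],      # assumed: observe position only
              [0, 1, 0, 0]], float)
Q = np.eye(4) * 1e-2             # assumed system state error
R = np.eye(2) * 1e-1             # assumed measurement noise

def kf_predict(x, P, u=0.004):
    """Prediction step: x_k^- = A x_{k-1} + B u_{k-1};  P_k^- = A P A^T + Q."""
    x_pred = A @ x + B * u
    P_pred = A @ P @ A.T + Q
    return x_pred, P_pred

def kf_correct(x_pred, P_pred, y):
    """Correction step with Kalman gain K_k."""
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x = x_pred + K @ (y - H @ x_pred)
    P = (np.eye(4) - K @ H) @ P_pred
    return x, P
```

During complete occlusion only the prediction step is iterated (no reliable measurement y_k exists), which is why the patent follows it with a secondary re-detection stage.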
It is not reliable to use Kalman filtering alone for position estimation, so the present algorithm re-detects near the estimated position. Empirically, during complete occlusion the object tends to change position laterally rather than vertically. Information is therefore collected around the estimated position and matching is performed (formula images not reproduced).
The search area is centered on the position estimated by Kalman filtering, with length equal to n_1+n_2+1 times the height of the target in the previous frame and width equal to m_1+m_2+m_3 times the width of the target in the previous frame. D = [Area_1, Area_2, ..., Area_i]: within the search area, the size of the target in the previous frame is used as a search window, which slides with step size 19 to obtain the Area_i. The calculation formulas for m_1, m_2, n_1, n_2 are rendered as images in the original and are not reproduced; therein, HEIGHT and WIDE are the size of the input image and (t_w, t_h) are the width and height of the target in the previous frame.
Further, in step 8), the target scale is updated. When the target center coordinate has been obtained in a new frame, image blocks are extracted at K scales, with scale factors a^n for n ∈ {-(K-1)/2, ..., (K-1)/2}, K = 33, a = 1.02. Each scale image block has size sM × sN, where M × N is the original target size. A correlation filtering operation is then performed on the image block at each scale, and the scale with the maximum response value is selected as the final target scale.
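The 33-level scale search can be sketched as follows (the scale-factor set follows K = 33 and a = 1.02 from the text; `response_at`, which stands for the maximum correlation response at a given scaled size, is a hypothetical placeholder supplied by the caller):

```python
import numpy as np

def scale_factors(K=33, a=1.02):
    """Scale set {a^n : n = -(K-1)/2, ..., (K-1)/2} used for scale search."""
    n = np.arange(K) - (K - 1) // 2
    return a ** n

def best_scale(base_size, response_at):
    """Evaluate the correlation response at each scaled size (s*M, s*N)
    and return the scale factor with the maximum response."""
    M, N = base_size
    scales = scale_factors()
    responses = [response_at((s * M, s * N)) for s in scales]
    return scales[int(np.argmax(responses))]
```

This is the usual DSST-style geometric scale pyramid: 33 candidate sizes spaced by a factor of 1.02 around the current target size.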
Further, in step 9), the correlation filter update includes extracting features from the target frame and obtaining the latest correlation filter by minimizing a loss function (formula image not reproduced), where y is the ground-truth response.
In summary, the working principle of the invention is as follows:
When the object is not occluded, the STRCF is used to locate the target and perform normal template updating and scale updating. When occlusion occurs, block matching is used to judge whether the object is partially or completely occluded. The STRCF achieves good localization under slight partial occlusion, so in that case the STRCF is used as the base filter for target localization, but the filter is not updated, avoiding template pollution. When the object is completely occluded, position estimation is performed with a Kalman filter based on historical position information, a secondary detection area is delimited, and accurate localization is performed in the secondary detection area through feature matching.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. An anti-occlusion target tracking method based on the STRCF is characterized by comprising the following steps:
1) reading an image sequence, framing a target to be tracked in an initial frame image, determining the central coordinate and the size of the target, initializing a tracker, forming an initial target template according to a target area, and initializing a Kalman filter;
2) calculating an occlusion decision function value s_t;
3) comparing the occlusion decision function value s_t obtained in step 2 with a preset threshold δ; if it is greater than δ, judging the degree of occlusion; otherwise, acquiring the maximum correlation response value with the STRCF tracker, outputting the target center coordinate corresponding to the maximum correlation response value, and performing steps 8 and 9;
4) if the occlusion decision function value obtained in step 3 is greater than the threshold δ, determining the degree of occlusion with a block matching algorithm and calculating the block matching number Num;
5) if the block matching number Num calculated in step 4 is greater than the threshold ξ, judging that the target to be tracked is in a partial occlusion state; otherwise, it is in a complete occlusion state;
6) if step 5 judges the target to be tracked to be partially occluded, acquiring the maximum correlation response value with the STRCF tracker, outputting the target center coordinate corresponding to the maximum correlation response value, and performing steps 8 and 9; otherwise performing step 7;
7) if step 5 judges the target to be tracked to be completely occluded, estimating the position of the target with a Kalman filter, delimiting a secondary re-detection area around the estimated position, and performing accurate localization with the feature information within that area;
8) updating the target scale;
9) performing a correlation filter update.
2. The anti-occlusion target tracking method based on the correlation filtering as claimed in claim 1, characterized in that: in step 1), the framing of the target to be tracked includes image denoising.
3. The anti-occlusion target tracking method based on correlation filtering as claimed in claim 1, characterized in that: in step 2), the occlusion decision function value is calculated as follows (formula images not reproduced):
wherein q_t is a quality evaluation function, CRM is the correlation filter response map (obtained by correlating the correlation filter with the candidate image block), μ and σ are respectively the mean and the standard deviation of the sidelobe region, the sidelobe region being the 11×11 region centered on the maximum peak response point; z is an offset coefficient, set to 0.1.
4. The anti-occlusion target tracking method based on the correlation filtering as claimed in claim 1, characterized in that: in step 3), the threshold δ is 1.55.
5. The anti-occlusion target tracking method based on correlation filtering as claimed in claim 1, characterized in that: in step 4), the block matching number Num is calculated as follows (formula image not reproduced), where Num represents the number of candidate target blocks in the current frame with a high matching degree to the previous frame target; hist is a function comparing the cosine similarity of two blocks using color histogram information; the image blocks are uniformly partitioned into four; B_t is the set of four candidate blocks in the current frame, and B_{t-1} is the set of four target blocks in the previous frame.
6. The anti-occlusion target tracking method based on the correlation filtering as claimed in claim 1, characterized in that: in step 5), the threshold ξ is 3.
7. The anti-occlusion target tracking method based on correlation filtering as claimed in claim 1, characterized in that: in step 7), the state transition matrix A is set (matrix image not reproduced), the state control matrix is set to B = [0.125 0.125 0.5 0.5]^T, the state control vector is set to u_{k-1} = 0.004, and the parameter matrix H of the measurement system and the system state error Q are set (matrix images not reproduced).
8. The anti-occlusion target tracking method based on correlation filtering as claimed in claim 1, characterized in that: in step 7), the secondary detection area is a search area centered on the position estimated by Kalman filtering, with length equal to n_1+n_2+1 times the height of the target in the previous frame and width equal to m_1+m_2+m_3 times the width of the target in the previous frame. D = [Area_1, Area_2, ..., Area_i]: within the search area, the size of the target in the previous frame is used as a search window, which slides with step size 19 to obtain the Area_i. The calculation formulas for m_1, m_2, n_1, n_2 are rendered as images in the original and are not reproduced; therein, HEIGHT and WIDE are the size of the input image and (t_w, t_h) are the width and height of the target in the previous frame.
CN201910988383.0A 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF Active CN110717934B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910988383.0A CN110717934B (en) 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910988383.0A CN110717934B (en) 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF

Publications (2)

Publication Number Publication Date
CN110717934A true CN110717934A (en) 2020-01-21
CN110717934B CN110717934B (en) 2023-04-28

Family

ID=69211806

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910988383.0A Active CN110717934B (en) 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF

Country Status (1)

Country Link
CN (1) CN110717934B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015163830A1 (en) * 2014-04-22 2015-10-29 Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi Target localization and size estimation via multiple model learning in visual tracking
CN105405151A (en) * 2015-10-26 2016-03-16 西安电子科技大学 Anti-occlusion target tracking method based on particle filtering and weighted Surf
CN107657630A (en) * 2017-07-21 2018-02-02 南京邮电大学 Improved anti-occlusion target tracking method based on KCF
CN108198209A (en) * 2017-12-22 2018-06-22 天津理工大学 Pedestrian tracking algorithm for occlusion and scale variation


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI F, ET AL.: "Learning spatial-temporal regularized correlation filters for visual tracking" *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111508002A (en) * 2020-04-20 2020-08-07 北京理工大学 Small-sized low-flying target visual detection tracking system and method thereof
CN111860189A (en) * 2020-06-24 2020-10-30 北京环境特性研究所 Target tracking method and device
CN111860189B (en) * 2020-06-24 2024-01-19 北京环境特性研究所 Target tracking method and device
CN113470074A (en) * 2021-07-09 2021-10-01 天津理工大学 Self-adaptive space-time regularization target tracking algorithm based on block discrimination
CN116228817A (en) * 2023-03-10 2023-06-06 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering
CN116228817B (en) * 2023-03-10 2023-10-03 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering


Similar Documents

Publication Publication Date Title
CN107292911B (en) Multi-target tracking method based on multi-model fusion and data association
CN110717934B (en) Anti-occlusion target tracking method based on STRCF
CN103325112B (en) Fast moving-target method for dynamic scenes
CN107516321B (en) Video multi-target tracking method and device
CN105335986B (en) Method for tracking target based on characteristic matching and MeanShift algorithm
Fu et al. Centroid weighted Kalman filter for visual object tracking
CN107194408B (en) Target tracking method of mixed block sparse cooperation model
CN108876820B (en) Moving target tracking method under shielding condition based on mean shift
CN109961506A (en) Local scene three-dimensional reconstruction method fusing improved Census transform
CN110610150B (en) Tracking method, device, computing equipment and medium of target moving object
CN105279769B (en) Hierarchical particle filter tracking method combining multiple features
CN106780567B (en) Immune particle filter extension target tracking method fusing color histogram and gradient histogram
CN110827262B (en) Weak and small target detection method based on continuous limited frame infrared image
CN115049954A (en) Target identification method, device, electronic equipment and medium
CN112164093A (en) Automatic person tracking method based on edge features and related filtering
CN113379789B (en) Moving target tracking method in complex environment
CN108647605B (en) Human eye gaze point extraction method combining global color and local structural features
Najafzadeh et al. Object tracking using Kalman filter with adaptive sampled histogram
CN110751671B (en) Target tracking method based on kernel correlation filtering and motion estimation
CN117252908A (en) Anti-occlusion multi-target tracking method based on attention
CN107392936B (en) Target tracking method based on meanshift
CN107067411B (en) Mean-shift tracking method combined with dense features
CN116193103A (en) Video picture jitter level assessment method
Liu et al. [Retracted] Mean Shift Fusion Color Histogram Algorithm for Nonrigid Complex Target Tracking in Sports Video
CN110322474B (en) Image moving target real-time detection method based on unmanned aerial vehicle platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant