CN110717934B - Anti-occlusion target tracking method based on STRCF - Google Patents


Info

Publication number
CN110717934B
Authority
CN
China
Prior art keywords
target, strcf, shielding, occlusion, area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910988383.0A
Other languages
Chinese (zh)
Other versions
CN110717934A (en)
Inventor
张汗灵
谢悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University filed Critical Hunan University
Priority to CN201910988383.0A priority Critical patent/CN110717934B/en
Publication of CN110717934A publication Critical patent/CN110717934A/en
Application granted granted Critical
Publication of CN110717934B publication Critical patent/CN110717934B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/223 Analysis of motion using block-matching
    • G06T 7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T 5/70
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details
    • G06T 2207/20032 Median filtering

Abstract

The invention provides an anti-occlusion target tracking method based on STRCF. A constructed quality evaluation function and occlusion judgment function accurately determine whether the object is occluded. When the object is not occluded, the STRCF is used for target positioning and normal updating. When occlusion occurs, block matching is used to judge whether the object is partially or completely occluded. The STRCF can still locate the target well under slight partial occlusion, so in that case the STRCF is used as the base filter for target positioning, but the filter and scale are not updated, which avoids polluting the template. When the object is completely occluded, the position is estimated from the historical position information with a Kalman filter, a secondary detection area is defined, and accurate positioning is performed within that area by feature matching.

Description

Anti-occlusion target tracking method based on STRCF
Technical Field
The invention relates to the technical field of target tracking of computer vision, in particular to an anti-occlusion target tracking method based on STRCF.
Background
Visual target tracking is an important research branch in the field of computer vision. Object tracking has been studied extensively in recent years, but because the problem is difficult and high-quality data are scarce, it still receives somewhat less attention than basic visual tasks such as object detection and semantic segmentation. Nevertheless, with the development of deep learning and correlation filtering and the growth of computing power, the performance of visual tracking algorithms has advanced rapidly, so that target tracking now has wide application in video surveillance, virtual reality, human-computer interaction, medical diagnosis, autonomous driving and other fields. In general, object tracking faces several difficulties: object deformation, illumination variation, fast motion, background interference and occlusion. These are often irregular variations that make the target hard to track, and they must be addressed before the methods can be applied in industry.
Occlusion is a typical problem in object tracking: a visual target may be occluded by a stationary object or by multiple moving objects, and it may also occlude itself through deformation. According to the extent of occlusion, it can be divided into partial occlusion and complete occlusion. The occlusion process mainly consists of three stages: first, the visual target enters the occluded state, partial occlusion occurs, and key information is gradually lost; second, the target is covered by the background or an occluder, complete occlusion occurs, and the target information is completely lost; third, the target gradually leaves the occlusion, and the target information is gradually recovered.
To address tracking failure caused by occlusion, the literature (Visual object tracking using adaptive correlation filters [C], CVPR 2010, San Francisco, CA, USA, 13-18 June 2010, IEEE, 2010) proposes the tracking-failure detection criterion PSR for judging whether a target is occluded, but its practical effect is poor. The tracking-failure detection criterion proposed in (A Fully-Correlational Long-Term Tracker [J], 2017) is the PSR multiplied by the maximum response value. The literature (Switching Kalman Filter-Based Approach for Tracking and Event Detection at Traffic Intersections, IEEE, 2005) uses a Kalman filter to predict the position of an occluded object, but this approach works poorly when the object moves irregularly or the occlusion lasts too long. The literature (A Correlation-Based Layered Matching and Target Tracking Method, 2006, 27(4): 670-675) proposes a hierarchically weighted matching algorithm that divides the target region into several rectangular bands centered on the target center point, with a large weight for the central region and small weights for the peripheral regions; it can track partially occluded targets robustly, but cannot handle complete occlusion. The literature (Real-time part-based visual tracking via adaptive correlation filters, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015, pp. 4902-4912) divides the target into several blocks and tracks using the blocks that are not occluded; for the case where the target is completely occluded, there is still no fully effective method. The literature (Occlusion-aware fragment-based tracking with spatial-temporal consistency, IEEE Trans. Image Process. 25(8) (2016) 3814-3825) proposes a CSR-DCF based filter that uses a quality assessment function to decide whether the object is occluded and applies different filter update strategies, with an occlusion tracker taking over during occlusion, but the filter still drifts to the background when the object is fully occluded. The literature (Effective Occlusion Handling for Fast Correlation Filter-based Trackers [J], 2018) uses an additional correlation filter to locate occluded objects; this method is suitable for partial occlusion but handles severe and long-term occlusion poorly and introduces additional computational complexity. The correlation-filter-based target tracking proposed in (Learning Spatial-Temporal Regularized Correlation Filters for Visual Tracking [J], 2018) and the tracking algorithm combining CNNs and correlation filters proposed in (Robust visual tracking based on deep convolutional neural networks and kernelized correlation filters [J], Journal of Electronic Imaging, 2018, 27(2): 1) both achieve good performance, but neither solves the occlusion problem well, and they remain limited under scale variation.
Target tracking based on DCF has attracted much attention because of its high computational efficiency and good robustness. The STRCF proposed in (Learning spatial-temporal regularized correlation filters for visual tracking, 2018: 4904-4913) introduces a spatial-temporal regularization term on top of DCF, giving better robustness, higher accuracy and faster speed under large appearance changes; however, tracking with STRCF still drifts to the background under long-term occlusion.
Disclosure of Invention
Aiming at the shortcoming that current correlation filters cannot handle target occlusion well, or are effective only for partial and short-term occlusion, the invention provides an anti-occlusion target tracking method based on STRCF (Spatial-Temporal Regularized Correlation Filters). An occlusion handling strategy is introduced on top of the STRCF tracker: a PSR-based quality evaluation function is used to judge whether the target is occluded; when the target is not occluded, the STRCF is used for target positioning; when occlusion occurs, the block-matching idea is used to judge the occlusion state of the target, and different tracking and updating strategies are adopted for partial occlusion and complete occlusion. For video target tracking with longer occlusion durations, the method achieves higher tracking speed, higher tracking precision and lower computational complexity.
The invention is realized in the following way: an anti-occlusion target tracking method based on STRCF comprises the following steps:
step 1, reading an image sequence, selecting a target to be tracked in an initial frame image, determining the center coordinates and the size of the target, initializing a tracker, forming an initial target template according to a target area, and initializing a Kalman filter;
Step 2, calculating the occlusion judgment function value s_t;
Step 3, comparing the occlusion judgment function value s_t obtained in step 2 with a preset threshold δ; if it is larger than δ, judging the degree of occlusion; otherwise, using the STRCF tracker to obtain the maximum correlation response value, outputting the target center coordinates and scale corresponding to the maximum correlation response value, and performing steps 8 and 9;
Step 4, if the occlusion judgment function value obtained in step 3 is larger than the threshold, determining the degree of occlusion with a block matching algorithm and calculating the block matching number Num;
Step 5, if the block matching number Num calculated in step 4 is larger than a threshold ξ, judging that the target to be tracked is partially occluded; otherwise, it is completely occluded;
Step 6, if step 5 judges that the target to be tracked is partially occluded, using the STRCF tracker to obtain the maximum correlation response value, outputting the target center coordinates and scale corresponding to the maximum correlation response value, and performing steps 8 and 9; otherwise, performing step 7;
Step 7, if step 5 judges that the target to be tracked is completely occluded, estimating the position of the target with a Kalman filter, defining a secondary re-detection area around the estimated position, and performing accurate positioning within that area by feature matching;
Step 8, updating the target scale;
and 9, updating the correlation filter.
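As a non-authoritative illustration of how steps 2 through 9 fit together, the following Python sketch outlines the per-frame decision logic. The helper callables are hypothetical stand-ins for the operations described above (they are not part of the patent), and the thresholds δ = 1.55 and ξ = 3 are the values given below.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

DELTA = 1.55  # occlusion-judgment threshold delta (step 3)
XI = 3        # block-matching threshold xi (step 5)

@dataclass
class TrackState:
    position: Tuple[float, float]
    scale: float

def track_frame(frame,
                occlusion_score: Callable,    # step 2: returns s_t for the current frame
                strcf_locate: Callable,       # STRCF localisation: returns (position, scale)
                block_match_count: Callable,  # step 4: returns Num
                kalman_redetect: Callable,    # step 7: Kalman estimate + secondary re-detection
                update_scale: Callable,       # step 8
                update_filter: Callable,      # step 9
                prev: TrackState) -> TrackState:
    """Schematic per-frame decision logic of the proposed tracker (steps 2-9)."""
    s_t = occlusion_score(frame)
    if s_t <= DELTA:                     # step 3: no occlusion -> locate and update normally
        pos, scale = strcf_locate(frame)
        update_scale(frame, pos)
        update_filter(frame, pos)
    else:
        num = block_match_count(frame, prev)
        if num > XI:                     # steps 5-6: partial occlusion -> locate, do not update
            pos, scale = strcf_locate(frame)
        else:                            # step 7: complete occlusion -> Kalman + re-detection
            pos = kalman_redetect(frame, prev)
            scale = prev.scale
    return TrackState(position=pos, scale=scale)
```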
As a further aspect of the present invention, in step 2, the occlusion judgment function value s_t is calculated as follows:
[Formulas (1)-(3), defining the peak-to-sidelobe ratio PSR_t of the response map, the quality evaluation function q_t (formula (2)), and the occlusion judgment value s_t; the formula images are not reproduced here.]
wherein formula (2) is the quality evaluation function, CRM is the correlation filter response map (obtained by correlating the correlation filter with the features of the candidate image), μ and σ are respectively the mean and standard deviation of the sidelobe region, the sidelobe region being the 11×11 region centered on the maximum peak response point (excluding the peak itself); z is a bias coefficient and is set to 0.1.
As a further aspect of the present invention, in step 3, the threshold δ is set to 1.55.
In step 4, as a further scheme of the present invention, the block matching number Num is calculated as follows:
[Formulas: the block-wise color-histogram matching function and the block matching number Num; formula images not reproduced here.]
wherein Num represents the number of corresponding blocks of the previous-frame target and the current-frame candidate target with a high matching degree, and hist is a function that compares two blocks by cosine similarity of their color histogram information; the image block is uniformly divided into quarters, B_t is the set of four blocks of the candidate image block in the current frame, and B_{t-1} is the set of four blocks of the target in the previous frame.
As a further aspect of the present invention, in step 5, the threshold ζ is set to 3.
In a further aspect of the present invention, in step 7, the secondary re-detection area is centered on the position estimated by Kalman filtering; its length is equal to (n_1 + n_2 + 1) times the height of the object in the previous frame, and its width is equal to (m_1 + m_2 + m_3) times the width of the object in the previous frame. The calculation formulas for m_1, m_2, n_1 and n_2 are as follows:
[Formulas for m_1, m_2, n_1 and n_2: formula images not reproduced here.]
wherein the position used in the formulas is the one estimated by the Kalman filter, HEIGHT and WIDE are the height and width of the input image, and (t_w, t_h) are the width and height of the target in the previous frame.
In summary, compared with the prior art, the invention has the following advantages:
after the invention is evaluated on OTB-2015, VOT-2016 and sample-color-128 data sets, the invention has lower calculation complexity and higher precision aiming at long-time shielding, and compared with an STRCF tracker, the average overlapping precision and the average overlapping expectation are respectively improved by 4 percent and 4.5 percent; in addition, the tracking speed is improved by 8.8%, and the tracking device has good effects on target out-of-bounds, in-plane rotation and out-of-plane rotation.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a response diagram of an object under no occlusion, partial occlusion, and complete occlusion in an embodiment.
FIG. 3 is a diagram of an occlusion decision function in one embodiment.
FIG. 4 is a schematic diagram of secondary detection in an embodiment.
Detailed Description
The technical solutions of the embodiments of the present invention will be clearly and completely described together with the drawings in the embodiments of the present invention. It will be apparent that the embodiments described are only some, but not all, of the embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
As shown in fig. 1, an anti-occlusion target tracking method based on STRCF includes the following steps:
1) Input picture frame: acquire a picture frame from the original video and preprocess it.
2) Select the target to be tracked: select the target to be tracked in the first frame image and obtain its position and scale information.
3) Occlusion judgment: determine whether the target is occluded with the occlusion judgment function.
4) Occlusion degree judgment: determine the occlusion state of the target (partial occlusion or complete occlusion) by block matching.
5) Occlusion handling: for partial occlusion, the STRCF tracker is used, and neither the scale nor the STRCF tracker is updated; for complete occlusion, a Kalman filter is used for position estimation, a secondary re-detection area is defined, and a second, accurate positioning is performed.
Further, in step 1), the preprocessing includes obtaining a picture by extracting a frame from the video, and filtering and denoising the picture.
In the filtering and denoising process, the pixel value of the center point is replaced by the median of its neighborhood, which removes isolated points with pixel values of 0 or 255. The principle is as follows:
F(x,y)=Med(P(x,y))
wherein P(x, y) is the set of pixels in the N×N template neighborhood of the point (x, y), and Med() is the median function: the gray values of all pixels in the neighborhood centered on the pixel (x, y) are sorted, and the gray value at the middle position is output as F(x, y), the new pixel value at (x, y) after median filtering.
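As a minimal sketch of this preprocessing step (assuming an OpenCV/NumPy pipeline, which the text does not prescribe), the median filtering can be written as follows; the 3×3 window is an illustrative choice of N.

```python
import cv2
import numpy as np

def denoise_frame(frame: np.ndarray, ksize: int = 3) -> np.ndarray:
    """Median-filter a frame: each pixel is replaced by the median of its N x N
    neighbourhood, which removes isolated salt-and-pepper pixels (values 0 or 255)."""
    return cv2.medianBlur(frame, ksize)

# Equivalent single-pixel formulation, following F(x, y) = Med(P(x, y)):
def median_at(frame: np.ndarray, x: int, y: int, n: int = 3) -> int:
    half = n // 2
    patch = frame[max(0, y - half):y + half + 1, max(0, x - half):x + half + 1]
    return int(np.median(patch))
```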
Further, in step 2), selecting the target area to be tracked includes defining the target to be tracked and acquiring its coordinate information (X, Y, W, H), where (X, Y) are the center coordinates of the target and (W, H) are its width and height.
Further, in step 3), the occlusion determination includes performing an occlusion determination using the occlusion determination function provided by the present invention.
A search area is established in the current frame at the same position as the target box of the previous frame, with an area 1.8 times that of the target box. PSR_t is calculated using the corresponding formula [formula image not reproduced]. The feature vector of the target is extracted [notation image not reproduced] and weighted by a cosine window, and the motion correlation filter response map CRM_t is obtained [formula image not reproduced], where μ and σ are respectively the mean and standard deviation of the sidelobe region, the sidelobe region being the 11×11 region centered on the maximum peak response point; z is a bias coefficient and is set to 0.1. The quality evaluation value q_t and the occlusion judgment function value s_t are then calculated [formula images not reproduced].
If s_t is smaller than the threshold δ (set to 1.55), no occlusion has occurred, and normal tracking and filter updating are performed using the STRCF tracker.
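The exact formulas for PSR_t, q_t and s_t are not reproduced in this text, so the following sketch only illustrates one plausible PSR-style computation over the 11×11 sidelobe region described above; the way s_t is derived here (ratio of a historical average quality to the current value) is an assumption for illustration, not the patented formula.

```python
import numpy as np

Z = 0.1  # bias coefficient from the description

def psr(crm: np.ndarray, win: int = 11) -> float:
    """Peak-to-sidelobe-style quality of a correlation response map CRM.
    The sidelobe region is the 11 x 11 window centred on the peak, excluding the peak itself."""
    peak_idx = np.unravel_index(np.argmax(crm), crm.shape)
    peak = float(crm[peak_idx])
    half = win // 2
    r0, c0 = max(0, peak_idx[0] - half), max(0, peak_idx[1] - half)
    side = crm[r0:peak_idx[0] + half + 1, c0:peak_idx[1] + half + 1].astype(float)
    side[peak_idx[0] - r0, peak_idx[1] - c0] = np.nan   # exclude the peak itself
    mu, sigma = np.nanmean(side), np.nanstd(side)
    return (peak - mu) / (sigma + Z)                     # Z guards against a zero sidelobe spread

def occlusion_score(psr_t: float, psr_history: list) -> float:
    """ASSUMPTION: s_t modelled as the historical average quality divided by the current
    quality, so a sudden PSR drop pushes s_t above the threshold delta."""
    if not psr_history:
        return 0.0
    return float(np.mean(psr_history) / max(psr_t, 1e-6))
```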
Further, the occlusion judgment in step 3) also includes determining the degree to which the object is occluded by block matching: when s_t is greater than the threshold δ (set to 1.55), the target is occluded, and block matching is used to determine the occlusion state of the object.
Further, in step 4), determining the occlusion degree with the block matching algorithm includes the following details:
In the current frame t the target is occluded, so the target information of frame t-1 and the filter template are saved. The target areas of frame t-1 and the current frame are uniformly divided into quarters, and the corresponding blocks are then compared using color histogram information.
[Formula: block-wise color-histogram matching; formula image not reproduced.]
The matching degree between each block and the corresponding target block of the previous frame is calculated with the above formula, where hist is a function that compares image similarity using color histogram information; the image block is uniformly divided into quarters, B_t is the set of four candidate blocks of the current frame, and B_{t-1} is the set of four target blocks of the previous frame. The block matching number Num is then calculated as
[Formula for the block matching number Num: formula image not reproduced.]
Further, in step 5), the threshold ξ is set to 3; when Num is greater than 3, the target is partially occluded; otherwise, it is completely occluded.
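A minimal sketch of the block-matching test follows, assuming hue histograms in HSV compared by cosine similarity (the colour space and the per-block similarity threshold of 0.8 are assumed choices, since the exact formula for Num is not reproduced here).

```python
import cv2
import numpy as np

def color_hist(block: np.ndarray, bins: int = 16) -> np.ndarray:
    """Colour histogram of an image block (hue channel of HSV as an assumed choice)."""
    hsv = cv2.cvtColor(block, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [bins], [0, 180]).flatten()
    return hist / (np.linalg.norm(hist) + 1e-12)

def quarter(patch: np.ndarray):
    """Split a target patch uniformly into four blocks (2 x 2 grid)."""
    h, w = patch.shape[:2]
    return [patch[:h // 2, :w // 2], patch[:h // 2, w // 2:],
            patch[h // 2:, :w // 2], patch[h // 2:, w // 2:]]

def block_match_count(prev_patch: np.ndarray, cand_patch: np.ndarray,
                      sim_thresh: float = 0.8) -> int:
    """Num: how many of the four corresponding blocks of the previous-frame target and the
    current candidate have a high cosine similarity between their colour histograms."""
    num = 0
    for b_prev, b_cand in zip(quarter(prev_patch), quarter(cand_patch)):
        sim = float(np.dot(color_hist(b_prev), color_hist(b_cand)))  # cosine similarity
        if sim > sim_thresh:
            num += 1
    return num
```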
Further, in step 6), the partial occlusion handling uses the STRCF tracker to track the target but does not update the template or scale of the STRCF tracker, so as to avoid the template being polluted by background information.
Further, in step 7), the complete occlusion algorithm includes the following details:
the position estimation is carried out by combining the historical position information with a Kalman filter, and a specific formula position estimation prediction formula and a correction formula are shown as follows:
Figure BDA0002237441040000103
Figure BDA0002237441040000104
Figure BDA0002237441040000105
Figure BDA0002237441040000106
P k =(I-K k H)P k
wherein ,xk Is the system state at time k. A is a state transition matrix, B is a state control matrix, u k-1 For state control vector, P k And Q is a system state error and is an error covariance matrix. y is k Is an observation vector. H is a parameter matrix of the measurement system. The upper right sign "-" indicates that the state was inferred from the previous state k Is the kalman gain. The specific parameters are
Figure BDA0002237441040000111
B=[0.125 0.125 0.5 0.5] T ,u k-1 =0.004,
Figure BDA0002237441040000112
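The matrices A and H are given only as images and are not reproduced here, so the sketch below assumes a standard constant-velocity model for the state [x, y, vx, vy] with a position-only measurement (a common choice, not necessarily the patented matrices), while B = [0.125 0.125 0.5 0.5]^T and u = 0.004 are taken from the description; the noise covariances Q and R are assumed values.

```python
import numpy as np

class SimpleKalman:
    """Standard Kalman predict/correct steps for target position estimation.
    State: [x, y, vx, vy]. A and H are ASSUMED (constant-velocity model with dt = 0.5,
    position-only measurement); B and u come from the description."""

    def __init__(self):
        dt = 0.5
        self.A = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.B = np.array([[0.125], [0.125], [0.5], [0.5]])
        self.u = 0.004
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 1e-3      # assumed process noise covariance
        self.R = np.eye(2) * 1e-1      # assumed measurement noise covariance
        self.P = np.eye(4)
        self.x = np.zeros((4, 1))

    def predict(self) -> np.ndarray:
        self.x = self.A @ self.x + self.B * self.u      # x_k^- = A x_{k-1} + B u_{k-1}
        self.P = self.A @ self.P @ self.A.T + self.Q    # P_k^- = A P_{k-1} A^T + Q
        return self.x[:2].ravel()                        # predicted (x, y)

    def correct(self, measured_xy) -> None:
        y = np.asarray(measured_xy, dtype=float).reshape(2, 1)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ (y - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
```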
Relying on Kalman filtering alone for position estimation is not reliable, so the algorithm re-detects in the vicinity of the estimated position. According to our observations, during the period in which the object is completely occluded its position changes mainly in the horizontal direction, with little vertical change, so information is collected around the estimated location for matching.
[Formula: definition of the secondary re-detection (search) area; formula image not reproduced.]
The search area is centered on the position estimated by Kalman filtering; its length is equal to (n_1 + n_2 + 1) times the height of the object in the previous frame and its width is equal to (m_1 + m_2 + m_3) times the width of the object in the previous frame. D = [Area_1, Area_2, ..., Area_i]: within the search area, a window of the same size as the previous-frame target is slid with a step length of 19 to obtain the candidate regions Area_i. The calculation formulas for m_1, m_2, n_1 and n_2 are as follows:
[Formulas for m_1, m_2, n_1 and n_2: formula images not reproduced here.]
wherein the position used in the formulas is the one estimated by the Kalman filter, HEIGHT and WIDE are the height and width of the input image, and (t_w, t_h) are the width and height of the target in the previous frame.
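A minimal sketch of the secondary re-detection: windows of the previous target size are slid over the re-detection area with a step of 19 pixels and scored against the saved target template. The extents m_1, m_2, m_3, n_1, n_2 are treated as given inputs because their formulas are not reproduced here, the area is assumed to be centered on the Kalman estimate and clipped to the image, and score_fn is a hypothetical matching callback (for example the histogram similarity used for block matching).

```python
import numpy as np

def redetect(frame, est_xy, prev_size, extents, score_fn, step: int = 19):
    """Slide a window of the previous target size over the re-detection area around the
    Kalman-estimated position and return the centre of the best-scoring candidate.

    extents = (m1, m2, m3, n1, n2): multipliers defining the area size
    (width = (m1 + m2 + m3) * t_w, height = (n1 + n2 + 1) * t_h).
    score_fn(patch) -> float scores a candidate against the saved target template."""
    H, W = frame.shape[:2]
    t_w, t_h = int(prev_size[0]), int(prev_size[1])
    m1, m2, m3, n1, n2 = extents
    area_w, area_h = (m1 + m2 + m3) * t_w, (n1 + n2 + 1) * t_h
    cx, cy = est_xy
    x0 = int(np.clip(cx - area_w / 2, 0, max(0, W - t_w)))
    y0 = int(np.clip(cy - area_h / 2, 0, max(0, H - t_h)))
    x1 = int(np.clip(cx + area_w / 2 - t_w, x0, max(x0, W - t_w)))
    y1 = int(np.clip(cy + area_h / 2 - t_h, y0, max(y0, H - t_h)))
    best_score, best_xy = -np.inf, (cx, cy)
    for y in range(y0, y1 + 1, step):                 # step length 19 from the description
        for x in range(x0, x1 + 1, step):
            score = score_fn(frame[y:y + t_h, x:x + t_w])
            if score > best_score:
                best_score, best_xy = score, (x + t_w / 2.0, y + t_h / 2.0)
    return best_xy
```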
Further, in step 8), the target scale is updated. Once the target center coordinates are obtained in a new frame, image blocks are extracted at K scales,
s ∈ { a^n | n = -(K-1)/2, ..., (K-1)/2 }
with K = 33 and a = 1.02. Each scale image block has size sM × sN, where M × N is the size of the original target. A correlation filtering operation is then performed on the image block at each scale, and the scale with the maximum response is selected as the final target scale.
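A minimal sketch of the scale search, assuming the DSST-style scale set a^n with K = 33 and a = 1.02 as given above; response_at_scale is a hypothetical callback that crops an sM × sN patch, resizes it, and evaluates the correlation filter response.

```python
import numpy as np

K = 33      # number of scales
A = 1.02    # scale step

def scale_factors(k: int = K, a: float = A) -> np.ndarray:
    """Scale set {a**n : n = -(K-1)/2, ..., (K-1)/2} around the current target size."""
    n = np.arange(k) - (k - 1) / 2.0
    return a ** n

def best_scale(response_at_scale, base_size) -> float:
    """Evaluate the filter response on an s*M x s*N patch for each scale factor s and
    return the factor with the maximum response."""
    M, N = base_size
    factors = scale_factors()
    scores = [response_at_scale(int(round(s * M)), int(round(s * N))) for s in factors]
    return float(factors[int(np.argmax(scores))])
```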
Further, in step 9), updating the correlation filter includes extracting features on the target frame and obtaining the latest correlation filter by minimizing the loss function:
[Loss function: the STRCF spatial-temporal regularized least-squares objective; formula image not reproduced.]
where y is the true (desired) response.
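The full STRCF update minimizes a spatial-temporal regularized objective and is typically solved with ADMM. The sketch below is only a simplified single-channel, ridge-regularized closed-form update in the Fourier domain (a MOSSE/DCF-style illustration under assumed parameters, not the patented STRCF solver), showing how a filter can be refit to new features against a desired Gaussian response y.

```python
import numpy as np

def gaussian_label(h: int, w: int, sigma: float = 2.0) -> np.ndarray:
    """Desired response y: a 2-D Gaussian peaked at the patch centre."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

def update_filter_simple(feature: np.ndarray, y: np.ndarray, lam: float = 1e-2) -> np.ndarray:
    """Closed-form single-channel correlation-filter update in the Fourier domain:
    H = (Y . conj(X)) / (X . conj(X) + lambda)."""
    X = np.fft.fft2(feature)
    Y = np.fft.fft2(y)
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

def respond(feature: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Correlation response map for a new feature patch given the filter H."""
    return np.real(np.fft.ifft2(H * np.fft.fft2(feature)))
```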
In summary, the working principle of the invention is as follows:
through the constructed quality evaluation function, whether the object is blocked or not can be accurately judged, and when the object is not blocked, the STRCF is utilized to perform target positioning and normal template updating and scale updating. When the shielding occurs, the blocking matching is utilized to judge whether the object is partially shielding or completely shielding. The STRCF can realize good positioning for slight partial shielding, and on the basis, the STRCF is used as a basic filter for target positioning, but the filter updating is not performed, so that the template pollution is avoided. When the object is completely shielded, according to the historical position information, combining a Kalman filter to perform position estimation and define a secondary detection area, and performing accurate positioning in the secondary detection area through feature matching.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (2)

1. An anti-occlusion target tracking method based on STRCF is characterized by comprising the following steps:
1) Reading an image sequence, selecting a target to be tracked in an initial frame image, determining the center coordinates and the size of the target, initializing a tracker, forming an initial target template according to a target area, and initializing a Kalman filter;
2) Calculating the occlusion judgment function value s_t;
3) Comparing the occlusion judgment function value s_t obtained in step 2 with a preset threshold δ; if it is larger than δ, judging the degree of occlusion; otherwise, using the STRCF tracker to obtain the maximum correlation response value, outputting the target center coordinates corresponding to the maximum correlation response value, and performing steps 8 and 9;
4) If the occlusion judgment function value obtained in step 3 is larger than the threshold δ, determining the degree of occlusion with a block matching algorithm and calculating the block matching number Num;
5) If the block matching number Num calculated in step 4 is larger than the threshold ξ, judging that the target to be tracked is partially occluded; otherwise, it is completely occluded;
6) If step 5 judges that the target to be tracked is partially occluded, using the STRCF tracker to obtain the maximum correlation response value, outputting the target center coordinates corresponding to the maximum correlation response value, and performing steps 8 and 9; otherwise, performing step 7;
7) If step 5 judges that the target to be tracked is completely occluded, estimating the position of the target with the Kalman filter, defining a secondary re-detection area around the estimated position, and performing accurate positioning within that area using the feature information;
8) Updating the target scale;
9) Updating a correlation filter;
2. the anti-occlusion target tracking method based on STRCF of claim 1, wherein: in the step 1), the frame selection of the target to be tracked comprises image denoising;
in step 2), the occlusion decision function value calculation formula is as follows:
[Formulas: the quality evaluation function q_t and the occlusion judgment function value; formula images not reproduced.]
wherein q_t is the quality evaluation function, CRM is the correlation filter response map (obtained by correlating the correlation filter with the candidate image block), μ and σ are respectively the mean and standard deviation of the sidelobe region, the sidelobe region being the 11×11 region centered on the maximum peak response point (excluding the peak itself); z is a bias coefficient and is set to 0.1;
in step 3), the threshold δ is 1.55;
in step 4), the calculation formula of the block matching number Num is as follows:
[Formula for the block matching number Num: formula image not reproduced.]
wherein Num represents the number of corresponding blocks of the previous-frame target and the current-frame candidate target with a high matching degree, and hist is a function that compares two blocks by cosine similarity of their color histogram information; the image block is uniformly divided into quarters, B_t is the set of four blocks of the current-frame candidate, and B_{t-1} is the set of four blocks of the previous-frame target;
in step 5), the threshold ξ is 3;
in step 7), the state transition matrix A and the parameter matrix H of the measurement system are given as matrices whose images are not reproduced here; the state control matrix is set to B = [0.125 0.125 0.5 0.5]^T, the state control vector is set to u_{k-1} = 0.004, and the system state error covariance is likewise given as a matrix image that is not reproduced here;
the secondary detection area is defined as follows [formula image not reproduced]: it is a search area centered on the position estimated by Kalman filtering, with a length equal to (n_1 + n_2 + 1) times the height of the previous-frame target and a width equal to (m_1 + m_2 + m_3) times the width of the previous-frame target; D = [Area_1, Area_2, ..., Area_i]: within the search area, a window of the same size as the previous-frame target is slid with a step length of 19 to obtain Area_i; the calculation formulas for m_1, m_2, n_1 and n_2 are as follows [formula images not reproduced]:
wherein the position used in the formulas is the one estimated by the Kalman filter, HEIGHT and WIDE are the height and width of the input image, and (t_w, t_h) are the width and height of the target in the previous frame.
CN201910988383.0A 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF Active CN110717934B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910988383.0A CN110717934B (en) 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910988383.0A CN110717934B (en) 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF

Publications (2)

Publication Number Publication Date
CN110717934A (en) 2020-01-21
CN110717934B (en) 2023-04-28

Family

ID=69211806

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910988383.0A Active CN110717934B (en) 2019-10-17 2019-10-17 Anti-occlusion target tracking method based on STRCF

Country Status (1)

Country Link
CN (1) CN110717934B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111508002B (en) * 2020-04-20 2020-12-25 北京理工大学 Small-sized low-flying target visual detection tracking system and method thereof
CN111860189B (en) * 2020-06-24 2024-01-19 北京环境特性研究所 Target tracking method and device
CN113470074B (en) * 2021-07-09 2022-07-29 天津理工大学 Self-adaptive space-time regularization target tracking method based on block discrimination
CN116228817B (en) * 2023-03-10 2023-10-03 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015163830A1 (en) * 2014-04-22 2015-10-29 Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi Target localization and size estimation via multiple model learning in visual tracking
CN105405151A (en) * 2015-10-26 2016-03-16 西安电子科技大学 Anti-occlusion target tracking method based on particle filtering and weighting Surf
CN107657630A (en) * 2017-07-21 2018-02-02 南京邮电大学 A kind of modified anti-shelter target tracking based on KCF
CN108198209A (en) * 2017-12-22 2018-06-22 天津理工大学 It is blocking and dimensional variation pedestrian tracking algorithm

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015163830A1 (en) * 2014-04-22 2015-10-29 Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi Target localization and size estimation via multiple model learning in visual tracking
CN105405151A (en) * 2015-10-26 2016-03-16 西安电子科技大学 Anti-occlusion target tracking method based on particle filtering and weighting Surf
CN107657630A (en) * 2017-07-21 2018-02-02 南京邮电大学 A kind of modified anti-shelter target tracking based on KCF
CN108198209A (en) * 2017-12-22 2018-06-22 天津理工大学 It is blocking and dimensional variation pedestrian tracking algorithm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li F, et al. Learning spatial-temporal regularized correlation filters for visual tracking. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018 (full text). *

Also Published As

Publication number Publication date
CN110717934A (en) 2020-01-21

Similar Documents

Publication Publication Date Title
CN110717934B (en) Anti-occlusion target tracking method based on STRCF
CN108876820B (en) Moving target tracking method under shielding condition based on mean shift
CN110728697A (en) Infrared dim target detection tracking method based on convolutional neural network
CN109434251B (en) Welding seam image tracking method based on particle filtering
CN109785366B (en) Related filtering target tracking method for shielding
CN110610150B (en) Tracking method, device, computing equipment and medium of target moving object
CN110555870B (en) DCF tracking confidence evaluation and classifier updating method based on neural network
CN110175649B (en) Rapid multi-scale estimation target tracking method for re-detection
CN103106667A (en) Motion target tracing method towards shielding and scene change
CN111612817A (en) Target tracking method based on depth feature adaptive fusion and context information
CN111192296A (en) Pedestrian multi-target detection and tracking method based on video monitoring
CN111652790B (en) Sub-pixel image registration method
CN106780567B (en) Immune particle filter extension target tracking method fusing color histogram and gradient histogram
CN109255799B (en) Target tracking method and system based on spatial adaptive correlation filter
CN112164093A (en) Automatic person tracking method based on edge features and related filtering
CN112580476A (en) Sperm identification and multi-target track tracking method
CN111369570A (en) Multi-target detection tracking method for video image
CN115049954A (en) Target identification method, device, electronic equipment and medium
KR101690050B1 (en) Intelligent video security system
CN111161308A (en) Dual-band fusion target extraction method based on key point matching
CN111091583B (en) Long-term target tracking method
Najafzadeh et al. Object tracking using Kalman filter with adaptive sampled histogram
CN113092807A (en) Urban elevated road vehicle speed measuring method based on multi-target tracking algorithm
CN110751671B (en) Target tracking method based on kernel correlation filtering and motion estimation
CN112200831B (en) Dynamic template-based dense connection twin neural network target tracking method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant