CN110717934B - Anti-occlusion target tracking method based on STRCF - Google Patents
Anti-occlusion target tracking method based on STRCF Download PDFInfo
- Publication number
- CN110717934B (Application CN201910988383.0A)
- Authority
- CN
- China
- Prior art keywords
- target
- strcf
- shielding
- occlusion
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T7/277 — Analysis of motion involving stochastic approaches, e.g. using Kalman filters
- G06T5/70
- G06T7/223 — Analysis of motion using block-matching
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/10016 — Video; image sequence
- G06T2207/20032 — Median filtering
Abstract
The invention provides an anti-occlusion target tracking method based on STRCF. A constructed quality evaluation function and occlusion decision function accurately judge whether the target is occluded; when it is not occluded, STRCF performs target positioning and normal updating. When occlusion occurs, block matching determines whether the target is partially or completely occluded. STRCF localizes well under slight partial occlusion, so it is retained as the base filter for target positioning, but filter updating and scale updating are suspended to avoid template contamination. When the target is completely occluded, a Kalman filter combined with historical position information estimates the position and demarcates a secondary detection area, within which precise positioning is performed by feature matching.
Description
Technical Field
The invention relates to the technical field of target tracking in computer vision, in particular to an anti-occlusion target tracking method based on STRCF.
Background
Visual target tracking is an important research branch in the field of computer vision. Target tracking has been studied extensively in recent years, but owing to the difficulty of the problem and the scarcity of high-quality data, research interest remains slightly lower than for basic visual tasks such as object detection and semantic segmentation. Nevertheless, with the development of deep learning and correlation filtering and the growth of computing power, visual algorithms have advanced rapidly, and target tracking has become practical in fields such as video surveillance, virtual reality, human-computer interaction, medical diagnosis, and autonomous driving. In general, target tracking faces several difficulties: target deformation, brightness variation, rapid motion, background interference and coverage, and so on. These are often irregular variations that make targets hard to track and must be addressed before application in industry.
Occlusion is a typical problem in target tracking: a visual target may be occluded by a stationary object or by multiple moving objects, and may also occlude itself through deformation. By extent, occlusion can be divided into partial occlusion and complete occlusion. The occlusion process has three main phases: first, the target enters the occluded state, partial occlusion occurs, and key information is gradually lost; second, the target is covered by the background or the occluder, complete occlusion appears, and target information is lost entirely; third, the target gradually emerges from occlusion, and its information is gradually recovered.
To address tracking failure caused by occlusion, the literature (Visual object tracking using adaptive correlation filters, CVPR 2010, San Francisco, CA, USA, 13-18 June 2010, IEEE, 2010) proposed PSR, a tracking-failure detection criterion for judging whether a target is occluded, but its practical effect is poor. The tracking-failure criterion proposed in (A Fully-Correlational Long-Term Tracker, 2017) is PSR multiplied by the maximum response value. The literature (Switching Kalman Filter-Based Approach for Tracking and Event Detection at Traffic Intersections, IEEE, 2005) uses Kalman filters to predict the position of an occluded target, but this approach works poorly when the target moves irregularly or the occlusion lasts too long. The literature (A Correlation-Based Layered Matching and Target Tracking Method, 2006, 27(4): 670-675) proposed a hierarchical weighted matching algorithm that divides the target region into rectangular bands centered on the target center point, with large weights for the central region and small weights for the periphery; it achieves robust tracking of partially occluded targets but is helpless against complete occlusion. The literature (Real-time part-based visual tracking via adaptive correlation filters, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015, pp. 4902-4912) divides the target into multiple blocks and tracks effectively with the blocks that are not occluded; for the case where the target is completely occluded, however, no method yet provides a full solution.
The literature (Occlusion-aware fragment-based tracking with spatial-temporal consistency, IEEE Trans. Image Process. 25(8) (2016) 3814-3825) proposed a CSR-DCF based filter that uses a quality assessment function to determine whether the target is occluded and applies different filter-update strategies, with an occlusion tracker taking over when occlusion occurs; the filter still drifts to the background, however, when the target is fully occluded. The literature (Effective Occlusion Handling for Fast Correlation Filter-based Trackers, 2018) uses an additional correlation filter to locate occluded targets; this method suits partial occlusion, has poor applicability to severe and long-term occlusion, and introduces extra computational complexity. The correlation-filter-based target tracking proposed in (Learning Spatial-Temporal Regularized Correlation Filters for Visual Tracking, 2018) and the tracking algorithm combining CNNs and CF proposed in (Robust visual tracking based on deep convolutional neural networks and kernelized correlation filters, Journal of Electronic Imaging, 2018, 27(2): 1) both achieve good performance, but neither solves the occlusion problem well, and scale variation remains limited.
DCF-based target tracking has attracted great attention for its high computational efficiency and good robustness. The STRCF proposed in (Learning spatial-temporal regularized correlation filters for visual tracking, 2018: 4904-4913) introduces a spatial-temporal regularization term on the basis of DCF, giving better robustness, higher accuracy, and faster speed under large appearance changes; nevertheless, STRCF still drifts to the background under long-lasting occlusion.
Disclosure of Invention
Aiming at the defect that current correlation filters cannot handle target occlusion well, or are effective only for partial and short-term occlusion, the invention provides an anti-occlusion target tracking method based on STRCF (spatial-temporal regularized correlation filter). An occlusion-handling strategy is introduced on top of the STRCF tracker: a PSR-based quality evaluation function judges whether the target is occluded; when the target is not occluded, STRCF performs target positioning; when it is, block matching judges the occlusion state of the target, and different tracking and updating strategies are adopted for partial and complete occlusion. For video target tracking with long occlusion, the method achieves higher tracking speed, higher precision, and lower computational complexity.
The invention is realized in the following way: an anti-occlusion target tracking method based on STRCF comprises the following steps:
Step 1, reading the image sequence, selecting the target to be tracked in the initial frame image, determining its center coordinates and size, initializing the tracker, forming the initial target template from the target area, and initializing the Kalman filter;
Step 2, calculating the occlusion decision function value s_t;
Step 3, comparing the occlusion decision function value s_t obtained in step 2 with a preset threshold δ; if it is greater than the preset threshold δ, determining the degree of occlusion; otherwise, using the STRCF tracker to acquire the maximum correlation response value, outputting the target center coordinates and scale corresponding to the maximum correlation response value, and performing steps 8 and 9;
Step 4, if the occlusion decision function value compared in step 3 exceeds the threshold, determining the degree of occlusion with the block matching algorithm and calculating the block matching number Num;
Step 5, if the block matching number Num calculated in step 4 is greater than a threshold ξ, judging that the target to be tracked is partially occluded; otherwise, it is completely occluded;
Step 6, if step 5 judges that the target to be tracked is partially occluded, using the STRCF tracker to acquire the maximum correlation response value, outputting the target center coordinates and scale corresponding to the maximum correlation response value, and performing steps 8 and 9; otherwise performing step 7;
Step 7, if step 5 judges that the target to be tracked is completely occluded, estimating the target position with a Kalman filter, demarcating a secondary re-detection area around the estimated position, and performing precise positioning within that area using feature matching;
Step 8, updating the target scale;
Step 9, updating the correlation filter.
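The branch logic of steps 3 through 7 can be sketched as follows. Only the thresholds δ = 1.55 and ξ = 3 come from the text; the function and mode names are illustrative, not from the patent.

```python
# Hedged sketch of the step 3-7 decision logic. Thresholds delta = 1.55 and
# xi = 3 are given in the text; everything else here is an illustrative name.

def select_tracking_mode(s_t, num_matched, delta=1.55, xi=3):
    """Map the occlusion decision value s_t and the block-match count Num
    onto one of the three tracking branches described in steps 3-7."""
    if s_t <= delta:                    # step 3: no occlusion detected
        return "strcf_track_and_update"         # steps 8-9 run afterwards
    if num_matched > xi:                # step 5: enough blocks still match
        return "strcf_track_no_update"          # partial occlusion (step 6)
    return "kalman_redetect"                    # complete occlusion (step 7)
```

A caller would run this once per frame and dispatch to the STRCF tracker or the Kalman re-detection path accordingly.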
As a further aspect of the present invention, in step 2, the occlusion decision function value s_t is computed from the quality evaluation function (formula (2)), which is based on PSR_t = (max(CRM_t) − μ)/σ, where CRM is the correlation filter response map (obtained by correlating the correlation filter with the features of the candidate image), and μ and σ are respectively the mean and standard deviation of the sidelobe region, the 11×11 region centered on the maximum peak response point (excluding the peak itself); z is a bias coefficient set to 0.1.
As a further aspect of the present invention, in step 3, the threshold δ is set to 1.55.
As a further scheme of the present invention, in step 4, the block matching number Num counts how many blocks of the previous-frame target and of the candidate target in the current frame have a high matching degree. hist is a function that performs cosine similarity comparison on two blocks using color histogram information; the image block is uniformly divided into four equal blocks, B_t is the set of four blocks of the candidate image block in the current frame, and B_{t-1} is the set of four blocks of the target in the previous frame.
As a further aspect of the present invention, in step 5, the threshold ζ is set to 3.
In a further aspect of the present invention, in step 7, the secondary re-detection area is centered on the position estimated by Kalman filtering; its length is equal to (n_1+n_2+1) times the height of the target in the previous frame, and its width is equal to (m_1+m_2+m_3) times the width of the target in the previous frame. m_1, m_2, n_1, n_2 are computed from the position estimated by the Kalman filter, HEIGHT and WIDE, the size of the input image, and (t_w, t_h), the width and height of the previous-frame target.
In summary, compared with the prior art, the invention has the following advantages:
Evaluated on the OTB-2015, VOT-2016 and Temple-Color-128 datasets, the invention shows lower computational complexity and higher precision under long-term occlusion; compared with the STRCF tracker, the average overlap precision and the expected average overlap improve by 4% and 4.5% respectively. In addition, the tracking speed improves by 8.8%, and the method performs well on target out-of-view, in-plane rotation, and out-of-plane rotation.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 shows response maps of a target under no occlusion, partial occlusion, and emergence from occlusion in an embodiment.
FIG. 3 is a plot of the occlusion decision function in an embodiment.
FIG. 4 is a schematic diagram of secondary detection in an embodiment.
Detailed Description
The technical solutions of the embodiments of the present invention will be clearly and completely described together with the drawings in the embodiments of the present invention. It will be apparent that the embodiments described are only some, but not all, of the embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
As shown in fig. 1, an anti-occlusion target tracking method based on STRCF includes the following steps:
1) Input picture frame: acquire a picture frame from the original video and preprocess it.
2) Select the target to be tracked: select the target to be tracked in the first frame image and acquire its position and scale information.
3) Occlusion determination: whether the target is occluded is determined by the occlusion decision function.
4) Occlusion degree determination: the target occlusion state (partial or complete occlusion) is determined by block matching.
5) Occlusion handling: under partial occlusion, the STRCF tracker is used, but neither the scale nor the STRCF tracker is updated; under complete occlusion, a Kalman filter performs position estimation, a secondary re-detection area is demarcated, and secondary precise positioning is carried out.
Further, in step 1), the preprocessing includes obtaining a picture by extracting a frame from the video, and filtering and denoising the picture.
In the filtering and denoising process, the pixel value of the central point is replaced by the median in the neighborhood of the central point, so that the isolated point with the pixel value of 0 or 255 is removed, and the principle is as follows:
F(x,y)=Med(P(x,y))
where P(x, y) is the set of pixel points in the N×N template neighborhood at point (x, y) and Med() denotes the median function: the gray values of all pixels in the neighborhood centered on (x, y) are sorted, and the gray value at the middle position is output as F(x, y), the new pixel value at (x, y) after median filtering.
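A minimal sketch of the median filter F(x, y) = Med(P(x, y)) described above, using a plain nested loop over an N×N neighborhood (border pixels are left unchanged for brevity; a production implementation would use a library routine such as scipy.ndimage.median_filter):

```python
# Median filter over an N x N neighborhood, as in the preprocessing step.
# Border pixels are copied unchanged; interior pixels are replaced by the
# median of their neighborhood, removing isolated 0/255 salt-and-pepper noise.

def median_filter(img, n=3):
    """img: 2-D list of gray values; returns a median-filtered copy."""
    h, w, r = len(img), len(img[0]), n // 2
    out = [row[:] for row in img]
    for y in range(r, h - r):
        for x in range(r, w - r):
            neigh = sorted(img[j][i]
                           for j in range(y - r, y + r + 1)
                           for i in range(x - r, x + r + 1))
            out[y][x] = neigh[len(neigh) // 2]  # Med(): middle of sorted values
    return out
```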
Further, in step 2), selecting the target area to be tracked includes framing the target to be tracked and acquiring its coordinate information (X, Y, W, H), where (X, Y) is the center point coordinate of the target and (W, H) is the width and height of the target.
Further, in step 3), the occlusion determination includes performing an occlusion determination using the occlusion determination function provided by the present invention.
A search area is established in the current frame at the same position as the target frame of the previous frame, with an area 1.8 times that of the target frame. The feature vector of the target is extracted and weighted by a cosine window, and correlating it with the motion correlation filter yields the response map CRM_t. PSR_t = (max(CRM_t) − μ)/σ is then computed, where μ and σ are respectively the mean and standard deviation of the sidelobe region, the 11×11 region centered on the maximum peak response point; z is a bias coefficient set to 0.1. The quality value q_t and then the occlusion decision function value s_t are calculated.
If s_t is smaller than the threshold δ (set to 1.55), no occlusion has occurred, and normal tracking and filter updating are performed using the STRCF tracker.
Further, in step 3), when s_t is greater than the threshold δ (set to 1.55), the target is occluded, and block matching is used to determine the degree of occlusion.
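The quality evaluation above can be sketched as follows. The PSR definition and the 11×11 sidelobe region come from the text; the combination q_t = PSR_t · max(CRM_t) (following the PSR-times-maximum-response strategy cited in the background) and the ratio form of s_t against a history of quality values are assumptions, since the patent's equation images are not reproduced in this text — only the bias z = 0.1 is.

```python
# Sketch of the step-3 quality evaluation. PSR_t = (max(CRM) - mu) / sigma
# over the 11x11 sidelobe region (peak excluded) is from the text; the forms
# of q_t and s_t below are ASSUMED, as noted in the lead-in.

import statistics

def psr(crm, win=11):
    """crm: 2-D response map (list of lists). Returns (PSR, peak value)."""
    h, w = len(crm), len(crm[0])
    peak = max(max(row) for row in crm)
    py, px = next((y, x) for y in range(h) for x in range(w)
                  if crm[y][x] == peak)
    r = win // 2
    side = [crm[y][x]
            for y in range(max(0, py - r), min(h, py + r + 1))
            for x in range(max(0, px - r), min(w, px + r + 1))
            if (y, x) != (py, px)]          # sidelobe: window minus the peak
    mu = statistics.mean(side)
    sigma = statistics.pstdev(side)
    return (peak - mu) / sigma, peak

def occlusion_score(crm, hist_q, z=0.1):
    """s_t compares recent quality history with the current quality q_t."""
    p, peak = psr(crm)
    q_t = p * peak                          # quality evaluation (assumed form)
    s_t = statistics.mean(hist_q) / (q_t + z)
    return s_t, q_t
```

Under this sketch, a sharp response peak gives a low s_t (no occlusion), while a degraded response pushes s_t above the threshold δ = 1.55.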
Further, in step 4), the determining the shielding degree by the block matching algorithm includes the following details:
In the current frame t the target is occluded; the target information of frame t-1 and the filter template are saved. The target areas of frame t-1 and of the current frame are uniformly divided into four equal blocks, which are then compared using color histogram information.
The matching degree of each block against the corresponding block of the previous-frame target is calculated with the hist function, which compares image similarity using color histogram information. B_t is the set of four blocks of the candidate in the current frame, and B_{t-1} is the set of four blocks of the previous-frame target; the block matching number Num is then counted.
Further, in step 5), the threshold ξ is set to 3; when Num is greater than 3, the target is partially occluded; otherwise, complete occlusion has occurred.
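The block-matching test of steps 4 and 5 can be sketched as follows. The quartering, the cosine-similarity comparison of color histograms, and the outer threshold ξ = 3 come from the text; the inner per-block similarity threshold τ is an assumption, since the Num formula image is not reproduced here.

```python
# Sketch of steps 4-5: quarter the target region, compare per-block color
# histograms by cosine similarity, and count well-matching blocks as Num.
# The per-block threshold tau = 0.8 is an ASSUMED placeholder.

import math

def cosine_sim(h1, h2):
    """Cosine similarity of two color histograms (the hist() comparison)."""
    dot = sum(a * b for a, b in zip(h1, h2))
    n1 = math.sqrt(sum(a * a for a in h1))
    n2 = math.sqrt(sum(b * b for b in h2))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def block_match_num(blocks_prev, blocks_cur, tau=0.8):
    """blocks_*: four histograms each, one per quarter of the target region."""
    return sum(cosine_sim(p, c) > tau
               for p, c in zip(blocks_prev, blocks_cur))

def occlusion_state(num, xi=3):
    return "partial" if num > xi else "complete"
```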
Further, in step 6), the partial occlusion branch tracks the target with the STRCF tracker but does not update the template or scale of the STRCF tracker, so as to avoid contamination of the template by background information.
Further, in step 7), the complete occlusion algorithm includes the following details:
the position estimation is carried out by combining the historical position information with a Kalman filter, and a specific formula position estimation prediction formula and a correction formula are shown as follows:
P k =(I-K k H)P k
wherein ,xk Is the system state at time k. A is a state transition matrix, B is a state control matrix, u k-1 For state control vector, P k And Q is a system state error and is an error covariance matrix. y is k Is an observation vector. H is a parameter matrix of the measurement system. The upper right sign "-" indicates that the state was inferred from the previous state k Is the kalman gain. The specific parameters areB=[0.125 0.125 0.5 0.5] T ,u k-1 =0.004,
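A self-contained sketch of the predict/correct cycle above, for a state x = [px, py, vx, vy]. The constant-velocity transition matrix A and the position-only observation matrix H are assumptions (the patent gives A and H only as images); B and u_{k-1} follow the text. Plain-Python matrix helpers keep it dependency-free.

```python
# Kalman predict/correct sketch matching the equations above. A and H below
# are ASSUMED (constant-velocity model, position-only measurement); B and
# u_{k-1} are the values given in the text.

def mm(A, B):  # matrix product
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

def madd(A, B): return [[x + y for x, y in zip(r, s)] for r, s in zip(A, B)]
def msub(A, B): return [[x - y for x, y in zip(r, s)] for r, s in zip(A, B)]
def tr(A): return [list(r) for r in zip(*A)]
def eye(n): return [[float(i == j) for j in range(n)] for i in range(n)]

def inv2(M):  # closed-form inverse of a 2x2 matrix
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]]  # assumed
B = [[0.125], [0.125], [0.5], [0.5]]          # from the text
H = [[1, 0, 0, 0], [0, 1, 0, 0]]              # observe position only (assumed)
u = 0.004                                     # u_{k-1} from the text

def predict(x, P, Q):
    x_pred = madd(mm(A, x), [[b[0] * u] for b in B])   # x_k^- = A x + B u
    P_pred = madd(mm(mm(A, P), tr(A)), Q)              # P_k^- = A P A^T + Q
    return x_pred, P_pred

def correct(x_pred, P_pred, y, R):
    S = madd(mm(mm(H, P_pred), tr(H)), R)              # innovation covariance
    K = mm(mm(P_pred, tr(H)), inv2(S))                 # Kalman gain K_k
    resid = msub(y, mm(H, x_pred))                     # y_k - H x_k^-
    x = madd(x_pred, mm(K, resid))
    P = mm(msub(eye(4), mm(K, H)), P_pred)             # P_k = (I - K_k H) P_k^-
    return x, P
```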
Relying on Kalman filtering alone for position estimation is not reliable, so the algorithm re-detects in the vicinity of the estimated position. Empirically, during the period in which the target is completely occluded, its position changes mainly horizontally with little vertical change; information is therefore collected around the estimated location for matching.
The search area is centered on the position estimated by Kalman filtering, with length equal to (n_1+n_2+1) times the height of the previous-frame target and width equal to (m_1+m_2+m_3) times its width. D = [Area_1, Area_2, ..., Area_i]: within the search area, a window the size of the previous-frame target slides with a step length of 19 to obtain each Area_i. m_1, m_2, n_1, n_2 are computed from the Kalman-estimated position, HEIGHT and WIDE, the size of the input image, and (t_w, t_h), the width and height of the previous-frame target.
Further, in step 8), the target scale is updated. When the target center coordinates are obtained in a new frame, image blocks are extracted at K scales, with K = 33 and a = 1.02; each scale image block has size sM × sN, where M × N is the size of the original target and s is the scale factor. A correlation filtering operation is performed on the image block of each size, and the scale with the maximum response is selected as the final target scale.
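The K = 33, a = 1.02 scale search above can be sketched as a DSST-style scale pyramid. The symmetric exponent range k ∈ {-(K-1)/2, ..., (K-1)/2} is an assumption consistent with that family of scale estimators; the response scoring is left abstract.

```python
# Sketch of the step-8 scale candidates: sizes a^k * (M, N) around the
# current target size. The symmetric exponent range is ASSUMED.

def scale_pyramid(m, n, K=33, a=1.02):
    half = (K - 1) // 2
    return [(a ** k * m, a ** k * n) for k in range(-half, half + 1)]
```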
Further, in step 9), the updating of the correlation filter includes extracting features on the target frame and obtaining the latest correlation filter by minimizing the loss function, where y is the ground-truth response label.
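For reference, the spatial-temporal regularized objective minimized by STRCF, as given in the cited Li et al. paper (the loss formula itself is not reproduced in the text above, so this is quoted from that paper rather than from the patent), is:

```latex
\arg\min_{f}\;\frac{1}{2}\Big\|\sum_{d=1}^{D} x_{t}^{d} * f^{d} - y\Big\|^{2}
+\frac{1}{2}\sum_{d=1}^{D}\big\|w \cdot f^{d}\big\|^{2}
+\frac{\mu}{2}\big\|f - f_{t-1}\big\|^{2}
```

where x_t^d is the d-th feature channel, * denotes correlation, w is the spatial regularization weight, and μ weights the temporal term that keeps the new filter f close to the previous filter f_{t-1}.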
In summary, the working principle of the invention is as follows:
Through the constructed quality evaluation function, whether the target is occluded can be accurately judged; when it is not occluded, STRCF performs target positioning and normal template and scale updating. When occlusion occurs, block matching judges whether the target is partially or completely occluded. STRCF localizes well under slight partial occlusion, so it is retained as the base filter for target positioning, but filter updating is suspended to avoid template contamination. When the target is completely occluded, a Kalman filter combined with historical position information estimates the position and demarcates a secondary detection area, within which precise positioning is performed by feature matching.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (2)
1. An anti-occlusion target tracking method based on STRCF is characterized by comprising the following steps:
1) Reading an image sequence, selecting a target to be tracked in an initial frame image, determining the center coordinates and the size of the target, initializing a tracker, forming an initial target template according to a target area, and initializing a Kalman filter;
2) Calculating the occlusion decision function value s_t;
3) Comparing the occlusion decision function value s_t obtained in step 2 with a preset threshold δ; if it is greater than the preset threshold δ, determining the degree of occlusion; otherwise, using the STRCF tracker to acquire the maximum correlation response value, outputting the target center coordinates corresponding to the maximum correlation response value, and performing steps 8 and 9;
4) If the occlusion decision function value compared in step 3 is greater than the threshold δ, determining the degree of occlusion with a block matching algorithm and calculating the block matching number Num;
5) If the block matching number Num calculated in step 4 is greater than the threshold ξ, judging that the target to be tracked is partially occluded; otherwise, it is completely occluded;
6) If step 5 judges that the target to be tracked is partially occluded, using the STRCF tracker to acquire the maximum correlation response value, outputting the target center coordinates corresponding to the maximum correlation response value, and performing steps 8 and 9; otherwise performing step 7;
7) If step 5 judges that the target to be tracked is completely occluded, estimating the target position with a Kalman filter, demarcating a secondary re-detection area around the estimated position, and performing precise positioning within that area using the feature information;
8) Updating the target scale;
9) Updating a correlation filter;
2. The anti-occlusion target tracking method based on STRCF of claim 1, wherein: in step 1), the frame selection of the target to be tracked comprises image denoising;
in step 2), the occlusion decision function value is calculated from the quality evaluation function q_t, where CRM is the correlation filter response map (obtained by correlating the correlation filter with the candidate image block), and μ and σ are respectively the mean and standard deviation of the sidelobe region, the 11×11 region centered on the maximum peak response point; z is a bias coefficient set to 0.1;
in step 3), the threshold δ is 1.55;
in step 4), the block matching number Num counts the blocks of the previous-frame target and the candidate target blocks of the current frame with a high matching degree; hist is a function that performs cosine similarity comparison on two blocks using color histogram information; the image block is uniformly divided into four equal blocks, B_t is the set of four blocks of the current-frame candidate, and B_{t-1} is the set of four blocks of the previous-frame target;
in step 5), the threshold ξ is 3;
in step 7), the state transition matrix is set to A, the state control matrix is set to B = [0.125 0.125 0.5 0.5]^T, the state control vector is set to u_{k-1} = 0.004, H is the parameter matrix of the measurement system, and P is the system state error covariance;
the secondary detection area is centered on the position estimated by Kalman filtering, with length equal to (n_1+n_2+1) times the height of the previous-frame target and width equal to (m_1+m_2+m_3) times its width; D = [Area_1, Area_2, ..., Area_i]: within the search area, a window the size of the previous-frame target slides with a step length of 19 to obtain each Area_i; m_1, m_2, n_1, n_2 are computed from the Kalman-estimated position, HEIGHT and WIDE, the size of the input image, and (t_w, t_h), the width and height of the previous-frame target.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910988383.0A CN110717934B (en) | 2019-10-17 | 2019-10-17 | Anti-occlusion target tracking method based on STRCF |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910988383.0A CN110717934B (en) | 2019-10-17 | 2019-10-17 | Anti-occlusion target tracking method based on STRCF |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110717934A CN110717934A (en) | 2020-01-21 |
CN110717934B true CN110717934B (en) | 2023-04-28 |
Family
ID=69211806
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910988383.0A Active CN110717934B (en) | 2019-10-17 | 2019-10-17 | Anti-occlusion target tracking method based on STRCF |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110717934B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111508002B (en) * | 2020-04-20 | 2020-12-25 | 北京理工大学 | Small-sized low-flying target visual detection tracking system and method thereof |
CN111860189B (en) * | 2020-06-24 | 2024-01-19 | 北京环境特性研究所 | Target tracking method and device |
CN113470074B (en) * | 2021-07-09 | 2022-07-29 | 天津理工大学 | Self-adaptive space-time regularization target tracking method based on block discrimination |
CN116228817B (en) * | 2023-03-10 | 2023-10-03 | 东南大学 | Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015163830A1 (en) * | 2014-04-22 | 2015-10-29 | Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi | Target localization and size estimation via multiple model learning in visual tracking |
CN105405151A (en) * | 2015-10-26 | 2016-03-16 | Xidian University | Anti-occlusion target tracking method based on particle filtering and weighted SURF |
CN107657630A (en) * | 2017-07-21 | 2018-02-02 | Nanjing University of Posts and Telecommunications | Improved anti-occlusion target tracking method based on KCF |
CN108198209A (en) * | 2017-12-22 | 2018-06-22 | Tianjin University of Technology | Pedestrian tracking algorithm for occlusion and scale variation |
Non-Patent Citations (1)
Title |
---|
Li F, et al. "Learning spatial-temporal regularized correlation filters for visual tracking." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018 (full text). *
Also Published As
Publication number | Publication date |
---|---|
CN110717934A (en) | 2020-01-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110717934B (en) | Anti-occlusion target tracking method based on STRCF | |
CN108876820B (en) | Moving target tracking method under occlusion conditions based on mean shift | |
CN110728697A (en) | Infrared dim target detection tracking method based on convolutional neural network | |
CN109434251B (en) | Welding seam image tracking method based on particle filtering | |
CN109785366B (en) | Correlation filtering target tracking method for occlusion | |
CN110610150B (en) | Tracking method, device, computing equipment and medium of target moving object | |
CN110555870B (en) | DCF tracking confidence evaluation and classifier updating method based on neural network | |
CN110175649B (en) | Rapid multi-scale estimation target tracking method for re-detection | |
CN103106667A (en) | Moving target tracking method for occlusion and scene change | |
CN111612817A (en) | Target tracking method based on depth feature adaptive fusion and context information | |
CN111192296A (en) | Pedestrian multi-target detection and tracking method based on video monitoring | |
CN111652790B (en) | Sub-pixel image registration method | |
CN106780567B (en) | Immune particle filter extension target tracking method fusing color histogram and gradient histogram | |
CN109255799B (en) | Target tracking method and system based on spatial adaptive correlation filter | |
CN112164093A (en) | Automatic person tracking method based on edge features and related filtering | |
CN112580476A (en) | Sperm identification and multi-target trajectory tracking method | |
CN111369570A (en) | Multi-target detection tracking method for video image | |
CN115049954A (en) | Target identification method, device, electronic equipment and medium | |
KR101690050B1 (en) | Intelligent video security system | |
CN111161308A (en) | Dual-band fusion target extraction method based on key point matching | |
CN111091583B (en) | Long-term target tracking method | |
Najafzadeh et al. | Object tracking using Kalman filter with adaptive sampled histogram | |
CN113092807A (en) | Urban elevated road vehicle speed measuring method based on multi-target tracking algorithm | |
CN110751671B (en) | Target tracking method based on kernel correlation filtering and motion estimation | |
CN112200831B (en) | Dynamic template-based dense connection twin neural network target tracking method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||