CN113537241A - Long-term correlation filtering target tracking method based on adaptive feature fusion - Google Patents

Long-term correlation filtering target tracking method based on adaptive feature fusion

Info

Publication number
CN113537241A
CN113537241A (application CN202110807192.7A)
Authority
CN
China
Prior art keywords
target
apce
detection
hog
response
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110807192.7A
Other languages
Chinese (zh)
Other versions
CN113537241B (en)
Inventor
甘玲
马煜
何鹏
刘菊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN202110807192.7A priority Critical patent/CN113537241B/en
Publication of CN113537241A publication Critical patent/CN113537241A/en
Application granted granted Critical
Publication of CN113537241B publication Critical patent/CN113537241B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features

Abstract

The invention relates to a long-term correlation filtering target tracking method based on adaptive feature fusion, and belongs to the field of computer vision. The method comprises the following steps: in the target position prediction stage, the HOG feature and the color histogram feature of each frame of image are extracted, and a response map of each feature is obtained with a kernel correlation filter and a color histogram model respectively; the peak of each feature response map is normalized to dynamically adjust the weights of the different features, realizing adaptive feature fusion, and the tracking target position is estimated from the fused feature response map. Meanwhile, two confidence detection indexes are set, and a detection filter and an SVM classifier are additionally trained to detect whether the current tracking target is reliable, i.e., whether the target is lost, and to relocate a lost target, thereby realizing long-term tracking of the target.

Description

Long-term correlation filtering target tracking method based on adaptive feature fusion
Technical Field
The invention belongs to the field of computer vision, and relates to a long-term correlation filtering target tracking method based on self-adaptive feature fusion.
Background
Target tracking is one of the key technologies in the field of computer vision and has many applications in real life, including intelligent surveillance, human-computer interaction, medical diagnosis, and national defense. Because actual tracking environments are complex and changeable, the target is susceptible to various interfering factors, and tracking may ultimately fail because the target is lost. Therefore, how to achieve stable long-term tracking of a target under the influence of multiple factors remains a challenging problem.
In recent years, target tracking algorithms based on correlation filtering have attracted the attention of many researchers due to their high accuracy and real-time performance. In 2010, the MOSSE algorithm proposed by Bolme et al. first introduced correlation filtering into the target tracking field and thereby started a wave of research on correlation filtering tracking algorithms. Henriques et al. proposed the CSK algorithm on the basis of the MOSSE algorithm, using the properties of the circulant matrix to sample densely and thus solving the problem of insufficient training samples. Subsequently, the KCF algorithm proposed by Henriques et al. replaced the single-channel gray-scale feature with the multi-channel HOG feature, which greatly improved tracking performance. Because a single feature has limited ability to describe the appearance of the target, Li et al. proposed the SAMF algorithm, which concatenates the gray feature, the CN feature and the HOG feature; Li et al. also proposed the LDES algorithm, which fuses HOG features and color histogram features by weighting and estimates the target scale and rotation angle in a log-polar coordinate system using a phase correlation method. In addition, for the case in which the target is lost during long-term tracking, Kalal et al. proposed the TLD algorithm, which combines detection and tracking; Ma et al. proposed the LCT algorithm, which re-detects the target with a random fern classifier, and later the LCT2 algorithm, which replaces the random fern classifier with an SVM.
However, the above-described methods have disadvantages:
1) Because a single feature has limited ability to describe the appearance of the target, most current methods track the target by fusing multiple features with fixed weights. However, the target is easily interfered with by various factors during actual tracking, i.e., every tracked frame is different; fixed-weight fusion cannot fully exploit the advantages of each feature, so this weight assignment method has clear limitations.
2) During long-term tracking, most existing methods update the model directly without effectively evaluating the tracking result of each frame. When the target is lost under the influence of various factors, the model learns wrong target appearance information, and as this wrong information accumulates, the model drifts and tracking finally fails.
Therefore, a method capable of solving the problem of target loss in the long-time tracking process of the target in a complex environment is needed.
Disclosure of Invention
In view of this, the present invention provides a long-term correlation filtering target tracking method based on adaptive feature fusion, which solves the problem of target loss during long-term tracking in a complex environment. Specifically, the invention adopts a dynamic feature-weight assignment scheme to realize adaptive feature fusion, sets two effective confidence detection indexes and a target re-detection module, and recovers tracking of the target after it is lost.
In order to achieve the purpose, the invention provides the following technical scheme:
a long-time correlation filtering target tracking method based on self-adaptive feature fusion is characterized in that an adopted tracking frame is a basic tracking frame of an LDES algorithm, and the time complexity and the instantaneity of the algorithm are high and poor in consideration of methods for estimating target rotation change and iteratively solving an optimal solution in the LDES algorithm, so that the basic tracking frame in the algorithm, namely a kernel correlation filter, a color histogram model and a scale estimation model, is adopted under the condition that the performances are sacrificed. The method specifically comprises the following steps:
1) A target position prediction stage: extracting the HOG feature and the color histogram feature of each frame of image, obtaining a response map of each feature with the kernel correlation filter and the color histogram model, calculating the normalized value of each feature's response peak, and dynamically adjusting the weight of each feature according to the normalized values of the response peaks of the different features, thereby realizing adaptive feature fusion and estimating the tracking target position from the fused feature response map;
2) A target confidence detection stage: two confidence detection indexes, namely the normalized value of the average peak-to-correlation energy (APCE) and the response peak, are set to judge whether the current tracking result is reliable. A detection filter (a kernel correlation filter trained only on HOG features) and an SVM classifier are additionally trained to detect whether the current tracking target is reliable, i.e., whether the target is lost, and to relocate a lost target. Because the detection filter and the SVM classifier are trained only on high-confidence tracking results, a long-term memory of the high-confidence target appearance is kept and tracking of a lost target can be recovered. Namely:
After the tracking target position is obtained in the target position prediction stage, it is detected with the detection filter to obtain a detection response map; the normalized APCE value and the detection response peak of this response map are then calculated and used to judge the confidence of the detection result. If the current frame is in a low-confidence state, it enters the target re-detection module, where the SVM classifier relocates the target, and the final target position is then determined. Finally, the image is converted into a log-polar coordinate system, the scale change of the target is estimated with a phase correlation method, and the various models are updated.
Further, in the target position prediction stage, the weight of each feature is dynamically adjusted according to the normalized values of the response peaks of the different features to realize adaptive feature fusion, which specifically comprises the following steps:
(1) The weights of the HOG feature and the color histogram feature are initialized as w_hog,1 = 0.6 and w_hist,1 = 0.4;
(2) The response results of the HOG feature and the color histogram feature of the current frame, namely f_hog,t and f_hist,t, are calculated with the kernel correlation filter and the color histogram model respectively;
(3) The response peaks of the HOG feature and the color histogram feature are normalized according to the normalized-value calculation formula to obtain μ_hog,t and μ_hist,t. Because the HOG feature has a strong characterization ability, when μ_hog,t or μ_hist,t is less than the threshold τ_1, w_hog,t can be set to a higher fixed value γ_1; otherwise, the weight calculation formulas of the features are:
w_hog,t = μ_hog,t / (μ_hog,t + μ_hist,t)
w_hist,t = μ_hist,t / (μ_hog,t + μ_hist,t)
(4) In order to prevent the feature weights from drifting, the feature weight of the t-th frame is updated according to the formulas:
w_hog,t = θ × w_hog,t + (1 − θ) × w_hog,1
w_hist,t = θ × w_hist,t + (1 − θ) × w_hist,1
wherein θ represents a feature weight update rate;
(5) The final fused response of the t-th frame image is: f_t = f_hog,t × w_hog,t + f_hist,t × w_hist,t.
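The weight-assignment steps above can be sketched in Python. This is a minimal illustration under stated assumptions, not the patented implementation: the ratio-of-normalized-peaks weight formula is an assumed reconstruction (the original weight formulas are published only as images), while the initial weights 0.6/0.4, the threshold τ_1 = 0.23, the fixed weight γ_1 = 0.7, and the update rate θ = 0.06 follow the parameters given in the verification experiment.

```python
import numpy as np

def fuse_responses(f_hog, f_hist, mu_hog, mu_hist,
                   w_hog_init=0.6, w_hist_init=0.4,
                   tau1=0.23, gamma1=0.7, theta=0.06):
    """Adaptive fusion of HOG and color-histogram response maps.

    f_hog, f_hist : 2-D response maps from the kernel correlation
        filter and the color histogram model.
    mu_hog, mu_hist : normalized response-peak values of the frame.
    Returns the fused response map and the (w_hog, w_hist) pair.
    """
    if mu_hog < tau1 or mu_hist < tau1:
        # Low-confidence frame: fall back to a fixed, HOG-dominant
        # weight, since HOG has the stronger characterization ability.
        w_hog = gamma1
    else:
        # Assumed formula: weight each feature by its share of the
        # normalized peak responses.
        w_hog = mu_hog / (mu_hog + mu_hist)
    w_hist = 1.0 - w_hog

    # Smooth toward the initial weights to prevent weight drift.
    w_hog = theta * w_hog + (1 - theta) * w_hog_init
    w_hist = theta * w_hist + (1 - theta) * w_hist_init

    return w_hog * f_hog + w_hist * f_hist, (w_hog, w_hist)
```

With equal normalized peaks the smoothed weights stay close to the initial 0.6/0.4 split; the small update rate θ anchors them to the initialization, which is exactly the anti-drift role step (4) describes.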
Further, the normalized value calculation formula is:
μ_t = f_max,t / ( (1/(t−1)) × Σ_{i=1}^{t−1} f_max,i )
where f_max,t denotes the response peak in the feature response map of the t-th frame, and μ_t denotes the normalized value of the feature response peak of the t-th frame.
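As a concrete reading of the normalization, the sketch below assumes the normalized value is the current response peak divided by the running average of the previous peaks; this is an assumption (the original formula is published only as an image) consistent with the role μ_t plays in the weight assignment, since a value well below 1 then signals a sudden drop of the response peak.

```python
def normalized_peak(peak_history):
    """Normalized value of the current response peak.

    `peak_history` lists the response peaks f_max,1 ... f_max,t in
    chronological order; the last entry is the current frame. The
    running-average normalization used here is an assumption.
    """
    *prev, current = peak_history
    if not prev:
        # First frame: no history to normalize against.
        return 1.0
    return current / (sum(prev) / len(prev))
```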
Further, in the target confidence detection stage: in a feature response map, the higher the response peak, the more stable the tracking result, and the lower the response peak, the less reliable the tracking result. In addition, the APCE value reflects the fluctuation of the response map; generally, the larger the APCE value, the more stable the tracking result, and when the APCE value drops suddenly, the target is considered lost. Therefore, the normalized APCE value and the response peak can measure the reliability of the current tracking target, i.e., they can be used as the target confidence detection indexes, wherein the average peak-to-correlation energy APCE and its normalized value μ_APCE,t are calculated as follows:
APCE = |F_max − F_min|² / mean( Σ_{w,h} (F_{w,h} − F_min)² )
μ_APCE,t = APCE_t / ( (1/(t−1)) × Σ_{i=1}^{t−1} APCE_i )
where F_max and F_min denote the maximum and minimum values in the response map respectively, F_{w,h} denotes the value at coordinate position (w, h) in the response map, μ_APCE,t denotes the normalized APCE value of the t-th frame detection response map, and APCE_t denotes the APCE value of the t-th frame detection response map;
After the tracking target passes through the detection filter, if the detection response peak F_max,t is less than the threshold T_f1, or the normalized APCE value μ_APCE,t is less than the threshold T_apce1, the target is in a low-confidence state, i.e., the target is lost and unreliable; in all other cases the target is reliable.
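The two confidence indexes can be computed as follows. The APCE expression follows the standard average peak-to-correlation energy definition; normalizing it by the mean of its own history is an assumption, and the thresholds T_f1 = 0.15 and T_apce1 = 0.15 are the values reported in the verification experiment.

```python
import numpy as np

def apce(response):
    """Average peak-to-correlation energy of a 2-D response map.

    APCE = |F_max - F_min|^2 / mean((F_{w,h} - F_min)^2).
    A large value indicates a sharp unimodal peak (stable tracking);
    a sudden drop suggests the target may be lost.
    """
    f_min = response.min()
    f_max = response.max()
    # Small epsilon guards against a perfectly flat map.
    return (f_max - f_min) ** 2 / (np.mean((response - f_min) ** 2) + 1e-12)

def is_confident(response, apce_history, t_f1=0.15, t_apce1=0.15):
    """Low-confidence test of the detection stage.

    `apce_history` holds APCE values of earlier frames; normalizing
    the current APCE by their mean is an assumed reading of the
    normalized value mu_APCE,t.
    """
    mu = apce(response) / np.mean(apce_history) if apce_history else 1.0
    return bool(response.max() >= t_f1 and mu >= t_apce1)
```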
Further, in the target confidence detection stage, the target re-detection specifically comprises: after the target is relocated with the SVM classifier, the relocated target is detected with the detection filter and its confidence is judged; if the detection response peak F_max,t of the current frame is greater than the threshold T_f2, or the normalized APCE value μ_APCE,t is greater than the threshold T_apce2, the relocated target is considered reliable; otherwise the original tracking target is kept. Meanwhile, the detection filter and the SVM classifier also need to be updated during tracking; to keep them effective, they are updated only when the target is in a high-confidence state, i.e., when the detection response peak F_max,t of the current frame is greater than the threshold T_f3 or the normalized APCE value μ_APCE,t is greater than the threshold T_apce3.
The invention has the beneficial effects that:
(1) In the target position prediction stage, the invention builds on fixed-weight multi-feature fusion with a method for dynamically assigning the feature weights, realizing adaptive feature fusion; this better exploits the advantages of each feature and achieves stable tracking of the target in complex scenes.
(2) After the predicted position of the target is obtained, a detection filter and an SVM classifier are additionally trained, and two confidence detection indexes are set to judge the confidence of the tracking result. If the confidence of the tracking result is too low, the current target is considered lost and enters the re-detection module, where the SVM classifier relocates the target, achieving long-term tracking of the target.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention may be realized and attained by the means of the instrumentalities and combinations particularly pointed out hereinafter.
Drawings
For the purposes of promoting a better understanding of the objects, aspects and advantages of the invention, reference will now be made to the following detailed description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a frame diagram of a long-term correlation filtering target tracking algorithm with adaptive feature fusion according to the present invention;
fig. 2 shows the evaluation results on the data set OTB-2015 in the verification experiment, wherein fig. 2(a) is a success rate chart and fig. 2(b) is a precision chart.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention in a schematic way, and the features in the following embodiments and examples may be combined with each other without conflict.
The drawings are for the purpose of illustrating the invention only and are not intended to limit it; to better illustrate the embodiments of the present invention, some parts of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings, and descriptions thereof, may be omitted.
Referring to fig. 1 to 2, a general framework of a long-term correlation filtering target tracking method based on adaptive feature fusion is shown in fig. 1, and the method includes the following steps:
Step one: the initial state of the target is b_0 = (x_0, y_0, s_0); initialize the feature weights w_hog,1 and w_hist,1, the kernel correlation filter F_hog, the color histogram model F_hist, the scale estimation model F_s, the detection filter F_R, and the re-detection module R;
Step two: in the t-th frame image I_t, crop an image block centered at (x_{t−1}, y_{t−1}), and extract the HOG feature and the color histogram feature of the image block;
Step three: compute the response maps f_hog,t and f_hist,t of the corresponding features with F_hog and F_hist respectively, and normalize the response peaks of the different features according to the normalized-value calculation formula to obtain μ_hog,t and μ_hist,t;
Step four: if μ_hog,t or μ_hist,t is less than τ_1, then w_hog,t = γ_1; otherwise compute w_hog,t and w_hist,t according to the weight calculation formulas; then update w_hog,t = θ × w_hog,t + (1 − θ) × w_hog,1 and w_hist,t = θ × w_hist,t + (1 − θ) × w_hist,1;
Step five: compute the fused feature response map f_t according to the formula f_t = f_hog,t × w_hog,t + f_hist,t × w_hist,t, and estimate the position pos = (x′_t, y′_t);
Step six: crop a new image block from I_t centered at (x′_t, y′_t), extract its HOG feature, compute the detection response map with F_R, and obtain the current frame's F1_max,t and μ1_APCE,t according to the APCE formulas;
Step seven: if μ1_APCE,t is less than T_apce1 or F1_max,t is less than T_f1, invoke R to obtain the relocated target position pos1 = (x′_t1, y′_t1) and repeat step six to obtain the current frame's F2_max,t and μ2_APCE,t; if μ2_APCE,t is greater than T_apce2 or F2_max,t is greater than T_f2, set pos = pos1; otherwise keep the original tracking result pos;
Step eight: estimate the target scale s_t with F_s;
Step nine: update F_hog, F_hist and F_s; if the current frame's μ3_APCE,t is greater than T_apce3 or F3_max,t is greater than T_f3, update F_R and R;
Step ten: if the sequence has not ended, go to step two; otherwise the algorithm terminates.
Verification experiment:
the data set used in this experiment was: OTB-2015;
the evaluation indexes are as follows: accuracy and success rate in OTB-2015 data set;
the experimental parameters were set as: detection filter FRHas a learning rate of 0.01, a feature weight update rate θ of 0.06, and a threshold τ10.23, weight γ10.7, confidence detection threshold Tapce1=0.15、Tapce=0.45、Tap=0.65、Tf1=0.15、Tf2=0.35、Tf3The rest parameters are consistent with the parameters in the base framework of the LDES algorithm as 0.55.
The experimental environment: a notebook computer with an Intel(R) Core(TM) i5-9300H CPU @ 2.4GHz, 16GB of memory, and a 64-bit Windows 10 operating system, running MATLAB 2017b.
The experimental results: as can be seen from FIG. 2, the precision and success rate of the target tracking of the present invention are superior to those of the other existing methods.
Finally, the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit the present invention, and although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions, and all of them should be covered by the claims of the present invention.

Claims (5)

1. A long-term correlation filtering target tracking method based on adaptive feature fusion, characterized by comprising the following steps:
1) A target position prediction stage: respectively extracting the HOG feature and the color histogram feature of each frame of image, obtaining a response map of each feature with a kernel correlation filter and a color histogram model, normalizing the peak of each feature response map, and dynamically adjusting the weight of each feature according to the normalized values of the response peaks of the different features, thereby realizing adaptive feature fusion and estimating the tracking target position from the fused feature response map;
2) A target confidence detection stage: after the tracking target position is obtained in the target position prediction stage, detecting it with a detection filter to obtain a detection response map, calculating the normalized APCE value and the detection response peak of the response map, and judging the confidence of the detection result; if the current frame is in a low-confidence state, entering a target re-detection module, relocating the target with an SVM classifier, and determining the final target position; finally converting into a log-polar coordinate system, estimating the scale change of the target with a phase correlation method, and updating the various models; wherein APCE denotes the average peak-to-correlation energy.
2. The long-term correlation filtering target tracking method according to claim 1, wherein in the target position prediction stage, the weight of each feature is dynamically adjusted according to the normalized value of the response peak value of different features, so as to implement adaptive fusion of the features, specifically comprising the following steps:
(1) The weights of the HOG feature and the color histogram feature are initialized as w_hog,1 and w_hist,1;
(2) The response results of the HOG feature and the color histogram feature of the current frame, namely f_hog,t and f_hist,t, are calculated with the kernel correlation filter and the color histogram model respectively;
(3) The response peaks of the HOG feature and the color histogram feature are normalized according to the normalized-value calculation formula to obtain μ_hog,t and μ_hist,t; when μ_hog,t or μ_hist,t is less than the threshold τ_1, w_hog,t is set to a high fixed value γ_1; otherwise, the weight calculation formulas of the features are:
w_hog,t = μ_hog,t / (μ_hog,t + μ_hist,t), w_hist,t = μ_hist,t / (μ_hog,t + μ_hist,t)
(4) in order to prevent the characteristic weight from drifting, the characteristic weight of the tth frame is updated according to the formula:
w_hog,t = θ × w_hog,t + (1 − θ) × w_hog,1
w_hist,t = θ × w_hist,t + (1 − θ) × w_hist,1
wherein θ represents a feature weight update rate;
(5) The final fused response of the t-th frame image is: f_t = f_hog,t × w_hog,t + f_hist,t × w_hist,t.
3. The long-term correlation filtering target tracking method according to claim 1 or 2, wherein the normalized value calculation formula is as follows:
μ_t = f_max,t / ( (1/(t−1)) × Σ_{i=1}^{t−1} f_max,i )
where f_max,t denotes the response peak in the feature response map of the t-th frame, and μ_t denotes the normalized value of the feature response peak of the t-th frame.
4. The long-term correlation filtering target tracking method according to claim 1, wherein in the target confidence detection stage, the confidence of the currently tracked target is measured by the normalized APCE value and the response peak, i.e., they are used as the target confidence detection indexes, wherein the average peak-to-correlation energy APCE and its normalized value μ_APCE,t are calculated as:
APCE = |F_max − F_min|² / mean( Σ_{w,h} (F_{w,h} − F_min)² )
μ_APCE,t = APCE_t / ( (1/(t−1)) × Σ_{i=1}^{t−1} APCE_i )
where F_max and F_min denote the maximum and minimum values in the response map respectively, F_{w,h} denotes the value at coordinate position (w, h) in the response map, μ_APCE,t denotes the normalized APCE value of the t-th frame detection response map, and APCE_t denotes the APCE value of the t-th frame detection response map;
after the tracking target passes through the detection filter, if the detection response peak F_max,t is less than the threshold T_f1, or the normalized APCE value μ_APCE,t is less than the threshold T_apce1, the target is in a low-confidence state, i.e., the target is lost and unreliable; in all other cases the target is reliable.
5. The long-term correlation filtering target tracking method according to claim 1, wherein in the target confidence detection stage, the target re-detection specifically comprises: after the target is relocated with the SVM classifier, the relocated target is detected with the detection filter and its confidence is judged; if the detection response peak F_max,t of the current frame is greater than the threshold T_f2, or the normalized APCE value μ_APCE,t is greater than the threshold T_apce2, the relocated target is considered reliable; otherwise the original tracking target is kept; meanwhile, the detection filter and the SVM classifier also need to be updated during tracking, and to keep them effective they are updated only when the target is in a high-confidence state, i.e., when the detection response peak F_max,t of the current frame is greater than the threshold T_f3 or the normalized APCE value μ_APCE,t is greater than the threshold T_apce3.
CN202110807192.7A 2021-07-16 2021-07-16 Long-term correlation filtering target tracking method based on adaptive feature fusion Active CN113537241B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110807192.7A CN113537241B (en) 2021-07-16 2021-07-16 Long-term correlation filtering target tracking method based on adaptive feature fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110807192.7A CN113537241B (en) 2021-07-16 2021-07-16 Long-term correlation filtering target tracking method based on adaptive feature fusion

Publications (2)

Publication Number Publication Date
CN113537241A true CN113537241A (en) 2021-10-22
CN113537241B CN113537241B (en) 2022-11-08

Family

ID=78099800

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110807192.7A Active CN113537241B (en) 2021-07-16 2021-07-16 Long-term correlation filtering target tracking method based on adaptive feature fusion

Country Status (1)

Country Link
CN (1) CN113537241B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117635665A (en) * 2024-01-25 2024-03-01 浙江航天润博测控技术有限公司 Anti-occlusion target tracking method based on correlation filtering

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3001353A2 (en) * 2014-09-29 2016-03-30 Ricoh Company, Ltd. Object tracking method and device as well as tracking feature selection method
WO2017152794A1 (en) * 2016-03-10 2017-09-14 Zhejiang Shenghui Lighting Co., Ltd. Method and device for target tracking
CN108665481A (en) * 2018-03-27 2018-10-16 西安电子科技大学 Multilayer depth characteristic fusion it is adaptive resist block infrared object tracking method
CN110796676A (en) * 2019-10-10 2020-02-14 太原理工大学 Target tracking method combining high-confidence updating strategy with SVM (support vector machine) re-detection technology
CN111008996A (en) * 2019-12-09 2020-04-14 华侨大学 Target tracking method through hierarchical feature response fusion

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
XIAOTIAN WANG, KAI ZHANG: "An Optimal Long-Term Aerial Infrared Object Tracking Algorithm With Re-Detection", IEEE ACCESS, 18 July 2019
卢湖川, 李佩霞: "目标跟踪算法综述" (A survey of object tracking algorithms), 《模式识别与人工智能》 (Pattern Recognition and Artificial Intelligence), vol. 31, no. 1, 6 March 2018
甘玲, 杨梦: "聚合支持向量机分类器的行人检测方法" (A pedestrian detection method aggregating support vector machine classifiers), 《计算机工程与应用》 (Computer Engineering and Applications), 25 June 2018

Also Published As

Publication number Publication date
CN113537241B (en) 2022-11-08

Similar Documents

Publication Publication Date Title
CN109583342B (en) Human face living body detection method based on transfer learning
US8774499B2 (en) Embedded optical flow features
CN108027972B (en) System and method for object tracking
CN108447080B (en) Target tracking method, system and storage medium based on hierarchical data association and convolutional neural network
CN111461038B (en) Pedestrian re-identification method based on layered multi-mode attention mechanism
US9798923B2 (en) System and method for tracking and recognizing people
CN112016445B (en) Monitoring video-based remnant detection method
CN105976397B (en) A kind of method for tracking target
CN110414439A (en) Anti- based on multi-peak detection blocks pedestrian tracting method
CN113516005B (en) Dance action evaluation system based on deep learning and gesture estimation
CN113537241B (en) Long-term correlation filtering target tracking method based on adaptive feature fusion
CN114330572A (en) Anomaly detection method and system based on contrast learning and computer storage medium
CN112200021A (en) Target crowd tracking and monitoring method based on limited range scene
CN101789128B (en) Target detection and tracking method based on DSP and digital image processing system
CN114972735A (en) Anti-occlusion moving target tracking device and method based on ROI prediction and multi-module learning
CN114155512A (en) Fatigue detection method and system based on multi-feature fusion of 3D convolutional network
CN107315997B (en) Sight orientation judgment method and system based on rapid feature point positioning
CN113436228A (en) Anti-blocking and target recapturing method of correlation filtering target tracking algorithm
CN111862160A (en) Target tracking method, medium and system based on ARM platform
CN108596057A (en) A kind of Information Security Management System based on recognition of face
CN116563345A (en) Multi-target tracking method based on point-surface matching association
CN113743231B (en) Video target detection avoidance system and method
EP3955166A2 (en) Training in neural networks
CN114494972A (en) Target tracking method and system combining channel selection and position optimization
CN114757967A (en) Multi-scale anti-occlusion target tracking method based on manual feature fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant