CN111931722B - Correlated filtering tracking method combining color ratio characteristics - Google Patents


Info

Publication number
CN111931722B
Authority
CN
China
Prior art keywords
target
model
color
frame
scale
Prior art date
Legal status
Active
Application number
CN202011006205.2A
Other languages
Chinese (zh)
Other versions
CN111931722A (en)
Inventor
姜山
郭晓龙
丁大勇
顾鹏
Current Assignee
Hangzhou Shiyu Intelligent Vision System Technology Co ltd
Original Assignee
Hangzhou Shiyu Intelligent Vision System Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Shiyu Intelligent Vision System Technology Co ltd
Priority to CN202011006205.2A
Publication of CN111931722A
Application granted
Publication of CN111931722B
Legal status: Active

Classifications

    • G06V20/42 — Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items, of sport video content
    • G06F18/214 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06T5/40 — Image enhancement or restoration using histogram techniques
    • G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/269 — Analysis of motion using gradient-based methods
    • G06T7/90 — Determination of colour characteristics
    • G06V10/56 — Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a correlated filtering tracking method combining color ratio characteristics, which comprises the following steps: initializing a video target area frame; initializing a target model and a target candidate model; extracting multi-channel features at the position of the target area frame; training a position filter and a scale filter; extracting multi-channel features from the target region frame in the next frame of the video; calculating a correlation response map, a color response map and a fusion response map; acquiring the displacement of the target relative to the previous frame and updating the position of the target through the displacement; calculating the scale response, acquiring the scale change of the target relative to the previous frame, and restoring the target area frame of the current frame; updating the parameters of the target model and the target candidate model; extracting multi-channel features at the position of the target area frame to update the position filter and the scale filter; and repeating the above steps to realize tracking of the target. The method makes full use of the color information of the target and the background while keeping the real-time performance of the correlation filtering tracking algorithm, and improves tracking robustness.

Description

Correlated filtering tracking method combining color ratio characteristics
[ technical field ]
The invention relates to the field of video moving target visual tracking, in particular to a related filtering tracking method combining color ratio characteristics.
[ background of the invention ]
Visual tracking of moving targets is of great significance in the field of unmanned aerial vehicle (UAV) vision. Visual tracking refers to estimating the state of a target, such as its position and size, in subsequent frames of a video given its state in the first frame. Visual tracking is a challenging problem in computer vision because the only prior knowledge is the state of the target in the first frame, with no explicit model of the target. Motion blur, occlusion, and shape and scale changes during target motion can cause the tracked target to be lost. In addition, the real-time requirement of UAV visual tracking places demands on the complexity of the tracking algorithm.
At present, discriminant tracking algorithms based on correlation filtering have achieved good results in the field of target tracking. A correlation filtering tracking algorithm exploits the fact that a circulant matrix is diagonalized by the Fourier transform, which greatly reduces computational complexity while expanding the number of training samples, yielding good tracking accuracy at high speed. However, the cyclic-shift assumption of the traditional correlation filtering tracking algorithm introduces an edge effect, so the model learns a large number of unreal negative samples and its discriminative power is reduced. The various regularized correlation filtering tracking algorithms proposed to address the edge effect destroy the closed-form solution of correlation filtering and reduce the real-time performance of the algorithm. Meanwhile, the correlation filtering tracking algorithm realizes discriminant tracking using the template-shape features of the target and cannot cope well with target deformation during tracking. Therefore, the correlation filtering tracking algorithm needs further research and improvement, so that the information of the target and the background is fully utilized and the tracking performance is further improved while real-time performance is maintained.
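For readers unfamiliar with the circulant-matrix property mentioned above, the speed-up can be illustrated with a short, hypothetical 1-D NumPy example (the signal, filter and length are arbitrary): correlating a filter with every cyclic shift of a signal — the dense circulant product — equals a single element-wise product in the Fourier domain.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)   # 1-D signal (stand-in for one row of samples)
f = rng.standard_normal(8)   # filter template

# Explicit approach: build all cyclic shifts of x (the circulant sample
# matrix) and take the inner product with f for each shift -> O(n^2).
shifts = np.stack([np.roll(x, k) for k in range(len(x))])
resp_explicit = shifts @ f

# FFT approach: the circulant matrix is diagonalised by the DFT, so the
# same n responses cost one forward/inverse transform -> O(n log n).
resp_fft = np.real(np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(f)))

assert np.allclose(resp_explicit, resp_fft)
```

The same identity, applied per feature channel in 2-D, is what lets the trackers discussed here evaluate the filter at every translation of the search window at once.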
[ summary of the invention ]
The main object of the invention is to provide a correlation filtering tracking algorithm combined with color ratio features, which fuses the color ratio feature into the correlation filtering tracking framework so as to make full use of the color information of the target and the background, improving the robustness of the tracking algorithm while keeping its real-time performance.
In order to achieve the purpose, the invention provides the following technical scheme:
the application discloses a correlated filtering tracking method combining color ratio characteristics, which comprises the following steps:
a) initializing a video target area frame;
b) initializing the parameters of the target model and the target candidate model required for extracting the color ratio features, wherein the target model and the target candidate model are both 16 × 16 × 16 color histograms; the target model counts the color distribution of the pixels x_i within the bandwidth h centered on the target center x_0, and is calculated as follows:

q[u] = C · Σ_i k(‖(x_i − x_0)/h‖²) · δ(b(x_i) − u)

wherein C is a normalization factor, k is a kernel function, δ is an impulse function, and b(x_i) is the histogram subscript corresponding to the color of pixel x_i; the target model adopts Epanechnikov kernel weighting; the statistical region of the target candidate model is enlarged s times compared with that of the target model and is not weighted, and is calculated as follows:

p_s[u] = C_s · Σ_{i=1}^{n_s} δ(b(x_i) − u)

wherein C_s is a normalization factor and n_s is the number of pixels within the bandwidth s·h centered on the target center x_0;
c) extracting 13 channel gradient histograms, color ratios and gray features at the positions of the target area frames, and respectively multiplying the extracted 13 channel gradient histograms, the color ratios and the gray features by corresponding weights to form multi-channel features;
d) training a position filter and a scale filter by adopting the multi-channel characteristics;
e) extracting multi-channel characteristics from a next frame target region frame of the video;
f) calculating a correlation response graph according to the multi-channel characteristics and the position filter, calculating a color response graph for the color ratio characteristics through an integral graph, and performing linear weighted fusion to obtain a fusion response graph;
g) obtaining the displacement of the target relative to the previous frame through the fused response image, updating the position of the target through the displacement, and extracting the multi-scale gradient histogram feature at the position;
h) calculating a scale response through the multi-scale gradient histogram feature and the scale filter, acquiring the scale change of a target relative to the previous frame through the scale response, and restoring a target region frame of the current frame through the displacement and the scale change relative to the previous frame;
i) updating parameters of a target model and a target candidate model required by extracting the color ratio characteristics;
j) extracting a multi-channel feature update position filter consisting of a gradient histogram, a color ratio and a gray level feature at the position of a target area frame, and extracting a multi-scale gradient histogram feature update scale filter;
k) and repeating the steps e) to j) to track the target.
Preferably, the region for extracting features in step c) and step j) includes the target and the surrounding background region, obtained by expanding the target region frame in all four directions by a margin determined by the target size, where m and n are the length and width of the target, respectively.
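As a concrete illustration of such a search window (the patent specifies the expansion margin by a formula that is not reproduced in this text, so a padding of half the target size per side is assumed here purely for illustration; the function name is likewise invented):

```python
def expanded_region(cx, cy, m, n, pad_ratio=0.5):
    """Return (x0, y0, w, h) of the feature-extraction window.

    (cx, cy): target centre; m, n: target length and width.
    pad_ratio is an assumed, illustrative margin -- the patent expands
    the frame in all four directions so that the window covers the
    target plus some surrounding background.
    """
    w = m * (1 + 2 * pad_ratio)
    h = n * (1 + 2 * pad_ratio)
    return (cx - w / 2, cy - h / 2, w, h)

# A 40x20 target centred at (100, 100) yields an 80x40 window.
assert expanded_region(100, 100, 40, 20) == (60.0, 80.0, 80.0, 40.0)
```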
Preferably, in steps c) and j), using the target model and the target candidate model, the color ratio feature of a pixel point is calculated as

CR_s[u] = q[u] / p_s[u]

wherein CR_s[u] is the color ratio feature value for a pixel whose color is u, q[u] and p_s[u] are the values of the corresponding target model and target candidate model, and s is the expansion multiple of the statistical region of the target candidate model compared with the region of the target model; when the obtained feature map is added to the multi-channel feature composed of multiple features, it is down-sampled to one quarter of its original size.
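The colour-ratio pipeline described above can be sketched in NumPy. This is a hedged illustration, not the patent's exact procedure: the bin mapping, the ratio form q[u]/p_s[u] with a small epsilon, and the toy image are all assumptions, and the quarter-size down-sampling step is omitted.

```python
import numpy as np

def color_bin(img, bins=16):
    """Map each RGB pixel (uint8) to a flat index of a 16x16x16 histogram."""
    q = (img.astype(np.int32) * bins) // 256           # per-channel bin
    return (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]

def target_model(patch, bins=16):
    """Epanechnikov-weighted colour histogram q[u] of the target patch."""
    h, w = patch.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # normalised squared distance to the patch centre (bandwidth = half size)
    r2 = ((ys - (h - 1) / 2) / (h / 2))**2 + ((xs - (w - 1) / 2) / (w / 2))**2
    k = np.maximum(0.0, 1.0 - r2)                      # Epanechnikov kernel
    hist = np.bincount(color_bin(patch, bins).ravel(),
                       weights=k.ravel(), minlength=bins**3)
    return hist / hist.sum()

def candidate_model(region, bins=16):
    """Unweighted colour histogram p_s[u] over the s-times-enlarged region."""
    hist = np.bincount(color_bin(region, bins).ravel(),
                       minlength=bins**3).astype(float)
    return hist / hist.sum()

def color_ratio_map(region, q, p_s, eps=1e-8):
    """Per-pixel colour-ratio feature: look up q[u]/p_s[u] at each pixel."""
    cr = q / (p_s + eps)
    return cr[color_bin(region)]

# Toy example: red 16x16 target centred in a green 48x48 region (s ~ 3).
bg = np.zeros((48, 48, 3), np.uint8)
bg[..., 1] = 200                      # green background
bg[16:32, 16:32] = (200, 0, 0)        # red target
q   = target_model(bg[16:32, 16:32])
p_s = candidate_model(bg)
cr  = color_ratio_map(bg, q, p_s)
# the ratio responds much more strongly on the target than on the background
assert cr[24, 24] > cr[4, 4]
```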
Preferably, in step f), the correlation response map and the color response map are fused by linear weighting, with the formula:

r = (1 − γ) · r_cf + γ · r_cr

wherein r_cf is the correlation response map, r_cr is the color ratio response map, and γ is the fusion scaling factor.
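A minimal sketch of the linear weighted fusion and the subsequent peak search (the weight 0.3 and the toy response maps are illustrative values, not taken from the patent):

```python
import numpy as np

def fuse(resp_cf, resp_cr, gamma=0.3):
    """Linear weighted fusion r = (1 - gamma) * r_cf + gamma * r_cr.

    gamma is the fusion scaling factor; 0.3 is an illustrative value.
    """
    return (1.0 - gamma) * resp_cf + gamma * resp_cr

r_cf = np.array([[0.1, 0.9], [0.2, 0.3]])   # correlation response map
r_cr = np.array([[0.8, 0.2], [0.1, 0.4]])   # colour-ratio response map
r = fuse(r_cf, r_cr)
assert np.allclose(r, 0.7 * r_cf + 0.3 * r_cr)

# the target displacement is read off the peak of the fused map
dy, dx = np.unravel_index(np.argmax(r), r.shape)
assert (dy, dx) == (0, 1)
```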
In step i), the target model and the target candidate model are updated linearly to cope with target and background changes, with the linear update formulas:

q_t = (1 − η) · q_{t−1} + η · q′_t
p_t = (1 − η) · p_{t−1} + η · p′_t

wherein η is the update rate, t is time, q′_t and p′_t are the target model and target candidate model newly extracted at the t-th frame, q_t and p_t are the updated target model and target candidate model of the t-th frame, and q_{t−1} and p_{t−1} are the target model and target candidate model of the (t−1)-th frame.
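The linear model update can be written in a few lines (the update rate 0.02 and the histogram values are illustrative assumptions):

```python
import numpy as np

def linear_update(prev, current, eta=0.02):
    """q_t = (1 - eta) * q_{t-1} + eta * q'_t  (likewise for p_t).

    eta is the update rate; 0.02 is an illustrative value.
    """
    return (1.0 - eta) * prev + eta * current

q_prev = np.array([0.5, 0.5, 0.0])   # model from frame t-1
q_new  = np.array([0.2, 0.6, 0.2])   # histogram observed at frame t
q_t = linear_update(q_prev, q_new)
assert np.allclose(q_t, [0.494, 0.502, 0.004])
```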
The invention has the following beneficial effects: the invention provides a correlation filtering tracking method combining color ratio features, which establishes color histogram models for the target area and the background area, extracts the color ratio feature from the image, and fuses it into the correlation filtering tracking framework, thereby making full use of the color information of the target area and the background area while keeping the real-time performance of the correlation filtering tracking algorithm, and improving tracking robustness.
The features and advantages of the present invention will be described in detail by embodiments in conjunction with the accompanying drawings.
[ description of the drawings ]
FIG. 1 is a flow chart illustrating the steps of a correlation filtering tracking method incorporating color ratio characterization according to the present invention;
[ detailed description ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and examples. It should be understood, however, that the description herein of specific embodiments is only intended to illustrate the invention and not to limit the scope of the invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
Referring to fig. 1, the related filtering tracking method combining color ratio features of the present invention includes the following steps:
1) initializing a video target area frame;
2) initializing the parameters of the target model and the target candidate model required for extracting the color ratio features, wherein the target model and the target candidate model are both 16 × 16 × 16 color histograms; the target model counts the color distribution of the pixels x_i within the bandwidth h centered on the target center x_0, calculated as follows:

q[u] = C · Σ_i k(‖(x_i − x_0)/h‖²) · δ(b(x_i) − u)

wherein C is a normalization factor, k is a kernel function, δ is an impulse function, and b(x_i) is the histogram subscript corresponding to the color of pixel x_i; the target model adopts Epanechnikov kernel weighting; the statistical region of the target candidate model is enlarged s times compared with that of the target model and is not weighted, calculated as follows:

p_s[u] = C_s · Σ_{i=1}^{n_s} δ(b(x_i) − u)

wherein C_s is a normalization factor and n_s is the number of pixels within the bandwidth s·h centered on the target center x_0; in step 2), the target model counts the color distribution in the target area, and the expansion coefficient of the target candidate model is s = 3, that is, the color distribution is counted in a region 3 times the size, which includes the target and the background;
3) extracting the multi-channel features composed of gradient histogram, color ratio and gray features at the position of the target region frame, wherein the region for extracting the features includes the target and the surrounding background region, obtained by expanding the target region frame in all four directions by a margin determined by the target size, where m and n are the length and width of the target, respectively.
In step 3), using the target model and the target candidate model, the color ratio feature of a pixel point is calculated as

CR_s[u] = q[u] / p_s[u]

wherein CR_s[u] is the color ratio feature value for a pixel whose color is u, q[u] and p_s[u] are the values of the corresponding target model and target candidate model, and s is the expansion multiple of the statistical region of the target candidate model compared with the region of the target model. When the obtained feature map is added to the multi-channel feature composed of multiple features, it is down-sampled to one quarter of its original size so as to keep the same size as the other features. The 13-channel HOG feature removes the contrast-sensitive feature channels from the existing 31-channel HOG feature, which reduces the number of channels and improves the calculation efficiency. The three types of features are each multiplied by corresponding weights when forming the multi-channel feature, so as to adjust the proportion of each type of feature in the multi-channel feature and improve the feature discrimination. Together the three features constitute a 15-channel multi-channel tracking feature. Denote the multi-channel feature by x = {x^1, …, x^d} and the trained correlation filter by f = {f^1, …, f^d}; the loss function is as follows:

ε = ‖ Σ_{l=1}^{d} f^l ⋆ x^l − y ‖² + λ · Σ_{l=1}^{d} ‖ f^l ‖²

wherein ⋆ is the circular correlation operation, y is the training label of the correlation filter, and λ is the regularization term. By converting the loss function to the frequency domain through Parseval's theorem, a closed solution can be obtained:

F^l = ( Ȳ ⊙ X^l ) / ( Σ_{k=1}^{d} X̄^k ⊙ X^k + λ )

wherein a capital letter denotes the Fourier transform of the corresponding lower-case quantity, the bar denotes complex conjugation, and the multiplication and division are both element-by-element. In the multi-scale feature, the number of scales is 33;
4) training a position filter and a scale filter by adopting the multi-channel characteristics;
5) extracting multi-channel characteristics from a next frame target region frame of the video;
6) calculating the correlation response map according to the multi-channel features and the position filter, calculating the color response map for the color ratio feature through an integral image, and performing linear weighted fusion to obtain the fusion response map.
Denote the multi-channel feature extracted in step 5) by z = {z^1, …, z^d}; cross-correlating it with the correlation filter trained in step 4) gives the correlation response:

r_cf = F⁻¹( Σ_{l=1}^{d} F̄^l ⊙ Z^l )

wherein F⁻¹ is the inverse Fourier transform. The correlation response map and the color response map are fused by linear weighting, namely:

r = (1 − γ) · r_cf + γ · r_cr

wherein r_cf is the correlation response map, r_cr is the color ratio response map, and γ is the fusion scaling factor. The displacement of the target relative to the previous frame is obtained from the position of the peak of r;
7) obtaining the displacement of the target relative to the previous frame through the fused response image, updating the position of the target through the displacement, and extracting the multi-scale gradient histogram feature at the position;
8) calculating a scale response through the multi-scale gradient histogram feature and the scale filter, acquiring the scale change of a target relative to the previous frame through the scale response, and restoring a target region frame of the current frame through the displacement and the scale change relative to the previous frame;
9) updating the target model and target candidate model parameters required for extracting the color ratio features, wherein the model parameters are updated in a linear manner:

q_t = (1 − η) · q_{t−1} + η · q′_t
p_t = (1 − η) · p_{t−1} + η · p′_t

wherein η is the update rate, t is time, q′_t and p′_t are the target model and target candidate model newly extracted at the t-th frame, q_t and p_t are the updated target model and target candidate model of the t-th frame, and q_{t−1} and p_{t−1} are the target model and target candidate model of the (t−1)-th frame;
10) extracting the multi-channel feature composed of gradient histogram, color ratio and gray features at the position of the target area frame to update the position filter, and extracting the multi-scale gradient histogram feature to update the scale filter. Denote the numerator of the multi-channel correlation filter by A^l and the denominator by B; the update of the multi-channel correlation filter is as follows:

A_t^l = (1 − η) · A_{t−1}^l + η · Ȳ ⊙ X_t^l
B_t = (1 − η) · B_{t−1} + η · Σ_{k=1}^{d} X̄_t^k ⊙ X_t^k
F_t^l = A_t^l / ( B_t + λ )
11) and repeating the steps 5) to 10) to realize the tracking of the target.
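The frequency-domain training, response and update steps above can be sketched as follows — a hedged, single-scale NumPy illustration in the DSST-style numerator/denominator form, with illustrative regularisation and update-rate values and randomly generated features (the 33-scale search and the colour-response fusion are omitted):

```python
import numpy as np

lam, eta = 1e-2, 0.025   # regularisation and update rate (illustrative)

def train(feats, y):
    """Closed-form multi-channel filter: A^l = conj(Y) X^l, B = sum_k conj(X^k) X^k."""
    Y = np.fft.fft2(y)
    X = np.fft.fft2(feats, axes=(1, 2))            # (channels, H, W)
    A = np.conj(Y)[None] * X
    B = np.sum(np.conj(X) * X, axis=0).real
    return A, B

def respond(A, B, feats):
    """Correlation response r = F^{-1}( sum_l conj(A^l) Z^l / (B + lam) )."""
    Z = np.fft.fft2(feats, axes=(1, 2))
    num = np.sum(np.conj(A) * Z, axis=0)
    return np.real(np.fft.ifft2(num / (B + lam)))

def update(A, B, A_new, B_new):
    """Linear interpolation of numerator and denominator with rate eta."""
    return (1 - eta) * A + eta * A_new, (1 - eta) * B + eta * B_new

# toy check: train on a feature stack with a Gaussian label centred at (0, 0);
# responding to a cyclically shifted copy moves the response peak by the shift.
rng = np.random.default_rng(0)
feats = rng.standard_normal((3, 32, 32))
yy, xx = np.mgrid[0:32, 0:32]
d = np.minimum(yy, 32 - yy)**2 + np.minimum(xx, 32 - xx)**2
y = np.exp(-d / (2 * 2.0**2))                      # wrapped Gaussian label
A, B = train(feats, y)
# updating with the same frame leaves the filter unchanged
A, B = update(A, B, *train(feats, y))
r = respond(A, B, np.roll(feats, (5, 7), axis=(1, 2)))
assert np.unravel_index(np.argmax(r), r.shape) == (5, 7)
```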
To verify the effectiveness of the improvements of the present invention, the performance of the algorithm was tested on the public dataset OTB-2015 and compared with some existing algorithms, including SRDCF, Staple, LCT, SAMF, fDSST, KCF, CN and CSK. The experimental machine is configured with an Intel Core(TM) i7-8700 processor with a main frequency of 3.19 GHz and 16.00 GB of memory.
Tracking algorithm   P20     AUC     FPS
CRCF                 0.804   0.587   111.85
CRCF*                0.817   0.605   143.72
SRDCF                0.788   0.597   8.68
Staple               0.784   0.579   100.33
LCT2                 0.762   0.527   32.48
SAMF                 0.752   0.541   28.49
fDSST                0.725   0.554   113.11
KCF                  0.696   0.477   354.57
CN                   0.594   0.422   307.35
CSK                  0.518   0.382   602.82
The table above shows the performance of these algorithms on the public dataset OTB-2015, wherein P20 is the value of the precision curve at the 20-pixel threshold, used to evaluate the positioning accuracy of the tracker's center point, and AUC is the area under the success-rate curve, used to evaluate the accuracy of the tracker's estimate of the target area. The positioning accuracy of the CRCF and CRCF* algorithms is respectively 2.03% and 3.68% higher than that of the SRDCF algorithm, which introduces spatial-domain regularization, proving that the correlation filtering tracking method combining color ratio features provided by the invention can effectively utilize the color information of the target and the background and improve tracking robustness.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents or improvements made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (4)

1. A correlation filtering tracking method in combination with color ratio characterization, comprising the steps of:
a) initializing a video target area frame;
b) initializing the parameters of the target model and the target candidate model required for extracting the color ratio features, wherein the target model and the target candidate model are both 16 × 16 × 16 color histograms; the target model counts the color distribution of the pixels x_i within the bandwidth h centered on the target center x_0, calculated as follows:

q[u] = C · Σ_i k(‖(x_i − x_0)/h‖²) · δ(b(x_i) − u)

wherein C is a normalization factor, k is a kernel function, δ is an impulse function, and b(x_i) is the histogram subscript corresponding to the color of pixel x_i; the target model adopts Epanechnikov kernel weighting; the statistical region of the target candidate model is enlarged s times compared with that of the target model and is not weighted, calculated as follows:

p_s[u] = C_s · Σ_{i=1}^{n_s} δ(b(x_i) − u)

wherein C_s is a normalization factor and n_s is the number of pixels within the bandwidth s·h centered on the target center x_0;
c) extracting 13 channel gradient histograms, color ratios and gray features at the positions of the target area frames, and respectively multiplying the extracted 13 channel gradient histograms, the color ratios and the gray features by corresponding weights to form multi-channel features;
d) training a position filter and a scale filter by adopting the multi-channel characteristics;
e) extracting multi-channel characteristics from a next frame target region frame of the video;
f) calculating a correlation response graph according to the multi-channel characteristics and the position filter, calculating a color response graph for the color ratio characteristics through an integral graph, and performing linear weighted fusion to obtain a fusion response graph;
g) obtaining the displacement of the target relative to the previous frame through the fused response image, updating the position of the target through the displacement, and extracting the multi-scale gradient histogram feature at the position;
h) calculating a scale response through the multi-scale gradient histogram feature and the scale filter, acquiring the scale change of a target relative to the previous frame through the scale response, and restoring a target region frame of the current frame through the displacement and the scale change relative to the previous frame;
i) updating parameters of a target model and a target candidate model required by extracting the color ratio characteristics;
j) extracting a multi-channel characteristic updating position filter consisting of a gradient histogram, a color ratio and a gray characteristic from the position of the target area frame of the current frame in the step h), and extracting a multi-scale gradient histogram characteristic updating scale filter;
k) and repeating the steps e) to j) to track the target.
2. A method for correlation filter tracking in conjunction with color ratio characterization, as claimed in claim 1, wherein: the regions of the extracted features in step c) and step j) include the target and the surrounding background region, obtained by expanding the target region frame in all four directions by a margin determined by the target size, wherein m and n are the length and width of the target, respectively.
3. A method of correlation filter tracking incorporating color ratio characterization as claimed in claim 1, wherein: in steps c) and j), using the target model and the target candidate model, the color ratio feature of a pixel point is calculated as

CR_s[u] = q[u] / p_s[u]

wherein CR_s[u] is the color ratio feature value for a pixel whose color is u, q[u] and p_s[u] are the values of the corresponding target model and target candidate model, and s is the expansion multiple of the statistical region of the target candidate model compared with the region of the target model; when the obtained feature map is added to the multi-channel feature composed of multiple features, it is down-sampled to one quarter of its original size.
4. A method for correlation filter tracking in conjunction with color ratio characterization, as claimed in claim 1, wherein: the fusion mode of the correlation response map and the color response map in step f) is linear weighted fusion, with the formula:

r = (1 − γ) · r_cf + γ · r_cr

wherein r_cf is the correlation response map, r_cr is the color ratio response map, and γ is the fusion scaling factor; in step i), the target model and the target candidate model are updated linearly to cope with target and background changes, with the linear update formulas:

q_t = (1 − η) · q_{t−1} + η · q′_t
p_t = (1 − η) · p_{t−1} + η · p′_t

wherein η is the update rate, t is time, q′_t and p′_t are the target model and target candidate model newly extracted at the t-th frame, q_t and p_t are the updated models of the t-th frame, and q_{t−1} and p_{t−1} are the target model and target candidate model of the (t−1)-th frame.
CN202011006205.2A 2020-09-23 2020-09-23 Correlated filtering tracking method combining color ratio characteristics Active CN111931722B (en)


Publications (2)

Publication Number Publication Date
CN111931722A CN111931722A (en) 2020-11-13
CN111931722B true CN111931722B (en) 2021-02-12

Family

ID=73334038


Families Citing this family (3)

• CN112329784A — Correlation filtering tracking method based on space-time perception and multimodal response
• CN113393493B — Target object tracking method and device
• CN113706580B — Target tracking method, system, equipment and medium based on correlation filtering tracker

Citations (2)

• CN107748873A — Multimodal target tracking method fusing background information
• CN109285179A — Moving target tracking method based on multi-feature fusion

Family Cites Families (4)

• CN106408579B — Video-based pinching fingertip tracking method
• CN110827319B — Improved Staple target tracking method based on locality-sensitive histogram
• CN109410246B — Visual tracking method and device based on correlation filtering
• CN111260689B — Confidence enhancement-based correlation filtering visual tracking method

Also Published As

Publication number Publication date
CN111931722A (en) 2020-11-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant