CN102930558B - Real-time tracking method for infrared image target with multi-feature fusion - Google Patents
Abstract
The invention discloses a real-time tracking method for an infrared image target with multi-feature fusion. The method comprises the following steps: initializing the target tracking point position; initializing the target model; calculating the target candidate model; calculating the joint-feature Bhattacharyya coefficient and the per-feature weight coefficients; calculating the new tracking position of the target in the current frame; computing the joint-feature Bhattacharyya coefficient at the new position; and comparing the two joint-feature Bhattacharyya coefficients and outputting the result. The method adaptively computes the weight coefficients between multiple features, which enhances the robustness and stability of target tracking, resolves the tracking-point drift caused by an unstable single feature, and effectively improves tracking accuracy.
Description
Technical field
The present invention relates to a multi-feature-fusion infrared image target tracking method, and in particular to one suitable for real-time hardware implementation.
Background art
In recent years, with the development of integrated circuits and infrared materials, infrared imaging technology has advanced considerably and is widely used in national defense and the national economy. Compared with visible-light images, however, infrared images have a relatively low signal-to-noise ratio, so only limited information is available for infrared target detection and tracking. Because target features in infrared images are weak and background clutter is strong, accurate tracking of infrared targets is difficult.
Current target tracking algorithms fall into two broad classes: model-based and appearance-based. Compared with model-based methods, appearance-based tracking avoids the complexity of building an explicit model and has wider practical engineering value. Among appearance-based methods, the mean-shift tracking algorithm is widely used because it is simple, robust, and fast. Mean shift is a nonparametric density estimation method that iteratively searches for the mode most similar to the sample distribution. Comaniciu et al. proposed a mean-shift tracking algorithm that maximizes the similarity between the target color histogram and the candidate color histogram. Chu et al. used a Kalman filter to predict the initial iteration position of mean shift, but when the target is severely occluded, the position found by the mean-shift search is inaccurate and deviates from the truth. Collins et al. proposed an adaptive method that selects easily discriminable color features, where the candidate feature set comprises 49 features computed from linear combinations of the pixel R, G, B values; because the candidate set is large, the computational cost of feature selection is also high.
Existing target tracking algorithms therefore have the following shortcomings: (1) classical algorithms describe the target with a single feature and have poor anti-interference capability; (2) most existing multi-feature algorithms compute the inter-feature weight coefficients from the current frame only, so they lack robustness when the target undergoes complex changes; (3) most existing algorithms lose accuracy, or even lose the target, under non-rigid deformation, partial occlusion, and overlap; (4) most existing algorithms improve accuracy at the cost of greatly increased complexity, making real-time hardware implementation difficult.
Summary of the invention
Object of the invention: to address the deficiencies of the prior art, the present invention provides a real-time multi-feature-fusion tracking method for infrared image targets.
In order to solve the above technical problems, the invention discloses a real-time multi-feature-fusion tracking method for infrared image targets, comprising the following steps:
(1) Initialize the target tracking point position y0; the initial tracking point is specified manually;
(2) Initialize the target model: centered on the initial tracking point y0, build the target gray model q1 and the target LBP (local binary pattern) texture model q2;
(3) Calculate the target candidate model: from the tracking point position y0, compute the candidate gray model p1(y0) and the candidate LBP texture model p2(y0);
(4) Using the gray-feature Bhattacharyya coefficient ρ1 (for the Bhattacharyya coefficient, see Visual C++ Digital Image Processing, Xie Fengying, Electronic Industry Press, 1st ed., 2008, p. 466) and the LBP-texture Bhattacharyya coefficient ρ2, together with the gray-feature weight α1 and the LBP-texture weight α2, compute the joint-feature Bhattacharyya coefficient ρ at position y0:
ρ = α1·ρ1 + α2·ρ2;
(5) Calculate the new target position y1 in the current frame;
(6) Using the gray-feature Bhattacharyya coefficient ρ′1 and the LBP-texture Bhattacharyya coefficient ρ′2, together with the gray-feature weight α′1 and the LBP-texture weight α′2, compute the joint-feature Bhattacharyya coefficient ρ′ at position y1:
ρ′ = α′1·ρ′1 + α′2·ρ′2,
(7) If ρ′ < ρ, set y1 ← (y0 + y1)/2; otherwise y1 remains unchanged;
(8) If |y0 − y1| < ε, stop the calculation; otherwise assign y1 to y0 and return to step (3), where ε is an error-tolerance constant.
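Steps (5) to (8) form an iterative refinement loop. The sketch below illustrates the validation part, steps (6) to (8), in Python; the back-off y1 ← (y0 + y1)/2 for step (7) and the callback `rho_fn` (returning the joint-feature Bhattacharyya coefficient at a position) are illustrative assumptions, not the patent's literal formulation.

```python
def track_iteration_step(y0, y1, rho_fn, eps=0.01, max_halvings=10):
    """Validate the new position y1 against the old position y0.

    rho_fn(y) returns the joint-feature Bhattacharyya coefficient at y.
    While the similarity at y1 is worse than at y0, y1 is pulled halfway
    back towards y0 (the assumed step-(7) back-off); convergence is
    declared when |y1 - y0| < eps.  Positions are (row, col) floats.
    """
    rho0 = rho_fn(y0)
    for _ in range(max_halvings):
        if rho_fn(y1) >= rho0:
            break
        y1 = ((y0[0] + y1[0]) / 2.0, (y0[1] + y1[1]) / 2.0)
    converged = abs(y0[0] - y1[0]) + abs(y0[1] - y1[1]) < eps
    return y1, converged
```

When the similarity at y1 is already no worse than at y0, the position is accepted unchanged; otherwise the halving pulls the estimate back towards the last validated position.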
In step (2), the target gray model q1 is the kernel-weighted gray histogram
q_u = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b1(x_i) − u], u = 1, …, m1,
and the target LBP texture model q2 is the kernel-weighted texture histogram
q_v = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b2(x_i) − v], v = 1, …, m2,
where q_u is the probability density of each level of the gray feature of the target model, q_v is the probability density of each level of the LBP texture feature, m1 is the number of quantization levels of the gray feature, m2 is the number of quantization levels of the LBP texture feature, u is the gray quantization level, and v is the texture quantization level; x_i are the pixels of the target region, k(·) is the kernel profile, h is the kernel bandwidth, b1(·) and b2(·) map a pixel to its gray and texture bin indices, δ is the Kronecker delta, and C is a normalization constant.
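Both q1 and q2 are kernel-weighted normalized histograms of a quantized feature. A minimal sketch, where `kernel_histogram` is a hypothetical helper and the quantized `values` array stands in for the bin maps b1(x_i) or b2(x_i):

```python
import numpy as np

def kernel_histogram(values, cy, cx, h, n_bins):
    """Kernel-weighted normalised histogram of a feature patch.

    `values` holds the quantised feature (grey level or LBP code) of an
    image patch; pixels are weighted by the Epanechnikov profile of their
    normalised squared distance to the centre (cy, cx), then the
    histogram is normalised to sum to 1 (the constant C in the text).
    """
    values = np.asarray(values)
    H, W = values.shape
    yy, xx = np.mgrid[0:H, 0:W]
    r2 = ((yy - cy) ** 2 + (xx - cx) ** 2) / float(h * h)
    w = np.where(r2 <= 1.0, 1.0 - r2, 0.0)   # Epanechnikov profile
    hist = np.bincount(values.ravel(), weights=w.ravel(), minlength=n_bins)
    return hist / hist.sum()
```

The same helper would serve for the target model (patch centered on y0) and for each candidate model evaluation.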
In step (3), the candidate gray model p1 is
p_u(y0) = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b1(x_i) − u], u = 1, …, m1,
and the candidate LBP texture model p2 is
p_v(y0) = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b2(x_i) − v], v = 1, …, m2,
where p_u is the probability density of each level of the gray feature of the candidate model, p_v is the probability density of each level of the LBP texture feature of the candidate model, m1 is the number of quantization levels of the gray feature, m2 is the number of quantization levels of the LBP texture feature, u is the gray quantization level, and v is the texture quantization level; the sums run over the pixels x_i of the candidate region centered on y0.
The gray-feature weight α1 and the LBP-texture weight α2 in step (4), and the gray-feature weight α′1 and the LBP-texture weight α′2 in step (6), are updated iteratively:
α1 = (1 − λ)·α1,old + λ·α1,cur,
α2 = (1 − λ)·α2,old + λ·α2,cur,
α′1 = (1 − λ)·α′1,old + λ·α′1,cur,
α′2 = (1 − λ)·α′2,old + λ·α′2,cur,
where α1,old and α2,old are the previous-frame weights of the gray and LBP texture features in step (4); α1,cur and α2,cur are the current-frame weights of the gray and LBP texture features in step (4); α′1,old and α′2,old are the previous-frame weights in step (6); α′1,cur and α′2,cur are the current-frame weights in step (6); and λ is a proportionality factor with 0 ≤ λ ≤ 1 that sets the convergence speed of the weights: a larger λ gives faster convergence and better agility for maneuvering targets, while a smaller λ gives slower convergence and better tracking stability.
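All four updates are the same exponential smoothing of the current-frame weight into the running weight. A minimal sketch with the illustrative helper name `smooth_weights`:

```python
def smooth_weights(a_old, a_cur, lam):
    """Exponential smoothing of the per-feature fusion weights.

    a1 = (1 - lam) * a1_old + lam * a1_cur  (and likewise for a2);
    lam in [0, 1] trades responsiveness (large lam) against
    stability (small lam), as described in the text.
    """
    a1_old, a2_old = a_old
    a1_cur, a2_cur = a_cur
    a1 = (1.0 - lam) * a1_old + lam * a1_cur
    a2 = (1.0 - lam) * a2_old + lam * a2_cur
    return a1, a2
```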
The current-frame gray-feature weight α1,cur and LBP-texture weight α2,cur in step (4), and the current-frame weights α′1,cur and α′2,cur in step (6), are computed from the Bhattacharyya coefficients:
α1,cur = ρ1/(ρ1 + ρ2), α2,cur = ρ2/(ρ1 + ρ2),
α′1,cur = ρ′1/(ρ′1 + ρ′2), α′2,cur = ρ′2/(ρ′1 + ρ′2),
where ρ1 is the Bhattacharyya coefficient of the gray feature in step (4), ρ2 is the Bhattacharyya coefficient of the LBP texture feature in step (4), ρ′1 is the Bhattacharyya coefficient of the gray feature in step (6), and ρ′2 is the Bhattacharyya coefficient of the LBP texture feature in step (6).
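A sketch of the current-frame weights as normalized Bhattacharyya coefficients; this normalized form is an assumption here, since the source gives the weight formula only as an image:

```python
def current_frame_weights(rho1, rho2):
    """Per-frame fusion weights from the two Bhattacharyya coefficients.

    Assumption: the weights are the normalised similarity scores,
    a1_cur = rho1 / (rho1 + rho2), a2_cur = rho2 / (rho1 + rho2),
    so the feature that matches the target model better in the current
    frame receives more weight, and the two weights sum to 1.
    """
    s = rho1 + rho2
    if s == 0.0:
        return 0.5, 0.5  # no information: split evenly
    return rho1 / s, rho2 / s
```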
The gray-feature Bhattacharyya coefficient ρ′1, the LBP-texture Bhattacharyya coefficient ρ′2, and the weights α′1 and α′2 in step (6) are obtained from:
ρ′1 = Σ_u √(p′u·q_u), ρ′2 = Σ_v √(p′v·q_v),
α′1 = (1 − λ)·α′1,old + λ·α′1,cur, α′2 = (1 − λ)·α′2,old + λ·α′2,cur,
where p′u is the probability density of each level of the gray feature at position y1, p′v is the probability density of each level of the LBP texture feature at y1, α′1,old is the previous-frame weight of the gray feature, α′2,old is the previous-frame weight of the LBP texture feature, and λ is the proportionality factor, 0 ≤ λ ≤ 1, which sets the convergence speed of the weights as above.
In the multi-feature-fusion infrared image target real-time tracking method of the present invention, the Epanechnikov kernel is used to compute the gray-feature and LBP-texture probability histograms.
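The Epanechnikov kernel mentioned here has profile k(r²) = 1 − r² for r² ≤ 1 and 0 otherwise. A sketch (the usual normalizing constant is absorbed into the histogram normalization C):

```python
import numpy as np

def epanechnikov_profile(r2):
    """Epanechnikov kernel profile: k(r^2) = 1 - r^2 for r^2 <= 1, else 0.

    r2 is the normalised squared distance ||(y - x_i)/h||^2 of a pixel
    from the window centre; pixels outside the bandwidth get zero weight.
    """
    r2 = np.asarray(r2, dtype=float)
    return np.where(r2 <= 1.0, 1.0 - r2, 0.0)
```

A useful property for real-time hardware is that the profile's derivative is constant, so the mean-shift position update needs no per-pixel kernel re-evaluation.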
Compared with the prior art, the present invention has the following notable advantages: (1) the weight coefficients between multiple features are computed adaptively from the feature saliency and similarity of the object and background, enhancing tracking robustness; (2) the inter-feature weights are updated iteratively, ensuring tracking stability; (3) tracking the infrared target with multi-feature fusion resolves the tracking-point drift caused by an unstable single feature and effectively improves accuracy; (4) the proposed method contains no high-order exponential operations or complex structures, has a small computational load, and is easy to implement in real-time hardware.
Brief description of the drawings
The present invention is described further below with reference to the drawings and specific embodiments; the above and other advantages of the invention will become more apparent.
Fig. 1 is the flow chart of the present invention.
Figs. 2a to 2d show the infrared image tracking results of a traditional single-feature (gray) tracker.
Figs. 3a to 3d show the infrared image tracking results of the multi-feature fusion of the present invention.
Embodiment
In the multi-feature-fusion infrared image target real-time tracking method of the present invention, the gray feature and the LBP texture feature are used to describe the infrared target.
The eight-neighborhood LBP texture descriptor LBP8,1 is:
LBP8,1 = Σ_{n=0}^{7} s(g_n − g_c)·2^n, with s(x) = 1 if x ≥ 0 and s(x) = 0 otherwise,
where g_c is the current (center) pixel and g_n, n = 0…7, are its surrounding neighbor pixels.
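A direct sketch of LBP8,1 that thresholds the eight neighbors g_n against the center g_c; the bit ordering of the neighbors is an arbitrary choice for illustration, since the histogram only needs a consistent labelling:

```python
import numpy as np

def lbp_8_1(img):
    """LBP with 8 neighbours at radius 1 (axis-aligned neighbourhood).

    Each neighbour whose value is >= the centre pixel contributes one
    bit, giving a code in [0, 255]; border pixels are left at 0.
    """
    img = np.asarray(img, dtype=np.int32)
    out = np.zeros_like(img, dtype=np.uint8)
    # neighbour offsets (dy, dx); bit n corresponds to offsets[n]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gc = img[y, x]
            code = 0
            for n, (dy, dx) in enumerate(offsets):
                if img[y + dy, x + dx] >= gc:
                    code |= 1 << n
            out[y, x] = code
    return out
```

On a flat region every neighbor ties with the center, so the code is 255; a center brighter than all its neighbors yields 0.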
In the multi-feature-fusion infrared image target real-time tracking method of the present invention, the Bhattacharyya coefficient is used to describe the similarity between the target model and the target candidate model. The Bhattacharyya coefficient ρ_bha is:
ρ_bha = Σ_u √(p_u·q_u),
where p is the target candidate model and q is the target model.
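A minimal sketch of ρ_bha for two normalized histograms:

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two discrete densities p and q.

    Both inputs are histograms that sum to 1; the coefficient is
    sum_u sqrt(p_u * q_u), which equals 1 for identical densities and
    0 for densities with disjoint support.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))
```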
In the multi-feature-fusion infrared image target real-time tracking method of the present invention, the gray-feature weight α1 and the LBP-texture weight α2 are updated iteratively:
α1 = (1 − λ)·α1,old + λ·α1,cur,
α2 = (1 − λ)·α2,old + λ·α2,cur,
where α1,old and α2,old are the previous-frame weights of the gray and LBP texture features, α1,cur and α2,cur are the current-frame weights of the gray and LBP texture features, and λ is the proportionality factor.
In the multi-feature-fusion infrared image target real-time tracking method of the present invention, the current-frame gray-feature weight α1,cur and LBP-texture weight α2,cur are:
α1,cur = ρ1/(ρ1 + ρ2), α2,cur = ρ2/(ρ1 + ρ2),
where ρ1 is the Bhattacharyya coefficient of the gray feature and ρ2 is the Bhattacharyya coefficient of the LBP texture feature.
Embodiment 1
As shown in Fig. 1, the multi-feature-fusion infrared image target real-time tracking method of the present invention is described below with an example. The infrared image is 320 × 240 pixels at a frame rate of 25 Hz. The thermal imager output is transferred over optical fiber to a dedicated DSP+FPGA image processing board; the multi-feature-fusion tracking runs on the DSP processor and meets the real-time processing requirement. The concrete steps are as follows:
(1) Initialize the target tracking point position y0; the initial tracking point is specified manually.
The initial tracking point position (i, j) is set manually to i = 80, j = 100 (see Fig. 2), and the Epanechnikov kernel bandwidth is set to h = 10.
(2) Initialize the target model: build the target gray model q1 from the gray feature, compute the LBP texture of the target, and build the target texture model q2 from it.
The LBP texture image I_LBP is computed over the region centered on the initial tracking point (80, 100) with bandwidth h = 10:
LBP8,1 = Σ_{n=0}^{7} s(g_n − g_c)·2^n, with s(x) = 1 if x ≥ 0 and s(x) = 0 otherwise,
where g_c is the current target pixel with linear index c = k2·320 + k1, and g_n, n = 0…7, are the pixels surrounding g_c.
The target gray model q1 is:
q_u = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b1(x_i) − u],
and the target LBP texture model q2 is:
q_v = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b2(x_i) − v],
where q_u and q_v are the probability densities of each level of the gray feature and the LBP texture feature of the target model, m1 = 255 and m2 = 255 are the numbers of quantization levels of the gray feature and the LBP texture feature, the function b1(·) maps the pixel at x_i to its gray-feature bin index, the function b2(·) maps the pixel at x_i to its LBP-texture bin index, δ is the Kronecker delta, C is a normalization coefficient, u = 1…255, and v = 1…255.
(3) target candidate model is calculated.According to trace point position y
0, calculated candidate target gray model p
1(y
0) and target texture candidate family p
2(y
0);
Candidate target gray level model p
1for:
Candidate target LBP texture model p
2for:
P
uand p
vrepresent the probability density at different levels of object module gray feature and LBP textural characteristics respectively, m
1=255 and m
2=255 quantification progression representing object module gray feature and LBP textural characteristics respectively, function b
1() is positioned at x
ipixel to the reflection of gray feature index, function b
2() is positioned at x
ipixel to the reflection of LBP textural characteristics index, δ is Delta function, and C is normalization coefficient, μ=1...255, v=1...255.
(4) Compute the Bhattacharyya coefficients ρ1 and ρ2 of the gray and LBP texture features and the weights α1 and α2, and use ρ1, α1, ρ2, α2 to compute the joint-feature Bhattacharyya coefficient ρ at position y0.
The joint-feature Bhattacharyya coefficient ρ is:
ρ = α1·ρ1 + α2·ρ2,
The gray-feature weight α1 and the LBP-texture weight α2 are updated by:
α1 = (1 − λ)·α1,old + λ·α1,cur, α2 = (1 − λ)·α2,old + λ·α2,cur,
where α1,old and α2,old are the previous-frame weights, α1,cur and α2,cur are the current-frame weights, and λ is the proportionality factor.
(5) Compute the new target tracking position y1 in the current frame:
y1 = (Σ_{i=1}^{n′} x_i·w_i·g(‖(y0 − x_i)/h‖²)) / (Σ_{i=1}^{n′} w_i·g(‖(y0 − x_i)/h‖²)),
with per-pixel weight combining both features,
w_i = α1·Σ_u √(q_u/p_u(y0))·δ[b1(x_i) − u] + α2·Σ_v √(q_v/p_v(y0))·δ[b2(x_i) − v],
where g is the negative derivative of the kernel profile (constant for the Epanechnikov kernel), n′ is the number of candidate-target pixels, h is the kernel bandwidth, and q_u, q_v, p_u, p_v, α1, α2, x_i, b1(·), b2(·) have the same meanings as in steps (2), (3), and (4).
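With the Epanechnikov kernel the derivative profile g is constant, so the position update reduces to a weighted centroid of the candidate pixels. A sketch, where `weights` stands in for the per-pixel w_i combining both features (an illustrative reduction, not the patent's literal formula):

```python
import numpy as np

def mean_shift_position(coords, weights):
    """New target position as the weighted centroid of candidate pixels.

    coords: (n, 2) array of pixel positions x_i; weights: (n,) array of
    per-pixel weights w_i built from sqrt(q_u / p_u) over both features.
    With the Epanechnikov kernel the derivative profile g cancels out.
    """
    coords = np.asarray(coords, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return (coords * weights[:, None]).sum(axis=0) / weights.sum()
```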
(6) Using the Bhattacharyya coefficients ρ′1 and ρ′2 of the gray and LBP texture features and the weights α′1 and α′2, compute the joint-feature Bhattacharyya coefficient ρ′ at position y1:
ρ′ = α′1·ρ′1 + α′2·ρ′2,
The gray-feature weight α′1 and the LBP-texture weight α′2 are updated by:
α′1 = (1 − λ)·α′1,old + λ·α′1,cur, α′2 = (1 − λ)·α′2,old + λ·α′2,cur,
where α′1,old and α′2,old are the previous-frame weights and λ is the proportionality factor.
(7) If ρ′ < ρ, set y1 ← (y0 + y1)/2; otherwise y1 remains unchanged;
(8) If |y0 − y1| < 0.01, stop; otherwise set y0 ← y1 and return to step (3).
Fig. 2 shows the tracking result of the conventional single-feature (gray) tracker, and Fig. 3 shows the result of the multi-feature fusion of this embodiment; since these are infrared images, they are unavoidably grayscale. Figs. 2a, 2b, 2c, and 2d show the 20th, 80th, 140th, and 200th frames, respectively, and Figs. 3a, 3b, 3c, and 3d show the same frames. Comparing Fig. 2 and Fig. 3 shows that tracking with a single feature is unstable and inaccurate (in Fig. 2 the tracking gate swings at random), while multi-feature-fusion tracking effectively improves accuracy (in Fig. 3 the tracking gate stays near the target centroid throughout).
The present invention provides a real-time multi-feature-fusion tracking method for infrared image targets. There are many specific ways to implement this technical scheme, and the above is only a preferred embodiment of the invention. It should be noted that those skilled in the art can make several improvements and refinements without departing from the principles of the invention, and these should also be regarded as falling within the protection scope of the invention. Any component of this embodiment not described explicitly can be implemented with the prior art.
Claims (5)
1. A real-time multi-feature-fusion tracking method for an infrared image target, characterized by comprising the following steps:
(1) Set the initial target tracking point position y0;
(2) Initialize the target model: centered on the initial tracking point y0, build the target gray model q1 and the target LBP texture model q2;
(3) Calculate the target candidate model: from the tracking point position y0, compute the candidate gray model p1(y0) and the candidate LBP texture model p2(y0);
(4) Using the gray-feature Bhattacharyya coefficient ρ1 and the LBP-texture Bhattacharyya coefficient ρ2, together with the gray-feature weight α1 and the LBP-texture weight α2, compute the joint-feature Bhattacharyya coefficient ρ at position y0:
ρ = α1·ρ1 + α2·ρ2;
(5) Calculate the new target position y1 in the current frame;
(6) Using the gray-feature Bhattacharyya coefficient ρ′1 and the LBP-texture Bhattacharyya coefficient ρ′2, together with the gray-feature weight α′1 and the LBP-texture weight α′2, compute the joint-feature Bhattacharyya coefficient ρ′ at position y1:
ρ′ = α′1·ρ′1 + α′2·ρ′2,
(7) If ρ′ < ρ, set y1 ← (y0 + y1)/2; otherwise y1 remains unchanged;
(8) If |y0 − y1| < ε, stop the calculation; otherwise assign y1 to y0 and return to step (3), where ε is an error-tolerance constant;
The gray-feature weight α1 and the LBP-texture weight α2 in step (4), and the gray-feature weight α′1 and the LBP-texture weight α′2 in step (6), are updated iteratively:
α1 = (1 − λ)·α1,old + λ·α1,cur,
α2 = (1 − λ)·α2,old + λ·α2,cur,
α′1 = (1 − λ)·α′1,old + λ·α′1,cur,
α′2 = (1 − λ)·α′2,old + λ·α′2,cur,
where α1,old and α2,old are the previous-frame weights of the gray and LBP texture features in step (4), α1,cur and α2,cur are the current-frame weights of the gray and LBP texture features in step (4), α′1,old and α′2,old are the previous-frame weights in step (6), α′1,cur and α′2,cur are the current-frame weights in step (6), and λ is a proportionality factor.
2. The real-time multi-feature-fusion infrared image target tracking method according to claim 1, characterized in that, in step (2), the target gray model q1 is:
q_u = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b1(x_i) − u],
and the target LBP texture model q2 is:
q_v = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b2(x_i) − v],
where q_u is the probability density of each level of the gray feature of the target model, q_v is the probability density of each level of the LBP texture feature, m1 is the number of quantization levels of the gray feature, m2 is the number of quantization levels of the LBP texture feature, u is the gray quantization level, and v is the texture quantization level.
3. The real-time multi-feature-fusion infrared image target tracking method according to claim 1, characterized in that, in step (3), the candidate gray model p1 is:
p_u(y0) = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b1(x_i) − u],
and the candidate LBP texture model p2 is:
p_v(y0) = C·Σ_i k(‖(y0 − x_i)/h‖²)·δ[b2(x_i) − v],
where p_u is the probability density of each level of the gray feature of the candidate model, p_v is the probability density of each level of the LBP texture feature of the candidate model, m1 is the number of quantization levels of the gray feature, m2 is the number of quantization levels of the LBP texture feature, u is the gray quantization level, and v is the texture quantization level.
4. The real-time multi-feature-fusion infrared image target tracking method according to claim 1, characterized in that the current-frame gray-feature weight α1,cur and LBP-texture weight α2,cur in step (4), and the current-frame weights α′1,cur and α′2,cur in step (6), are computed as:
α1,cur = ρ1/(ρ1 + ρ2), α2,cur = ρ2/(ρ1 + ρ2),
α′1,cur = ρ′1/(ρ′1 + ρ′2), α′2,cur = ρ′2/(ρ′1 + ρ′2),
where ρ1 is the Bhattacharyya coefficient of the gray feature in step (4), ρ2 is the Bhattacharyya coefficient of the LBP texture feature in step (4), ρ′1 is the Bhattacharyya coefficient of the gray feature in step (6), and ρ′2 is the Bhattacharyya coefficient of the LBP texture feature in step (6).
5. The real-time multi-feature-fusion infrared image target tracking method according to claim 4, characterized in that the gray-feature Bhattacharyya coefficient ρ′1, the LBP-texture Bhattacharyya coefficient ρ′2, and the weights α′1 and α′2 in step (6) are obtained from:
ρ′1 = Σ_u √(p′u·q_u), ρ′2 = Σ_v √(p′v·q_v),
α′1 = (1 − λ)·α′1,old + λ·α′1,cur, α′2 = (1 − λ)·α′2,old + λ·α′2,cur,
where p′u is the probability density of each level of the gray feature of the target model at position y1, and p′v is the probability density of each level of the LBP texture feature at position y1.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201210397686.3A (CN102930558B) | 2012-10-18 | 2012-10-18 | Real-time tracking method for infrared image target with multi-feature fusion |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN102930558A | 2013-02-13 |
| CN102930558B | 2015-04-01 |
Cited By (1)
| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| TWI628624B | 2017-11-30 | 2018-07-01 | 國家中山科學研究院 | Improved thermal image feature extraction method |
Families Citing this family (4)
| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| CN109215062B | 2017-06-29 | 2022-02-08 | 沈阳新松机器人自动化股份有限公司 | Motion capture method based on image vision, binocular positioning device and system |
| US10304207B2 | 2017-07-07 | 2019-05-28 | Samsung Electronics Co., Ltd. | System and method for optical tracking |
| CN109902578B | 2019-01-25 | 2021-01-08 | 南京理工大学 | Infrared target detection and tracking method |
| CN113379789B | 2021-06-11 | 2022-12-27 | 天津大学 | Moving target tracking method in complex environment |
Citations (1)
| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| US6590999B1 | 2000-02-14 | 2003-07-08 | Siemens Corporate Research, Inc. | Real-time tracking of non-rigid objects using mean shift |
Non-Patent Citations (1)
| Title |
|---|
| Wang Shoukun, Guo Junjie, Wang Junzheng. Mean shift target tracking based on adaptive feature fusion. Journal of Beijing Institute of Technology, 2011, Vol. 31, No. 7, pp. 804-805, 807. |
Also Published As
| Publication Number | Publication Date |
|---|---|
| CN102930558A | 2013-02-13 |
Similar Documents
| Publication | Title |
|---|---|
| CN109816641B | Multi-scale morphological fusion-based weighted local entropy infrared small target detection method |
| CN111681197B | Remote sensing image unsupervised change detection method based on Siamese network structure |
| CN102103748B | Method for detecting and tracking infrared small target in complex background |
| CN102930558B | Real-time tracking method for infrared image target with multi-feature fusion |
| CN106991686B | A level set contour tracking method based on super-pixel optical flow field |
| CN101924871A | Mean shift-based video target tracking method |
| CN107944354B | Vehicle detection method based on deep learning |
| Qi et al. | FTC-Net: Fusion of transformer and CNN features for infrared small target detection |
| CN110135312B | Rapid small target detection method based on hierarchical LCM |
| Shen et al. | Adaptive pedestrian tracking via patch-based features and spatial-temporal similarity measurement |
| Li et al. | Visual object tracking using spatial context information and global tracking skills |
| CN108038856B | Infrared small target detection method based on improved multi-scale fractal enhancement |
| Yuan et al. | A moving objects tracking method based on a combination of local binary pattern texture and hue |
| CN106250687B | Go the deposit gravel roundness calculation method of fasciation IPP |
| CN108573236B | Method for detecting infrared weak and small target under cloud background based on discrete fraction Brown random field |
| Bao et al. | Solar panel segmentation under low contrast condition |
| CN107067411B | Mean-shift tracking method combined with dense features |
| Li et al. | Moving target tracking via particle filter based on color and contour features |
| CN115131240A | Target identification method and system for three-dimensional point cloud data |
| Wang et al. | Comparison and Analysis of Several Clustering Algorithms for Pavement Crack Segmentation Guided by Computational Intelligence |
| Chen et al. | A mean shift algorithm based on modified Parzen window for small target tracking |
| Yu et al. | Research on video face detection based on AdaBoost algorithm training classifier |
| CN113888428A | Infrared dim target detection method and device based on local contrast |
| Li et al. | An infrared small target detection method based on local contrast measure and gradient property |
| CN113393395A | High-dynamic infrared image segmentation threshold self-adaptive calculation method |
Legal Events
| Code | Title |
|---|---|
| C06 / PB01 | Publication |
| C10 / SE01 | Entry into substantive examination / Entry into force of request for substantive examination |
| C14 / GR01 | Grant of patent or utility model / Patent grant |