CN103839273A - Real-time detection tracking frame and tracking method based on compressed sensing feature selection - Google Patents
- Publication number
- CN103839273A (application CN201410113641.8A; granted as CN103839273B)
- Authority
- CN
- China
- Prior art keywords
- sample
- represent
- tracking
- lambda
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
The invention provides a real-time detection and tracking framework and tracking method based on compressed-sensing feature selection. The framework and method select among the compressed features and use only the sample features with high discriminative power for classification. They achieve real-time tracking, avoid the tracking failures caused by selecting wrong features, effectively suppress the influence of bad features on the tracking result, and markedly improve both tracking speed and tracking accuracy.
Description
Technical field
The present invention relates to the field of target tracking, and in particular to a real-time detection and tracking framework and tracking method based on compressed-sensing feature selection.
Background technology
Target tracking is a key area of computer vision, with important applications in military, medical, surveillance and human-computer interaction settings. Many methods have been proposed in recent years to address target tracking, but because of target deformation, illumination changes, occlusion and similar factors, target tracking remains a difficult problem.
Mainstream real-time tracking methods are adaptive. Tracking methods can broadly be divided into two classes: generative methods and discriminative methods. A generative method learns an appearance model of the target and then searches the regions where the target may be located; the region that the learned model reconstructs with minimum error is taken as the target position. To handle target deformation, the WSL and IVT methods were proposed in turn. More recently, sparse-representation methods have been used to handle partial occlusion of the target. However, none of these generative models exploits the background information around the target, even though that background information can help separate the target from the background during detection.
A discriminative model treats target tracking as a detection problem of separating the target from the background. In a discriminative model, features with good separating power can markedly improve tracking accuracy. Boosting methods, which combine multiple weak classifiers into a strong classifier, are currently widely used. But many boosting methods use only information about the target itself and ignore the background; once the target is not detected accurately, subsequent tracking accuracy degrades and tracking eventually fails. More recently, the MIL method showed that background information can be exploited by randomly projecting high-dimensional features to low-dimensional features for classification, which effectively separates the target region from the background. However, the randomly projected features are not necessarily suitable for classification, and classifying with poorly discriminative features harms the tracking result.
Summary of the invention
To solve the above technical problems, the present invention proposes a real-time detection and tracking framework and tracking method based on compressed-sensing feature selection.
The technical scheme of the present invention is a real-time detection and tracking framework and tracking method based on compressed-sensing feature selection, comprising the following steps:
Step 1: initialize the parameters and determine the target position in frame t. The target position is a rectangular box enclosing the target to be tracked, described by four parameters: the row coordinate row and column coordinate col of the target in the frame, together with its width width and height height.
Step 2: let l(x) = {row(x), col(x)} denote the position of sample x, i.e. the row and column coordinates of the sample center. Collect the positive sample set from the locations within radius s pixels around the target, and collect the negative sample set from the ring between radii r and β pixels around the target, where s, r and β are empirical parameters measured in pixels.
Step 3: for each sample in the positive and negative sample sets, extract n Haar-like features f(x_t) = {f_1(x_t), f_2(x_t), ..., f_n(x_t)}.
Step 4: train weak classifiers using the features of the positive and negative samples.
Step 5: define a classification margin to assess each weak classifier, and select weak classifiers for classification.
Step 7: in frame t+1, collect test samples within radius λ around the target position of frame t, and classify the test samples with H_t(x).
Step 8: the classification result gives the target position in frame t+1.
Step 9: if frame t+1 is not the last frame, set t = t+1 and return to step 1.
Moreover, in step 2, s = 4, r is taken in the range 6~10 and β in the range 15~30.
Moreover, step 3 further comprises the following steps:
For a sample of width w and height h, convolution with the template set T generates a total of m = (wh)² features;
3.2 Use a random matrix R_{n×m} to compress the m features of a sample, where n is the dimensionality of the compressed vector; each element r_ij of R satisfies the following condition:
Use this matrix to compress the high-dimensional features of sample x. After compression the features are f(x_t) = {f_1(x_t), f_2(x_t), ..., f_n(x_t)} = R T(x_t), where T(x_t) denotes the vector of the m features extracted by convolving the template set T with sample x_t, and the empirical parameter ζ denotes the sparsity.
Moreover, in step 3.2, the sparsity parameter ζ = 4.
Moreover, step 4 further comprises the following steps:
4.1 Assume the features of positive and negative samples follow normal distributions. The positive-sample features follow a normal distribution N(μ1, σ1) with mean μ1 and standard deviation σ1, denoted p(f(x) | y=1) ~ N(μ1, σ1); the negative-sample features follow N(μ0, σ0) with mean μ0 and standard deviation σ0, denoted p(f(x) | y=0) ~ N(μ0, σ0). The parameters μ1, σ1, μ0 and σ0 are estimated from the positive and negative samples.
The parameter update rule for positive samples is:
μ1^t = λ μ1^{t-1} + (1 − λ) μ̂1^t
σ1^t = sqrt( λ (σ1^{t-1})² + (1 − λ) (σ̂1^t)² + λ (1 − λ) (μ1^{t-1} − μ̂1^t)² )
where λ is the learning rate, μ̂1^t and σ̂1^t are the positive-sample feature mean and standard deviation measured directly at time t, μ1^t and σ1^t are the learned positive-sample mean and standard deviation at time t, and μ1^{t-1} and σ1^{t-1} are the learned values at time t−1;
The parameter update rule for negative samples is:
μ0^t = λ μ0^{t-1} + (1 − λ) μ̂0^t
σ0^t = sqrt( λ (σ0^{t-1})² + (1 − λ) (σ̂0^t)² + λ (1 − λ) (μ0^{t-1} − μ̂0^t)² )
where λ is the learning rate, μ̂0^t and σ̂0^t are the negative-sample feature mean and standard deviation measured directly at time t, μ0^t and σ0^t are the learned negative-sample mean and standard deviation at time t, and μ0^{t-1} and σ0^{t-1} are the learned values at time t−1;
4.2 The weak classifier h_k(x) classifies sample x using its k-th feature, where f_k(x) denotes the k-th feature of sample x.
Moreover, the learning rate λ in steps 4.1 and 4.2 is 0.85.
Moreover, step 5 comprises the following steps:
5.1 Define the margin of each weak classifier, where n_p denotes the number of positive samples and n_n denotes the number of negative samples;
5.2 Select the K features with the largest margin values to form the strong classifier, where i_k denotes a feature selected for classification;
Moreover, step 7 comprises the following steps:
Preferably, in step 2, the number of negative samples collected is 1.5 times the number of positive samples.
The beneficial effects of the invention are as follows: compared with the results of other currently popular tracking methods, the present invention (CFS) is more accurate. It achieves real-time tracking, with a speed clearly better than most tracking methods, and its tracking accuracy is also clearly better than most tracking methods. Finally, it avoids the tracking failures caused by selecting wrong features and effectively suppresses the influence of bad features on the tracking result.
Brief description of the drawings
Fig. 1 is the flowchart of the present invention;
Fig. 2 is the margin schematic of the present invention;
Fig. 3 is the result comparison on the test video sequence david.
Embodiment
To overcome the above deficiencies of the prior art, the invention provides a real-time detection and tracking framework and tracking method based on compressed-sensing feature selection. The invention selects among the compressed features and uses only the highly discriminative sample features for classification, which markedly improves tracking speed and tracking accuracy.
The present invention is described in further detail below in conjunction with embodiments and the accompanying drawings, but embodiments of the present invention are not limited thereto.
A real-time detection and tracking framework and tracking method based on compressed-sensing feature selection comprises the following steps:
Step 1: initialize the parameters and determine the target position in frame t. The target position is a rectangular box enclosing the target to be tracked, described by four parameters: the row coordinate row and column coordinate col of the target in the frame, together with its width width and height height.
Step 2: let l(x) = {row(x), col(x)} denote the position of sample x, i.e. the row and column coordinates of the sample center. Collect the positive sample set from the locations within radius s pixels around the target, and collect the negative sample set from the ring between radii r and β pixels around the target, where s, r and β are empirical parameters measured in pixels. In the present invention s = 4, r is taken in the range 6~10 and β in the range 15~30; the number of negative samples collected is generally 1.5 times the number of positive samples, and this test collects 70 negative samples.
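The sampling geometry of step 2 can be sketched as follows. This is a minimal illustration rather than the patent's code: `collect_samples` is a hypothetical helper, and the concrete values (r = 8, β = 20, a 25-pixel scan window) are example choices inside the ranges the text gives.

```python
import math
import random

def collect_samples(center, s=4, r=8, beta=20, n_neg=70, search=25):
    """Collect sample locations around the tracked target center.

    Positive set: every pixel location within radius s of the center.
    Negative set: n_neg locations drawn at random from the ring r < d < beta.
    s=4 and n_neg=70 follow the text; r and beta are example values inside
    the empirical ranges 6~10 and 15~30.
    """
    row0, col0 = center
    pos, ring = [], []
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            d = math.hypot(dr, dc)
            if d <= s:
                pos.append((row0 + dr, col0 + dc))
            elif r < d < beta:
                ring.append((row0 + dr, col0 + dc))
    neg = random.sample(ring, min(n_neg, len(ring)))
    return pos, neg
```

With these defaults the positive set holds every integer offset within 4 pixels of the center, and the negatives are drawn only from the surrounding ring, so the two sets never overlap.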
Step 3: for each sample in the positive and negative sample sets, extract n Haar-like features f(x_t) = {f_1(x_t), f_2(x_t), ..., f_n(x_t)}. Step 3 further comprises the following steps:
A random matrix R ∈ R^{n×m} can map a high-dimensional vector x ∈ R^m to a low-dimensional vector v = Rx ∈ R^n, where n << m. If the random matrix R satisfies the RIP (restricted isometry property) condition, the mapped vectors preserve the positional relationships of the original vectors in their vector space. A Gaussian matrix satisfies the RIP condition, but a Gaussian matrix is not sparse (the number of zero elements in a matrix determines its degree of sparsity), so using it for compression increases the amount of computation.
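The sparse alternative to a Gaussian matrix can be sketched as below. The patent's exact r_ij rule is given only as an image, so this assumes the standard very-sparse random projection form (entries sqrt(ζ)·{+1, 0, −1} with probabilities 1/(2ζ), 1 − 1/ζ, 1/(2ζ)), which is consistent with the sparsity parameter ζ the text defines:

```python
import random

def sparse_random_matrix(n, m, zeta=4, seed=0):
    """Generate an n x m sparse measurement matrix R.

    Assumed entry distribution (hedged; the source renders the rule as an
    image): r_ij = sqrt(zeta) * (+1 with prob 1/(2*zeta),
                                  0 with prob 1 - 1/zeta,
                                 -1 with prob 1/(2*zeta)).
    With zeta = 4 roughly 75% of entries are zero, so projecting an
    m-dimensional feature vector needs only additions and subtractions.
    """
    rng = random.Random(seed)
    scale = zeta ** 0.5
    R = []
    for _ in range(n):
        row = []
        for _ in range(m):
            u = rng.random()
            if u < 1.0 / (2 * zeta):
                row.append(scale)       # +sqrt(zeta)
            elif u < 1.0 / zeta:
                row.append(-scale)      # -sqrt(zeta)
            else:
                row.append(0.0)         # zero: skipped at projection time
        R.append(row)
    return R
```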
3.1 Provide a template set T = {h_ij}, i = 1:w, j = 1:h, whose elements are defined as:
For a sample of width w and height h, convolution with the template set T generates a total of m = (wh)² features;
3.2 Use a random matrix R_{n×m} to compress the m features of a sample, where n is the dimensionality of the compressed vector; each element r_ij of R satisfies the following condition:
Use this matrix to compress the high-dimensional features of sample x. After compression the features are f(x_t) = {f_1(x_t), f_2(x_t), ..., f_n(x_t)} = R T(x_t), where T(x_t) denotes the vector of the m features extracted by convolving the template set T with sample x_t, and the empirical parameter ζ denotes the sparsity; in the present invention ζ = 4.
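The compression f(x_t) = R T(x_t) is then a single matrix-vector product; a sketch (with a toy hand-built R in the usage below, since the real one comes from the r_ij rule above):

```python
def compress(R, x):
    """Project the high-dimensional feature vector x down to n dimensions,
    v = R x. Because R is sparse, zero entries are skipped, so the cost is
    proportional to the number of nonzeros rather than to n*m."""
    return [sum(rij * xj for rij, xj in zip(row, x) if rij != 0.0)
            for row in R]
```

For example, with the toy matrix R = [[2, 0, 0], [0, -2, 0]] and feature vector x = [1, 2, 3], `compress` returns the 2-dimensional vector [2.0, -4.0].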
Step 4: train weak classifiers using the features of the positive and negative samples. Step 4 further comprises the following steps:
4.1 Assume the features of positive and negative samples follow normal distributions. The positive-sample features follow a normal distribution N(μ1, σ1) with mean μ1 and standard deviation σ1, denoted p(f(x) | y=1) ~ N(μ1, σ1); the negative-sample features follow N(μ0, σ0) with mean μ0 and standard deviation σ0, denoted p(f(x) | y=0) ~ N(μ0, σ0). The parameters μ1, σ1, μ0 and σ0 are estimated from the positive and negative samples.
The parameter update rule for positive samples is:
μ1^t = λ μ1^{t-1} + (1 − λ) μ̂1^t
σ1^t = sqrt( λ (σ1^{t-1})² + (1 − λ) (σ̂1^t)² + λ (1 − λ) (μ1^{t-1} − μ̂1^t)² )
where λ is the learning rate, μ̂1^t and σ̂1^t are the positive-sample feature mean and standard deviation measured directly at time t, μ1^t and σ1^t are the learned positive-sample mean and standard deviation at time t, and μ1^{t-1} and σ1^{t-1} are the learned values at time t−1;
The parameter update rule for negative samples is:
μ0^t = λ μ0^{t-1} + (1 − λ) μ̂0^t
σ0^t = sqrt( λ (σ0^{t-1})² + (1 − λ) (σ̂0^t)² + λ (1 − λ) (μ0^{t-1} − μ̂0^t)² )
where λ is the learning rate, μ̂0^t and σ̂0^t are the negative-sample feature mean and standard deviation measured directly at time t, μ0^t and σ0^t are the learned negative-sample mean and standard deviation at time t, and μ0^{t-1} and σ0^{t-1} are the learned values at time t−1;
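A sketch of the running Gaussian-parameter update for one feature. The update formulas themselves appear only as images in the source, so the form below is an assumption: the standard learning-rate blend used by compressive-tracking-style methods, matching the symbols the text lists (learning rate λ, directly measured mean/std at time t, learned values at times t and t−1):

```python
import math

def update_gaussian(mu_prev, sigma_prev, mu_new, sigma_new, lam=0.85):
    """Blend the learned Gaussian parameters from frame t-1 with the values
    measured directly on frame t's samples.

    lam is the learning rate (0.85 in the text); the same rule applies to the
    positive-sample and negative-sample parameters.
    """
    mu = lam * mu_prev + (1.0 - lam) * mu_new
    # Variance blend plus a cross term accounting for the shift in mean.
    sigma = math.sqrt(lam * sigma_prev ** 2
                      + (1.0 - lam) * sigma_new ** 2
                      + lam * (1.0 - lam) * (mu_prev - mu_new) ** 2)
    return mu, sigma
```

A larger λ makes the model change more slowly, which stabilizes the classifier against occasional bad frames.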
4.2 The weak classifier h_k(x) classifies sample x using its k-th feature, where f_k(x) denotes the k-th feature of sample x;
The learning rate λ here is 0.85.
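The per-feature weak classifier of step 4.2 can be sketched as a Bayes log-likelihood ratio under the two learned Gaussians. The patent's h_k(x) formula is not reproduced in the text, so the equal-priors log-ratio below is an assumed, typical choice:

```python
import math

def gauss_pdf(v, mu, sigma):
    """Normal density N(mu, sigma) evaluated at v."""
    return (math.exp(-(v - mu) ** 2 / (2 * sigma ** 2))
            / (math.sqrt(2 * math.pi) * sigma))

def weak_classifier(v, mu1, sigma1, mu0, sigma0):
    """Weak classifier on one compressed feature value v.

    Returns log p(v | y=1) - log p(v | y=0) under the learned positive
    (mu1, sigma1) and negative (mu0, sigma0) Gaussians, assuming equal
    priors. A positive response favors the target class.
    """
    return math.log(gauss_pdf(v, mu1, sigma1) / gauss_pdf(v, mu0, sigma0))
```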
Step 5: define a classification margin to assess each weak classifier, and select weak classifiers for classification. Step 5 comprises the following steps:
5.1 Define the margin of each weak classifier, where n_p denotes the number of positive samples and n_n denotes the number of negative samples.
As shown in Fig. 2, circles denote positive samples and triangles denote negative samples; the abscissa is the first feature of a sample and the ordinate is the second. Classifying with the second feature separates the positive and negative sample regions better. A weak classifier with a larger margin has stronger separating power.
5.2 Select the K features with the largest margin values to form the strong classifier, where i_k denotes a feature selected for classification.
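The margin-based selection of step 5 can be sketched as follows. The margin formula itself is only an image in the source, so this assumes it is the mean weak-classifier response over the n_p positive samples minus the mean over the n_n negative samples, matching the text's reading that a larger margin means stronger separating power:

```python
def select_top_k(pos_feats, neg_feats, classifiers, K):
    """Rank weak classifiers by classification margin and keep the best K.

    pos_feats / neg_feats: lists of per-sample feature vectors;
    classifiers: list of per-feature scoring functions h_k.
    The margin of feature k is taken (hedged assumption) as the mean h_k
    response on positives minus the mean response on negatives.
    Returns the indices i_1..i_K of the selected features.
    """
    margins = []
    for k, h in enumerate(classifiers):
        mp = sum(h(x[k]) for x in pos_feats) / len(pos_feats)
        mn = sum(h(x[k]) for x in neg_feats) / len(neg_feats)
        margins.append((mp - mn, k))
    margins.sort(reverse=True)          # largest margin first
    return [k for _, k in margins[:K]]
```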
Step 7: in frame t+1, collect test samples within radius λ around the target position of frame t, and classify the test samples with H_t(x). Step 7 comprises the following steps:
Step 9: if frame t+1 is not the last frame, set t = t+1 and return to step 1.
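Steps 7 through 9 then reduce to scoring each candidate of frame t+1 with the strong classifier H_t(x), the sum of the K selected weak responses, and taking the best-scoring location; a minimal sketch with hypothetical helper names:

```python
def strong_classify(feats, selected, classifiers):
    """Strong classifier H_t(x): sum of the selected weak responses.
    'selected' holds the K feature indices chosen by margin in step 5."""
    return sum(classifiers[k](feats[k]) for k in selected)

def track_next(candidates, selected, classifiers):
    """Among the candidate samples of frame t+1 (each given as a feature
    vector), return the index of the one H_t scores highest; its location
    is taken as the new target position."""
    return max(range(len(candidates)),
               key=lambda i: strong_classify(candidates[i],
                                             selected, classifiers))
```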
As shown in Table 1, the real-time detection and tracking framework and tracking method based on compressed-sensing feature selection (CFS) performs well on each tracking test set, and its average accuracy (average) is higher than the other methods, where CT is the compressive tracking method and FRAG is the fragment-based tracking method. This shows that the present invention (CFS) is more accurate than the other methods.
Table 1 is followed the tracks of accuracy
Test sequence | CT | FRAG | CFS |
---|---|---|---|
bolt | 0.0067 | 0.1233 | 0.8523 |
david | 0.3843 | 0.0955 | 0.7622 |
Tiger | 0.3894 | 0.3037 | 0.5473 |
FaceOcc1 | 0.8789 | 0.9899 | 0.9092 |
FaceOcc2 | 0.6995 | 0.7217 | 0.6897 |
Fish | 0.8739 | 0.5294 | 0.6933 |
Jumping | 0.0064 | 0.8371 | 0.8037 |
Tiger2 | 0.4188 | 0.1406 | 0.6423 |
Shaking | 0.0411 | 0.0877 | 0.5863 |
average | 0.411 | 0.425423 | 0.7207 |
Claims (9)
1. A real-time detection and tracking framework and tracking method based on compressed-sensing feature selection, characterized by comprising the following steps:
Step 1: initialize the parameters and determine the target position in frame t. The target position is a rectangular box enclosing the target to be tracked, described by four parameters: the row coordinate row and column coordinate col of the target in the frame, together with its width width and height height;
Step 2: let l(x) = {row(x), col(x)} denote the position of sample x, i.e. the row and column coordinates of the sample center. Collect the positive sample set from the locations within radius s pixels around the target, and collect the negative sample set from the ring between radii r and β pixels around the target, where s, r and β are empirical parameters measured in pixels;
Step 3: for each sample in the positive and negative sample sets, extract n Haar-like features f(x_t) = {f_1(x_t), f_2(x_t), ..., f_n(x_t)};
Step 4: train weak classifiers using the features of the positive and negative samples;
Step 5: define a classification margin to assess each weak classifier, and select weak classifiers for classification;
Step 7: in frame t+1, collect test samples within radius λ around the target position of frame t, and classify the test samples with H_t(x);
Step 9: if frame t+1 is not the last frame, set t = t+1 and return to step 1.
2. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 1, characterized in that: in step 2, s = 4, r is taken in the range 6~10 and β in the range 15~30.
3. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 1, characterized in that step 3 further comprises the following steps:
3.1 Provide a template set T = {h_ij}, i = 1:w, j = 1:h, whose elements are defined as:
For a sample of width w and height h, convolution with the template set T generates a total of m = (wh)² features;
3.2 Use a random matrix R_{n×m} to compress the m features of a sample, where n is the dimensionality of the compressed vector; each element r_ij of R satisfies the following condition:
Use this matrix to compress the high-dimensional features of sample x. After compression the features are f(x_t) = {f_1(x_t), f_2(x_t), ..., f_n(x_t)} = R T(x_t), where T(x_t) denotes the vector of the m features extracted by convolving the template set T with sample x_t, and the empirical parameter ζ denotes the sparsity.
4. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 3, characterized in that: in step 3.2, the sparsity parameter ζ = 4.
5. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 1, characterized in that step 4 further comprises the following steps:
4.1 Assume the features of positive and negative samples follow normal distributions. The positive-sample features follow a normal distribution N(μ1, σ1) with mean μ1 and standard deviation σ1, denoted p(f(x) | y=1) ~ N(μ1, σ1); the negative-sample features follow N(μ0, σ0) with mean μ0 and standard deviation σ0, denoted p(f(x) | y=0) ~ N(μ0, σ0). The parameters μ1, σ1, μ0 and σ0 are estimated from the positive and negative samples.
The parameter update rule for positive samples is:
μ1^t = λ μ1^{t-1} + (1 − λ) μ̂1^t
σ1^t = sqrt( λ (σ1^{t-1})² + (1 − λ) (σ̂1^t)² + λ (1 − λ) (μ1^{t-1} − μ̂1^t)² )
where λ is the learning rate, μ̂1^t and σ̂1^t are the positive-sample feature mean and standard deviation measured directly at time t, μ1^t and σ1^t are the learned positive-sample mean and standard deviation at time t, and μ1^{t-1} and σ1^{t-1} are the learned values at time t−1;
The parameter update rule for negative samples is:
μ0^t = λ μ0^{t-1} + (1 − λ) μ̂0^t
σ0^t = sqrt( λ (σ0^{t-1})² + (1 − λ) (σ̂0^t)² + λ (1 − λ) (μ0^{t-1} − μ̂0^t)² )
where λ is the learning rate, μ̂0^t and σ̂0^t are the negative-sample feature mean and standard deviation measured directly at time t, μ0^t and σ0^t are the learned negative-sample mean and standard deviation at time t, and μ0^{t-1} and σ0^{t-1} are the learned values at time t−1.
6. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 5, characterized in that: the learning rate λ in steps 4.1 and 4.2 is 0.85.
7. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 1, characterized in that step 5 comprises the following steps:
5.1 Define the margin of each weak classifier, where n_p denotes the number of positive samples and n_n denotes the number of negative samples;
5.2 Select the K features with the largest margin values to form the strong classifier, where i_k denotes a feature selected for classification.
8. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 1, characterized in that step 7 comprises the following steps:
9. The real-time detection and tracking framework and tracking method based on compressed-sensing feature selection according to claim 1, characterized in that: in step 2, the number of negative samples collected is 1.5 times the number of positive samples.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410113641.8A CN103839273B (en) | 2014-03-25 | 2014-03-25 | Real-time detection tracking frame and tracking method based on compressed sensing feature selection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103839273A true CN103839273A (en) | 2014-06-04 |
CN103839273B CN103839273B (en) | 2017-02-22 |
Family
ID=50802739
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410113641.8A Expired - Fee Related CN103839273B (en) | 2014-03-25 | 2014-03-25 | Real-time detection tracking frame and tracking method based on compressed sensing feature selection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103839273B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020133499A1 (en) * | 2001-03-13 | 2002-09-19 | Sean Ward | System and method for acoustic fingerprinting |
CN103325125A (en) * | 2013-07-03 | 2013-09-25 | 北京工业大学 | Moving target tracking method based on improved multi-example learning algorithm |
CN103345735A (en) * | 2013-07-16 | 2013-10-09 | 上海交通大学 | Compressed space-time multi-sensor fusion tracking method based on Kalman filter |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104200216A (en) * | 2014-09-02 | 2014-12-10 | 武汉大学 | High-speed moving target tracking algorithm for multi-feature extraction and step-wise refinement |
CN104200493A (en) * | 2014-09-05 | 2014-12-10 | 武汉大学 | Similarity measurement based real-time target tracking algorithm |
CN104200493B (en) * | 2014-09-05 | 2017-02-01 | 武汉大学 | Similarity measurement based real-time target tracking algorithm |
CN104809466A (en) * | 2014-11-28 | 2015-07-29 | 安科智慧城市技术(中国)有限公司 | Method and device for detecting specific target rapidly |
CN104680554A (en) * | 2015-01-08 | 2015-06-03 | 深圳大学 | SURF-based compression tracing method and system |
CN104680554B (en) * | 2015-01-08 | 2017-10-31 | 深圳大学 | Compression tracking and system based on SURF |
CN105427337A (en) * | 2015-10-30 | 2016-03-23 | 西北工业大学 | Time-delay video sequence motor cell tracking method based on compression perception |
CN106023257A (en) * | 2016-05-26 | 2016-10-12 | 南京航空航天大学 | Target tracking method based on rotor UAV platform |
CN106023257B (en) * | 2016-05-26 | 2018-10-12 | 南京航空航天大学 | A kind of method for tracking target based on rotor wing unmanned aerial vehicle platform |
CN106651912A (en) * | 2016-11-21 | 2017-05-10 | 广东工业大学 | Compressed sensing-based robust target tracking method |
CN106815862A (en) * | 2017-01-24 | 2017-06-09 | 武汉大学 | A kind of target tracking algorism based on convolution contour feature |
CN106815862B (en) * | 2017-01-24 | 2020-03-10 | 武汉大学 | Target tracking method based on convolution contour features |
Also Published As
Publication number | Publication date |
---|---|
CN103839273B (en) | 2017-02-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103839273A (en) | Real-time detection tracking frame and tracking method based on compressed sensing feature selection | |
CN104537647B (en) | A kind of object detection method and device | |
CN102903122B (en) | Video object tracking method based on feature optical flow and online ensemble learning | |
CN109800628A (en) | A kind of network structure and detection method for reinforcing SSD Small object pedestrian detection performance | |
CN105069472A (en) | Vehicle detection method based on convolutional neural network self-adaption | |
CN105488456A (en) | Adaptive rejection threshold adjustment subspace learning based human face detection method | |
CN103258213A (en) | Vehicle model dynamic identification method used in intelligent transportation system | |
CN104361351B (en) | A kind of diameter radar image sorting technique based on range statistics similarity | |
CN107025420A (en) | The method and apparatus of Human bodys' response in video | |
CN106408030A (en) | SAR image classification method based on middle lamella semantic attribute and convolution neural network | |
CN106295532B (en) | A kind of human motion recognition method in video image | |
CN105426895A (en) | Prominence detection method based on Markov model | |
CN106127161A (en) | Fast target detection method based on cascade multilayer detector | |
CN101655914A (en) | Training device, training method and detection method | |
CN104182985A (en) | Remote sensing image change detection method | |
CN107808376A (en) | A kind of detection method of raising one's hand based on deep learning | |
CN103971106A (en) | Multi-view human facial image gender identification method and device | |
CN103593672A (en) | Adaboost classifier on-line learning method and Adaboost classifier on-line learning system | |
CN106780552A (en) | Anti-shelter target tracking based on regional area joint tracing detection study | |
CN105654516A (en) | Method for detecting small moving object on ground on basis of satellite image with target significance | |
CN105046259A (en) | Coronal mass ejection (CME) detection method based on multi-feature fusion | |
CN108734200A (en) | Human body target visible detection method and device based on BING features | |
CN107092878A (en) | It is a kind of based on hybrid classifer can autonomous learning multi-target detection method | |
Ibrahem et al. | Real-time weakly supervised object detection using center-of-features localization | |
CN106250913A (en) | A kind of combining classifiers licence plate recognition method based on local canonical correlation analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | | Granted publication date: 20170222; Termination date: 20180325 |