CN104463192B - Real-time tracking method for video objects in dark environments based on texture features - Google Patents

Real-time tracking method for video objects in dark environments based on texture features

Info

Publication number
CN104463192B
CN104463192B (application CN201410610358.6A)
Authority
CN
China
Prior art keywords
calculated
target
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410610358.6A
Other languages
Chinese (zh)
Other versions
CN104463192A (en)
Inventor
孙继平
杜东璧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology Beijing CUMTB
Original Assignee
China University of Mining and Technology Beijing CUMTB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology Beijing CUMTB filed Critical China University of Mining and Technology Beijing CUMTB
Priority to CN201410610358.6A priority Critical patent/CN104463192B/en
Publication of CN104463192A publication Critical patent/CN104463192A/en
Application granted granted Critical
Publication of CN104463192B publication Critical patent/CN104463192B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F18/24155Bayesian classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a real-time tracking method for video objects in dark environments based on texture features. Using multi-scale rectangular filters as the signal sampling matrix and a sparse random Gaussian matrix as the compressed sensing matrix, sample features can be extracted rapidly by a vector integral image algorithm; the vector integral image step uses a clipped-template approach to effectively reduce redundant computation. The invention extracts features with the rotation-invariant uniform pattern ULBP operator, is suited to tracking targets that may rotate or deform under the poor illumination found at night and in underground mines, achieves a high recognition rate, and provides reliable results for target tracking.

Description

Real-time tracking method for video objects in dark environments based on texture features
Technical field
The present invention relates to a real-time tracking method for video objects in dark environments based on texture features, and belongs to the technical field of image pattern recognition.
Background technology
Target tracking in computer vision generally adopts a tracking-by-detection framework, which converts the tracking task into a detection task by training a classifier online with a small number of positive and negative samples. Because object detection has made great progress and classifier techniques keep improving through extensive research, the success rate of tracking is effectively guaranteed. The detection task requires extracting features from the collected samples; only features that reflect the characteristics of the samples allow the samples to be classified and discriminated, and traditional feature extraction methods have to be constructed by hand from experience. K. H. Zhang et al. proposed a feature extraction method based on compressed sensing (Compressive Tracking): generalized Haar-like features are convolved with a series of multi-scale filters to guarantee their multi-scale property, and a sparse random Gaussian matrix is then used to reduce the feature dimension so that tracking remains real-time. However, generalized Haar-like features are sensitive to illumination brightness and target rotation. The present invention improves the feature extraction pipeline with the rotation-invariant uniform pattern ULBP operator so that, while keeping real-time performance and stability, the tracking algorithm can adapt to low illumination, target rotation, illumination change and other scenes that easily cause the target to be lost.
Summary of the invention
To overcome the difficulty that existing tracking algorithms have in handling target tracking under extreme illumination, the present invention proposes a real-time object tracking method based on texture features suited to special environments such as underground mines and night scenes. The method extracts texture features with the rotation-invariant uniform pattern ULBP operator, so the extracted features contain rich texture information of the samples; because texture information is insensitive to illumination, the tracking system can reach a higher tracking success rate in dim environments.
The invention discloses a real-time tracking method for video objects in dark environments based on texture features, comprising an initialization stage and a target tracking stage. The initialization stage comprises the following steps:
1) During initialization, calculate the sparse sampling matrix Θ:
A) calculate the signal sampling matrix Φ;
B) calculate the sparse sensing matrix Ψ;
C) calculate the sparse sampling matrix Θ, where Θ = ΨΦ;
2) Create a two-class naive Bayes classifier H(x) formed by cascading 50 Bayes classifiers. Each weak classifier h_c(x_c) is based on two normal distributions, N(μ_{1,c}, σ_{1,c}) for the positive label y = 1 and N(μ_{0,c}, σ_{0,c}) for the negative label y = 0, where (μ_{y,c}, σ_{y,c}) are the parameters of the normal discriminant curve of the weak classifier for label y on the c-th feature dimension.
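By way of illustration only, the following minimal sketch (Python with NumPy; the names params, log_gaussian and classification_score are chosen for illustration and are not part of the disclosure) shows one way to hold the per-label Gaussian parameters of the 50 cascaded weak classifiers and to evaluate the naive Bayes classification score as a sum of per-dimension log-likelihood ratios under equal priors, which is the usual reading of such a cascade.

```python
import numpy as np

D = 50  # number of weak classifiers / compressed feature dimensions

# Per-dimension Gaussian parameters for the positive (y=1) and negative (y=0)
# labels, initialised as in the text: mu = 0, sigma = 1 for both labels.
params = {
    1: {"mu": np.zeros(D), "sigma": np.ones(D)},  # positive label y = 1
    0: {"mu": np.zeros(D), "sigma": np.ones(D)},  # negative label y = 0
}

def log_gaussian(x, mu, sigma):
    """Element-wise log of the normal density N(x; mu, sigma)."""
    return -0.5 * np.log(2.0 * np.pi * sigma ** 2) - (x - mu) ** 2 / (2.0 * sigma ** 2)

def classification_score(x, params):
    """Naive Bayes score H(x): sum over the d feature dimensions of the
    log-likelihood ratio between the positive and negative class models
    (equal class priors assumed)."""
    pos = log_gaussian(x, params[1]["mu"], params[1]["sigma"])
    neg = log_gaussian(x, params[0]["mu"], params[0]["sigma"])
    return float(np.sum(pos - neg))
```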
The target tracking stage comprises the following steps:
1) Target detection in the k-th frame of the video
A) Centered on the target O_{k-1} tracked in frame k-1, collect candidate samples: collect in frame k the n_y candidate samples whose Euclidean distance to O_{k-1} lies within the candidate-search radius;
B) Calculate the minimum rectangular area ∪z (z ∈ z^y) that contains all candidate samples; convert this rectangular image patch to grayscale, apply rotation-invariant uniform pattern ULBP coding and vector integration in turn, finally obtaining the vector integral image I;
C) Using the rectangles of the nonzero elements of the sparse sampling matrix Θ as the scales, extract by diagonal subtraction from the vector integral image I the compressed feature value of each candidate sample, z → x (z ∈ z^y);
D) Feed the compressed feature value x_r of each candidate sample into the classifier trained in frame k-1 and compute its classification score; the r-th sample with the maximum classification score is taken as the target O_k tracked in frame k.
2) Classifier update in frame k
A) Centered on the target O_k tracked in frame k, collect positive and negative samples: collect in frame k the n_1 positive samples whose Euclidean distance satisfies z^1 = {z : 0 ≤ ||z − O_k||_{l2} ≤ r_1^+}, and collect in frame k the n_0 negative samples whose Euclidean distance to O_k lies within the negative-sample annulus;
B) Calculate the minimum rectangular area ∪z (z ∈ z^1 ∪ z^0) that contains all positive and negative samples; convert this rectangular image patch to grayscale, apply rotation-invariant uniform pattern ULBP coding and vector integration in turn, finally obtaining the vector integral image I;
C) Using the rectangles of the nonzero elements of the sparse sampling matrix Θ as the scales, extract by diagonal subtraction from the vector integral image I the compressed feature value of each positive and negative sample, z → x (z ∈ z^1 ∪ z^0);
D) Update the classifier:
μ'_{y,c} ← (1 − λ)μ_{y,c} + λ·EX_{y,c}
σ'_{y,c} ← [(1 − λ)σ²_{y,c} + λ·DX_{y,c} + λ(1 − λ)(μ_{y,c} − EX_{y,c})²]^{1/2}
where EX_{y,c} and DX_{y,c} are respectively the mean and variance of the c-th feature dimension over the positive samples (y = 1) and negative samples (y = 0), and λ is the learning rate.
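A minimal sketch of this running-average update, continuing the parameter layout of the earlier sketch (Python with NumPy; the value of the learning rate λ is not fixed by the disclosure, so 0.85 below is only an assumed default):

```python
import numpy as np

def update_classifier(params, features, label, lam=0.85):
    """Update the Gaussian parameters of one label (1 = positive, 0 = negative).
    features: array of shape (n_samples, D) holding this frame's compressed
    feature values for that label; lam is the learning rate lambda (assumed)."""
    ex = features.mean(axis=0)   # EX_{y,c}: per-dimension sample mean
    dx = features.var(axis=0)    # DX_{y,c}: per-dimension sample variance
    mu = params[label]["mu"]
    sigma = params[label]["sigma"]
    params[label]["mu"] = (1.0 - lam) * mu + lam * ex
    params[label]["sigma"] = np.sqrt((1.0 - lam) * sigma ** 2 + lam * dx
                                     + lam * (1.0 - lam) * (mu - ex) ** 2)
```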
The present invention further discloses the coding method used in the target tracking stage to convert the grayscale image I_gray into the rotation-invariant uniform pattern ULBP code image I_RULBP, which comprises the following step:
1) Take the pixel to be processed as the center pixel. The differences between its gray value g_0 and the gray values of the p pixels g_i ∈ Γ_p at distance R are binarized and concatenated into a p-bit binary number. If the number U of 0→1 or 1→0 transitions in this binary number is not greater than 2, the number of 1 bits is taken as the rotation-invariant uniform pattern ULBP code of the pixel to be processed, where U counts the transitions between circularly adjacent bits.
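By way of illustration, a minimal sketch of this per-pixel coding for p = 8, R = 1 (Python with NumPy; a neighbour bit is taken as 1 when its gray value is greater than or equal to the center, and non-uniform patterns are mapped to 0 as in the detailed description — both points are assumptions where the text leaves room):

```python
import numpy as np

def ulbp_code(patch):
    """Rotation-invariant uniform ULBP code of the centre pixel of a 3x3 patch.
    If the circular 8-bit pattern has at most two 0/1 transitions, the code is
    its number of set bits (0..8); otherwise the code is 0."""
    center = patch[1, 1]
    # 8 neighbours, starting at the right-hand pixel and ordered counter-clockwise.
    neighbours = [patch[1, 2], patch[0, 2], patch[0, 1], patch[0, 0],
                  patch[1, 0], patch[2, 0], patch[2, 1], patch[2, 2]]
    bits = [1 if g >= center else 0 for g in neighbours]
    transitions = sum(bits[i] != bits[(i + 1) % 8] for i in range(8))
    return sum(bits) if transitions <= 2 else 0

def ulbp_image(gray):
    """Encode a grayscale image; border pixels are left as code 0."""
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = ulbp_code(gray[i - 1:i + 2, j - 1:j + 2])
    return out
```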
The present invention further discloses the integration method used in the target tracking stage to convert the rotation-invariant uniform pattern ULBP code image I_RULBP into the vector integral image I, which comprises the following steps:
1) Histogram statistics: expand the code image into a 9-channel histogram image H, one channel for each of the codes 0–8, counting the occurrences of each code;
2) Vertical accumulation, whose steps are
A) flatten H column by column into the one-dimensional column vector V_C;
B) compute the cumulative sum of V_C, obtaining V_ΣC with V_ΣC(i) = Σ_{j≤i} V_C(j);
C) fold V_ΣC back column by column into an image of the same size as H, denoted H_i;
3) Horizontal accumulation, whose steps are
A) flatten H_i row by row into the one-dimensional row vector V_R;
B) compute the cumulative sum of V_R, obtaining V_ΣR with V_ΣR(i) = Σ_{j≤i} V_R(j);
C) fold V_ΣR back row by row into an image of the same size as H, denoted H_ii.
H_ii is the vector integral image I; after this processing, the statistical histogram of any sub-rectangle of the image can be obtained from H_ii by diagonal subtraction.
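A minimal sketch of the vector integral image and the diagonal subtraction (Python with NumPy; the vertical and horizontal accumulations are written here as per-axis cumulative sums, which yield the same region histograms as the flatten-accumulate-fold procedure above, since the carried-over offsets cancel in the diagonal subtraction):

```python
import numpy as np

def vector_integral_image(coded):
    """9-channel integral image of a ULBP-coded image (codes 0..8): each pixel
    is expanded into a one-hot 9-bin histogram, then the bins are accumulated
    down the columns and across the rows (H_i, then H_ii)."""
    h, w = coded.shape
    hist = np.zeros((h, w, 9), dtype=np.int64)
    for b in range(9):
        hist[:, :, b] = (coded == b)
    return hist.cumsum(axis=0).cumsum(axis=1)

def region_histogram(ii, top, left, bottom, right):
    """9-bin ULBP histogram of the inclusive rectangle [top..bottom, left..right],
    obtained by the usual integral-image diagonal subtraction."""
    total = ii[bottom, right].copy()
    if top > 0:
        total -= ii[top - 1, right]
    if left > 0:
        total -= ii[bottom, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total
```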
Brief description of the drawings
The present invention is described in further detail with reference to the accompanying drawings and detailed description.
Fig. 1 is the flow chart of the real-time tracking of video objects in dark environments based on texture features;
Fig. 2 is a schematic diagram of the convolution of the sparse sampling matrix Θ with a sample;
Fig. 3 shows the ROI region after coding;
Embodiment
The specific embodiment of the invention is described in detail with reference to the accompanying drawings. First, the basic flow of the real-time tracking method for video objects in dark environments based on texture features is described. Referring to Fig. 1, the process is divided into an initialization stage, a target tracking stage and a classifier update stage. The specific steps are as follows. Initialization stage:
1) Calculate the sparse sampling matrix Θ;
A) The product of the signal sampling matrix Φ and the sparse sensing matrix Ψ is obtained by Monte Carlo simulation: taking the rectangle of the initial target O_1 as the domain, generate 2–3 rectangle boxes of random size at random positions, all contained in O_1, and take these rectangle boxes as the nonzero elements of one row of Θ;
B) Repeat step 1A) d times (see Fig. 2) to obtain all nonzero elements of the d rows of Θ (a sketch of this generation follows these initialization steps);
2) Generate the d-dimensional naive Bayes classifier h(x);
A) Generate one two-class Bayes classifier as a weak classifier, with positive-label discriminant-curve parameters μ_{1,c} = 0, σ_{1,c} = 1 and negative-label discriminant-curve parameters μ_{0,c} = 0, σ_{0,c} = 1;
B) Repeat step 2A) d times to obtain the d cascaded weak classifiers h_c(x_c) of h(x), where c = 1, 2, ..., d;
3) Create the computation template that maps gray values to rotation-invariant uniform pattern ULBP codes;
The rotation-invariant uniform pattern ULBP operator with radius 1 is used as the RULBP computation template. Generate an array of length 256 indexed by 0–255; convert each index into its binary representation, count the number U of 0→1 or 1→0 transitions between adjacent bits, and if U ≤ 2 store the number of 1 bits of that binary index as the element value, otherwise store 0. After this operation the length-256 array contains the 9 codes 0–8;
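A minimal sketch of these two initialization steps (Python with NumPy; the rectangle representation (top, left, height, width) and the helper names are illustrative, and the weights or signs attached to the rectangular filters, which the description does not detail, are omitted):

```python
import numpy as np

rng = np.random.default_rng()

def generate_sparse_matrix(target_h, target_w, d=50):
    """Generate the d rows of the sparse sampling matrix Theta.
    Each row holds 2-3 rectangles of random size and position inside the
    initial target box; each rectangle acts as a rectangular filter on the
    vector integral image."""
    rows = []
    for _ in range(d):
        rects = []
        for _ in range(rng.integers(2, 4)):          # 2 or 3 rectangles
            h = int(rng.integers(1, target_h + 1))
            w = int(rng.integers(1, target_w + 1))
            t = int(rng.integers(0, target_h - h + 1))
            l = int(rng.integers(0, target_w - w + 1))
            rects.append((t, l, h, w))
        rows.append(rects)
    return rows

def build_ulbp_lookup():
    """Length-256 lookup table mapping an 8-bit neighbour pattern to its
    rotation-invariant uniform code: number of set bits if the circular
    pattern has at most two 0/1 transitions, otherwise 0."""
    table = np.zeros(256, dtype=np.uint8)
    for v in range(256):
        bits = [(v >> i) & 1 for i in range(8)]
        u = sum(bits[i] != bits[(i + 1) % 8] for i in range(8))
        table[v] = sum(bits) if u <= 2 else 0
    return table
```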
Target tracking stage:
1) Target detection in the k-th frame of the video
A) Centered on the top-left vertex of the target O_{k-1} tracked in frame k-1, find all pixel points {p_y} whose distance to that vertex lies within the candidate-search radius; taking each point of {p_y} as a top-left vertex and the size of O_{k-1} as the size, the resulting rectangles are the candidate samples z^y;
B) Compute the minimum rectangular area ROI containing all candidate samples (see Fig. 3) as ∪z (z ∈ z^y), where the rectangle union operator ∪ is O(l_1, r_1, t_1, b_1) ∪ O(l_2, r_2, t_2, b_2) = O(min(l_1, l_2), max(r_1, r_2), min(t_1, t_2), max(b_1, b_2));
C) Feature-encode the image patch contained in the ROI. For each pixel p in the ROI image patch, take its 8 neighbouring pixels, start from the right-hand pixel and order them counter-clockwise; the 8 ordered points are denoted q_1 to q_8. Compare each q_l in turn with the gray value of p, concatenate the 8 comparison results into an 8-bit number, and use its rotation-invariant uniform pattern ULBP code as the code of the pixel;
D) Denote the encoded ROI as the matrix H and perform the vector integration: flatten H column by column into the one-dimensional column vector V_C; compute its cumulative sum V_ΣC; fold V_ΣC back column by column into an image of the same size as H, denoted H_i; flatten H_i row by row into the one-dimensional row vector V_R; compute its cumulative sum V_ΣR; fold V_ΣR back row by row into an image of the same size as H, denoted H_ii;
E) Each row of the sparse sampling matrix Θ generated during initialization holds 2–3 nonzero elements, each of which is a rectangular filter. These filters filter the candidate sample z_r, and the sum of their responses is one feature dimension x_{r,c}; carrying out the same operation for all d rows of Θ yields the d-dimensional feature of the candidate sample z_r, x_r = (x_{r,1}, x_{r,2}, ..., x_{r,d}) (see the sketch after this list);
F) Compute the features of all candidate samples z_r ∈ z^y. Use the naive Bayes classifier h(x_r; k-1) trained in frame k-1 to classify the feature of each candidate sample and compute its classification score; the candidate sample with the maximum classification score is taken as the target O_k tracked in frame k.
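A minimal sketch of steps B) and E) (Python with NumPy, reusing region_histogram from the vector-integral sketch above; how the 9-bin region histogram of a filter rectangle is reduced to a scalar is not spelled out in the description, so weighting each bin by its code value 0–8 is only one plausible, assumed reading):

```python
import numpy as np

def bounding_box(rects):
    """Smallest rectangle (l, t, r, b) containing every sample rectangle;
    this is the ROI over which ULBP coding and vector integration are done."""
    ls, ts, rs, bs = zip(*rects)
    return min(ls), min(ts), max(rs), max(bs)

def compressed_features(ii, sample_top_left, theta_rows):
    """d-dimensional compressed feature of one sample.
    ii: 9-channel vector integral image of the ROI (see region_histogram above);
    sample_top_left: (top, left) offset of the sample inside the ROI;
    theta_rows: rows of the sparse sampling matrix, each a list of
    (top, left, height, width) filter rectangles."""
    top0, left0 = sample_top_left
    code_weights = np.arange(9)   # assumed reduction: weight each bin by its code value
    feats = []
    for rects in theta_rows:
        value = 0.0
        for (t, l, h, w) in rects:
            hist = region_histogram(ii, top0 + t, left0 + l,
                                    top0 + t + h - 1, left0 + l + w - 1)
            value += float(np.dot(hist, code_weights))
        feats.append(value)
    return np.array(feats)
```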
2) Classifier update in frame k
A) Centered on the top-left vertex of the target O_k tracked in frame k, find all pixel points {p_1} whose distance to that vertex lies within the positive-sample radius; taking each point of {p_1} as a top-left vertex and the size of O_k as the size, the resulting rectangles are the positive samples z^1;
B) Centered on the top-left vertex of the target O_k tracked in frame k, find all pixel points {p_0} whose distance to that vertex lies within the negative-sample annulus; taking each point of {p_0} as a top-left vertex and the size of O_k as the size, the resulting rectangles are the negative samples z^0;
C) Compute the minimum rectangular area ROI containing all positive and negative samples (see Fig. 2) as ∪z (z ∈ z^1 ∪ z^0); the rectangle union operator ∪ is the same as in step 1B) of the frame-k target detection of the target tracking stage;
D) Repeat step 1C) of the frame-k target detection of the target tracking stage;
E) Repeat step 1D) of the frame-k target detection of the target tracking stage;
F) Each row of the sparse sampling matrix Θ generated during initialization holds 2–3 nonzero elements, each of which is a rectangular filter. These filters filter each positive or negative sample z_r, and the sum of their responses is one feature dimension x_{r,c}; carrying out the same operation for all d rows of Θ yields the d-dimensional feature of the positive or negative sample z_r, x_r = (x_{r,1}, x_{r,2}, ..., x_{r,d});
G) Compute the features of all positive samples z_r ∈ z^1, each of which is the d-dimensional vector x_r = (x_{r,1}, x_{r,2}, ..., x_{r,d}); compute the mean EX_{1,c} and variance DX_{1,c} of the positive-sample features, where c = 1, 2, ..., d;
H) Compute the features of all negative samples z_r ∈ z^0, each of which is the d-dimensional vector x_r = (x_{r,1}, x_{r,2}, ..., x_{r,d}); compute the mean EX_{0,c} and variance DX_{0,c} of the negative-sample features, where c = 1, 2, ..., d;
I) Update all weak classifiers:
μ'_{y,c} ← (1 − λ)μ_{y,c} + λ·EX_{y,c}
σ'_{y,c} ← [(1 − λ)σ²_{y,c} + λ·DX_{y,c} + λ(1 − λ)(μ_{y,c} − EX_{y,c})²]^{1/2}
where c = 1, 2, ..., d, y ∈ {0, 1}, and λ is the learning rate (a sketch of this per-frame update follows).
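A minimal sketch of the whole per-frame update, tying together the helpers sketched earlier (compressed_features and update_classifier; Python with NumPy, and the learning rate remains an assumed value):

```python
import numpy as np

def update_frame(params, ii, pos_top_lefts, neg_top_lefts, theta_rows, lam=0.85):
    """Per-frame classifier update: extract the compressed features of the
    positive and negative samples drawn around the tracked target, then refresh
    the per-label Gaussian parameters with the running-average rule."""
    pos_feats = np.stack([compressed_features(ii, p, theta_rows) for p in pos_top_lefts])
    neg_feats = np.stack([compressed_features(ii, p, theta_rows) for p in neg_top_lefts])
    update_classifier(params, pos_feats, label=1, lam=lam)  # positive label y = 1
    update_classifier(params, neg_feats, label=0, lam=lam)  # negative label y = 0
```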

Claims (3)

1. A real-time tracking method for video objects in dark environments based on texture features, characterized by comprising an initialization stage and a target tracking stage, the initialization stage comprising the following steps:
1) During initialization, calculate the sparse sampling matrix Θ:
A) calculate the signal sampling matrix Φ;
B) calculate the sparse sensing matrix Ψ;
C) calculate the sparse sampling matrix Θ, where Θ = ΨΦ;
2) Create a two-class naive Bayes classifier H(x) formed by cascading 50 Bayes classifiers; each weak classifier h_c(x_c) is based on two normal distributions, N(μ_{1,c}, σ_{1,c}) for the positive label y = 1 and N(μ_{0,c}, σ_{0,c}) for the negative label y = 0, where (μ_{y,c}, σ_{y,c}) are the parameters of the normal discriminant curve of the weak classifier for label y on the c-th feature dimension;
The target tracking stage comprises the following steps:
1) Target detection in the k-th frame of the video
A) Centered on the target O_{k-1} tracked in frame k-1, collect candidate samples: collect in frame k the n_y candidate samples whose Euclidean distance to O_{k-1} lies within the candidate-search radius;
B) Calculate the minimum rectangular area ∪z (z ∈ z^y) that contains all candidate samples; convert this rectangular image patch to grayscale, apply rotation-invariant uniform pattern ULBP coding and vector integration in turn, finally obtaining the vector integral image I;
C) Using the rectangles of the nonzero elements of the sparse sampling matrix Θ as the scales, extract by diagonal subtraction from the vector integral image I the compressed feature value of each candidate sample, z → x (z ∈ z^y);
D) Feed the compressed feature value x_r of each candidate sample into the naive Bayes classifier trained in frame k-1, classify the feature of each candidate sample and compute its classification score; the r-th sample whose x_r has the maximum classification score is the target O_k tracked in frame k;
2) Classifier update in frame k
A) Centered on the target O_k tracked in frame k, collect positive and negative samples: collect in frame k the n_1 positive samples whose Euclidean distance satisfies z^1 = {z : 0 ≤ ||z − O_k||_{l2} ≤ r_1^+}, and collect in frame k the n_0 negative samples whose Euclidean distance to O_k lies within the negative-sample annulus;
B) Calculate the minimum rectangular area ∪z (z ∈ z^1 ∪ z^0) that contains all positive and negative samples; convert this rectangular image patch to grayscale, apply rotation-invariant uniform pattern ULBP coding and vector integration in turn, finally obtaining the vector integral image I;
C) Using the rectangles of the nonzero elements of the sparse sampling matrix Θ as the scales, extract by diagonal subtraction from the vector integral image I the compressed feature value of each positive and negative sample, z → x (z ∈ z^1 ∪ z^0);
D) Update the classifier:
μ'_{y,c} ← (1 − λ)μ_{y,c} + λ·EX_{y,c}
σ'_{y,c} ← [(1 − λ)σ²_{y,c} + λ·DX_{y,c} + λ(1 − λ)(μ_{y,c} − EX_{y,c})²]^{1/2}
where EX_{y,c} and DX_{y,c} are respectively the mean and variance of the c-th feature dimension over the positive samples (y = 1) and negative samples (y = 0), and λ is the learning rate.
2. The real-time tracking method for video objects in dark environments based on texture features according to claim 1, characterized in that the coding method used in the target tracking stage to convert the grayscale image I_gray into the rotation-invariant uniform pattern ULBP code image I_RULBP is:
take the pixel to be processed as the center pixel; the differences between its gray value g_0 and the gray values of the p pixels g_i ∈ Γ_p at distance R are binarized and concatenated into a p-bit binary number; if the number U of 0→1 or 1→0 transitions in this binary number is not greater than 2, the number of 1 bits is taken as the rotation-invariant uniform pattern ULBP code of the pixel to be processed, where U counts the transitions between circularly adjacent bits.
3. The real-time tracking method for video objects in dark environments based on texture features according to claim 1, characterized in that the integration method used in the target tracking stage to convert the rotation-invariant uniform pattern ULBP code image I_RULBP into the vector integral image I comprises the following steps:
1) Histogram statistics: expand the code image into a 9-channel histogram image H, one channel for each of the codes 0–8, counting the occurrences of each code;
2) Vertical accumulation, whose steps are
A) flatten H column by column into the one-dimensional column vector V_C;
B) compute the cumulative sum of V_C, obtaining V_ΣC with V_ΣC(i) = Σ_{j≤i} V_C(j);
C) fold V_ΣC back column by column into an image of the same size as H, denoted H_i;
3) Horizontal accumulation, whose steps are
A) flatten H_i row by row into the one-dimensional row vector V_R;
B) compute the cumulative sum of V_R, obtaining V_ΣR with V_ΣR(i) = Σ_{j≤i} V_R(j);
C) fold V_ΣR back row by row into an image of the same size as H, denoted H_ii.
H_ii is the vector integral image I; after this processing, the statistical histogram of any sub-rectangle of the image can be obtained from H_ii by diagonal subtraction.
CN201410610358.6A 2014-11-04 2014-11-04 Real-time tracking method for video objects in dark environments based on texture features Active CN104463192B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410610358.6A CN104463192B (en) 2014-11-04 2014-11-04 Real-time tracking method for video objects in dark environments based on texture features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410610358.6A CN104463192B (en) 2014-11-04 2014-11-04 Real-time tracking method for video objects in dark environments based on texture features

Publications (2)

Publication Number Publication Date
CN104463192A CN104463192A (en) 2015-03-25
CN104463192B true CN104463192B (en) 2018-01-05

Family

ID=52909206

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410610358.6A Active CN104463192B (en) 2014-11-04 2014-11-04 Real-time tracking method for video objects in dark environments based on texture features

Country Status (1)

Country Link
CN (1) CN104463192B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915562A (en) * 2012-09-27 2013-02-06 天津大学 Compressed sensing-based multi-view target tracking and 3D target reconstruction system and method
CN103310466A (en) * 2013-06-28 2013-09-18 安科智慧城市技术(中国)有限公司 Single target tracking method and achievement device thereof
CN103632382A (en) * 2013-12-19 2014-03-12 中国矿业大学(北京) Compressive sensing-based real-time multi-scale target tracking method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7940843B1 (en) * 2002-12-16 2011-05-10 Apple Inc. Method of implementing improved rate control for a multimedia compression and encoding system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915562A (en) * 2012-09-27 2013-02-06 天津大学 Compressed sensing-based multi-view target tracking and 3D target reconstruction system and method
CN103310466A (en) * 2013-06-28 2013-09-18 安科智慧城市技术(中国)有限公司 Single target tracking method and achievement device thereof
CN103632382A (en) * 2013-12-19 2014-03-12 中国矿业大学(北京) Compressive sensing-based real-time multi-scale target tracking method

Also Published As

Publication number Publication date
CN104463192A (en) 2015-03-25

Similar Documents

Publication Publication Date Title
CN106778595B (en) Method for detecting abnormal behaviors in crowd based on Gaussian mixture model
CN104361353B (en) A kind of application of localization method of area-of-interest in instrument monitoring identification
CN108346159A (en) A kind of visual target tracking method based on tracking-study-detection
CN102722712B (en) Multiple-scale high-resolution image object detection method based on continuity
CN102496001A (en) Method of video monitor object automatic detection and system thereof
Kang et al. Deep learning-based weather image recognition
CN104778457A (en) Video face identification algorithm on basis of multi-instance learning
Yuan et al. Learning to count buildings in diverse aerial scenes
CN102945378A (en) Method for detecting potential target regions of remote sensing image on basis of monitoring method
Luo et al. Traffic analytics with low-frame-rate videos
CN103886585A (en) Video tracking method based on rank learning
CN107480585A (en) Object detection method based on DPM algorithms
Pan et al. Adaptive center pixel selection strategy in local binary pattern for texture classification
CN103902989A (en) Human body motion video recognition method based on non-negative matrix factorization
CN106846363A (en) A kind of scale adaptability compression tracking for improving sparse matrix
Caetano et al. Optical Flow Co-occurrence Matrices: A novel spatiotemporal feature descriptor
CN105809206A (en) Pedestrian tracking method
Tao et al. Smoke vehicle detection based on spatiotemporal bag-of-features and professional convolutional neural network
CN106815562A (en) A kind of pedestrian detection tracking based on compressive features
Sun et al. Exploiting deeply supervised inception networks for automatically detecting traffic congestion on freeway in China using ultra-low frame rate videos
CN110570450B (en) Target tracking method based on cascade context-aware framework
Papakostas et al. Thermal infrared face recognition based on lattice computing (LC) techniques
CN102156879B (en) Human target matching method based on weighted terrestrial motion distance
Saha et al. Neural network based road sign recognition
Samsami et al. Classification of the air quality level based on analysis of the sky images

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant