CN106991689A - Target tracking method based on FHOG and color features with GPU acceleration - Google Patents

Target tracking method based on FHOG and color features with GPU acceleration

Info

Publication number
CN106991689A
CN106991689A (application CN201710216523.3A)
Authority
CN
China
Prior art keywords
image
fhog
feature
parallel
gpu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710216523.3A
Other languages
Chinese (zh)
Other versions
CN106991689B (en)
Inventor
李云松 (Li Yunsong)
刘金花 (Liu Jinhua)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201710216523.3A priority Critical patent/CN106991689B/en
Publication of CN106991689A publication Critical patent/CN106991689A/en
Application granted granted Critical
Publication of CN106991689B publication Critical patent/CN106991689B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/28Indexing scheme for image data processing or generation, in general involving image processing hardware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a target tracking method based on FHOG and color features with GPU acceleration, which achieves high-speed and accurate tracking of a target in video. By extracting a combined feature of FHOG, color-naming base colors and hue-saturation, the invention improves target tracking accuracy; by using a 7-level adaptive scale transformation with a common difference of 0.006, it improves tracking accuracy for targets in scale-change scenes; and by using a graphics processing unit (GPU) to accelerate the improved KCF target tracking algorithm in parallel, it greatly improves tracking speed.

Description

Target tracking method based on FHOG and color features with GPU acceleration
Technical field
The invention belongs to the field of computer technology and further relates to a target tracking method based on FHOG and color features with GPU acceleration in the technical field of computer video target tracking. It is mainly used for real-time, accurate tracking of video targets.
Background Art
High-performance target tracking is a core technology of computer vision. Current target tracking methods fall into two classes. The first class is tracking based on feature matching: a feature that can represent the target is constructed, and the target position is determined by the degree of matching between features. The second class is tracking based on separating the target from the background: a classifier that can separate target and background is learned with machine-learning methods, usually through an online training process, and the target position is determined by the learned classifier. By comparison, the former is computationally simple but cannot cope well with factors such as illumination changes, occlusion and scale changes. The latter solves, to some extent, the problems encountered by the former and is more robust, but its computational complexity is higher. Tracking based on target-background separation is the current mainstream approach.
The patent application "Struck target tracking method using GPU hardware for acceleration" filed by Xidian University (filing date: March 14, 2015; application number: 201510112791.1; publication number: CN 104680558 A) discloses a GPU hardware-accelerated Struck target tracking method. The method uses a structured support vector machine (structured SVM) model to learn a classifier that distinguishes target from background, determines the target position with the learned classifier, and improves tracking speed by using GPU parallel computation. Its shortcoming is that, because it relies on the structured SVM model, tracking accuracy is not high and tracking speed is still slow.
The patent application "A real-time video tracking method" (filing date: May 13, 2016; application number: 201610314297.8; publication number: CN 106023248 A) discloses a method that compresses image features by dividing the tracked target into blocks and uses the KCF (kernelized correlation filter) algorithm to compute the correlation between feature vectors for video tracking. Although this method performs well and meets real-time requirements in common scenes, its shortcomings are that it uses a combination of gray-level histogram and chroma histogram features, so tracking accuracy is not high, and that feature extraction, model training and target detection are performed serially, so processing is slow.
In 2014, Henriques, J.F., Caseiro, R., Martins, P., and Batista, J. published the paper "High-Speed Tracking with Kernelized Correlation Filters" (IEEE Transactions on Pattern Analysis and Machine Intelligence), which proposes a tracking method based on the two-dimensional Fourier transform, i.e., the KCF algorithm. The algorithm extracts FHOG features, builds the classifier's training samples by cyclic shifts, and uses the properties of circulant matrices to transform the solution of the problem into the Fourier domain, which reduces algorithmic complexity and speeds up tracking to some extent. However, the algorithm trains and detects with serial computation, so tracking speed is still not fast enough; it uses only FHOG features, so tracking accuracy is not high; and it cannot adapt to target scale changes.
Summary of the Invention
The object of the invention is to overcome the above shortcomings of the prior art and to provide a GPU-accelerated target tracking method based on FHOG and color features, which achieves high-speed and accurate tracking of a target in video.
By extracting a combined feature of FHOG, color-naming base colors and hue-saturation, the invention improves target tracking accuracy; by using a 7-level adaptive scale transformation with a common difference of 0.006, it improves tracking accuracy for targets in scale-change scenes; and by using a graphics processing unit (GPU) to accelerate the improved KCF target tracking algorithm in parallel, it greatly improves tracking speed.
To achieve the above object, the present invention basically comprises the following steps:
1. A target tracking method based on FHOG and color features with GPU acceleration, comprising the following steps:
(1) Acquire an image and transfer it to the GPU:
(1a) load one frame of the image sequence to be tracked into host memory;
(1b) copy the image loaded into host memory into GPU memory;
(2) Judge whether the acquired image is the first frame of the image sequence to be tracked; if so, go to step (12); otherwise, go to step (3);
(3) Judge whether adaptive scale transformation is used; if so, go to step (4); otherwise, go to step (8);
(4) Multi-scale image extraction:
(4a) extract candidate-region images at 7 scale levels;
(4b) scale the images to a specified size to obtain the search rectangle images;
(5) Extract FHOG and color features in parallel:
(5a) extract FHOG features in parallel;
(5b) extract color-naming base-color features in parallel;
(5c) extract HS hue-saturation features in parallel;
(5d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering;
(6) Compute the cross-correlation matrix and the maximum response;
(7) Compute the maximum response over all scales:
(7a) take the maximum of the responses f_zmax of all 7 scales as the final maximum response f_max;
(7b) go to step (11);
(8) Extract the image in the search rectangle:
extend the target rectangle to obtain the search rectangle, and extract the target image in the search rectangle from the image to be detected;
(9) Extract FHOG and color features in parallel:
(9a) extract FHOG features in parallel;
(9b) extract color-naming base-color features in parallel;
(9c) extract HS hue-saturation features in parallel;
(9d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering;
(10) Compute the cross-correlation matrix and the maximum response;
(11) Update the target rectangle:
(11a) update the target rectangle of the tracked target with the coordinates corresponding to the maximum response;
(11b) go to step (13);
(12) Initialize the target rectangle:
select from the input image a rectangle containing the tracked target, and take the selected rectangle as the target rectangle of the tracked target;
(13) Extract the image in the search rectangle:
extend the target rectangle to obtain the search rectangle, and extract the target image in the search rectangle from the image to be detected;
(14) Extract FHOG and color features in parallel:
(14a) extract FHOG features in parallel;
(14b) extract color-naming base-color features in parallel;
(14c) extract HS hue-saturation features in parallel;
(14d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering;
(15) Compute the autocorrelation matrix;
(16) Update the tracking-model parameters;
(17) Judge whether the images of all frames have been loaded; if so, go to step (18); otherwise, go to step (1);
(18) End tracking.
Compared with the prior art, the present invention has the following advantages:
First, through steps (5), (9) and (14) the present invention extracts a combined FHOG, color-naming base-color and hue-saturation feature of the tracked-target image and accelerates the extraction in parallel on the GPU, which overcomes the low tracking accuracy and slow feature extraction of the prior art and improves tracking speed while improving tracking accuracy.
Second, through steps (4), (5), (6) and (7) the present invention uses a 7-level adaptive scale transformation with a common difference of 0.006 and accelerates it in parallel on the GPU, which improves tracking speed while improving tracking accuracy.
Third, the present invention uses a graphics processing unit (GPU) to accelerate the improved target tracking algorithm in parallel, which greatly improves tracking speed compared with CPU processing.
Brief description of the drawings
Fig. 1 is the flowchart of the present invention.
Detailed Description of the Embodiments
The present invention is described in detail below with reference to the accompanying drawings.
Referring to Fig. 1, the steps for implementing the present invention are described in detail.
Step 1. Acquire an image and transfer it to the GPU.
Load one frame of the image sequence to be tracked into host memory, then copy the image from host memory into GPU memory.
Step 2. Judge whether the acquired image is the first frame of the image sequence to be tracked; if so, go to step 12; otherwise, go to step 3.
Step 3. Judge whether adaptive scale transformation is used; if so, go to step 4; otherwise, go to step 8.
Step 4. Multi-scale image extraction.
First, extract candidate-region images at 7 scale levels; the scale coefficients of the 7 levels form an arithmetic progression with a common difference of 0.006: 0.982, 0.988, 0.994, 1.000, 1.006, 1.012, 1.018.
Then, scale the images to a specified size: for each scale coefficient Lp, extend the width and height of the scaled rectangle by a factor of 2.5 to obtain the scale search rectangle, whose width is 2.5*Lp*W_ori and height is 2.5*Lp*H_ori, where W_ori is the width of the target rectangle and H_ori is its height; extract the tracked-target image in the scale search rectangle from the frame to be detected.
Finally, scale the tracked-target image by linear interpolation to the search rectangle of width 2.5*W_ori and height 2.5*H_ori.
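As a concrete illustration of this multi-scale step, the following host-side CUDA/C++ sketch computes the 7 scale coefficients and the corresponding scale search rectangles; the struct and function names are illustrative and not taken from the patent.

```cpp
// Minimal host-side sketch of step 4 (assumed names; not code from the patent).
#include <cstdio>

struct Rect { float x, y, w, h; };  // target rectangle: center x, y, width, height

// 7 scale coefficients: arithmetic progression with common difference 0.006.
void scaleCoefficients(float coeffs[7]) {
    for (int i = 0; i < 7; ++i)
        coeffs[i] = 1.0f + 0.006f * (i - 3);   // 0.982 ... 1.018
}

// Scale search rectangle for coefficient Lp: width 2.5*Lp*W_ori, height 2.5*Lp*H_ori,
// centered on the current target rectangle.
Rect scaleSearchRect(const Rect& target, float Lp) {
    Rect r;
    r.w = 2.5f * Lp * target.w;
    r.h = 2.5f * Lp * target.h;
    r.x = target.x;               // keep the same center
    r.y = target.y;
    return r;
}

int main() {
    float c[7];
    scaleCoefficients(c);
    Rect target = {320.0f, 240.0f, 60.0f, 80.0f};   // example target box
    for (int i = 0; i < 7; ++i) {
        Rect s = scaleSearchRect(target, c[i]);
        std::printf("scale %.3f -> search %.1f x %.1f\n", c[i], s.w, s.h);
    }
    return 0;
}
```

Each of the 7 windows would then be cropped, resized by linear interpolation to 2.5*W_ori by 2.5*H_ori, and passed to the feature-extraction kernels of step 5.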
Step 5. Extract FHOG and color features in parallel.
The FHOG and color features are the 31-dimensional FHOG feature, the 11-dimensional color-naming base-color feature and the 2-dimensional HS hue-saturation feature, concatenated into a 44-dimensional feature.
(5a) Extract FHOG features in parallel.
First step: convert the image into a gray-level image.
Second step: compute gradients and accumulate the gradient histogram to obtain the 18-dimensional orientation-sensitive feature.
(1) Compute gradients. Using the horizontal gradient operator [-1, 0, 1] and the vertical gradient operator [-1, 0, 1]^T, compute the horizontal gradient G_x and the vertical gradient G_y in parallel. Compute the gradient magnitude G_m = sqrt(G_x^2 + G_y^2); if the magnitude equals 0, correct it to 1e-10f, otherwise keep it unchanged. Using a lookup table indexed by the result of G_y/G_x, compute the gradient orientation G_o.
(2) Discretize the gradient orientations in parallel. Divide the orientation range 0 to 2π evenly into 18 intervals, labeled interval 1, interval 2, ..., interval 18, and assign each gradient orientation G_o to the nearest interval.
(3) Compute the gradient orientation histogram in parallel. Pad the border of the gradient image; each thread computes a gradient histogram independently and splits it into 18 layers by orientation. After obtaining the (W_b+2)*(H_b+2)*18 histogram data, trim the border of the histogram to finally obtain the 18-interval orientation-sensitive feature of total size W_b*H_b*18, where W_b and H_b are the image width and height divided by the cell size and rounded down (the symbol ⌊·⌋ denotes the floor operation).
Third step: normalize the features.
(1) Compute the gradient energies. The 9-dimensional orientation-insensitive feature vector D(i,j) is obtained from the 18-dimensional orientation-sensitive feature vector as
Dk(i,j) = Ck(i,j) + Ck+9(i,j),
where Dk(i,j) is the k-th component of D(i,j), Ck(i,j) is the k-th component of the 18-dimensional orientation-sensitive feature vector C(i,j), and k = 0, 1, 2, ..., 8. The gradient energies are computed as
Nδ,γ(i,j) = sqrt( ||D(i,j)||^2 + ||D(i+δ,j)||^2 + ||D(i,j+γ)||^2 + ||D(i+δ,j+γ)||^2 + ε ),
where Nδ,γ(i,j) is the gradient energy and δ,γ ∈ {-1,1}, so each (i,j) corresponds to four gradient energies, N-1,-1(i,j), N+1,-1(i,j), N+1,+1(i,j) and N-1,+1(i,j); ε is a very small number.
Using GPU kernel fusion, the computation of the 9-dimensional orientation-insensitive feature vector Dk(i,j) and of the gradient energy Nδ,γ(i,j) is placed in the same GPU kernel, i.e. the energy is computed directly from C with the fused formula
Nδ,γ(i,j) = sqrt( Σk (Ck(i,j)+Ck+9(i,j))^2 + Σk (Ck(i+δ,j)+Ck+9(i+δ,j))^2 + Σk (Ck(i,j+γ)+Ck+9(i,j+γ))^2 + Σk (Ck(i+δ,j+γ)+Ck+9(i+δ,j+γ))^2 + ε ),
where each sum runs over k = 0, 1, 2, ..., 8.
(2) Compute the normalized 18-dimensional orientation-sensitive feature vector R1 according to
R1k(i,j) = Σδ,γ∈{-1,1} Tα( Ck(i,j) / Nδ,γ(i,j) ),
where C(i,j) is the 18-dimensional orientation-sensitive feature vector; Tα is the truncation function, which assigns 0.2 to any value greater than 0.2; Nδ,γ(i,j) is the gradient energy, with δ,γ ∈ {-1,1}; and ε is a very small number.
(3) Using GPU loop fusion, in the same loop that computes R1, compute the normalized 9-dimensional orientation-insensitive feature vector R2 with the expression
R2k(i,j) = Tα( R1k(i,j) + R1k+9(i,j) ),
where R1k(i,j) denotes the k-th component of R1(i,j), k = 0, 1, 2, ..., 8.
(4) In the same loop, while computing R1, accumulate the 18 components R1k(i,j), k = 0, 1, 2, ..., 17, to obtain the 4-dimensional texture feature vector R3.
Fourth step: concatenate the normalized orientation-sensitive feature R1, the orientation-insensitive feature R2 and the texture feature vector R3 to obtain the 31-dimensional FHOG feature.
In the above processing steps, the parallelization scheme is as follows: each thread block extracts features in parallel using 16 × 16 threads; if the data block size is w × h, there are m × n thread blocks in total, where m = ⌈w/16⌉ and n = ⌈h/16⌉.
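To make the 16 × 16 thread-block layout and the gradient step of (5a) concrete, here is a minimal CUDA sketch of a per-pixel gradient kernel and its launch configuration, assuming a single-channel float image; the kernel and buffer names are illustrative, boundary handling is simplified, and the atan2f-based quantization stands in for the lookup table mentioned in the patent.

```cpp
// Illustrative CUDA kernel for step (5a), second step: per-pixel gradient
// magnitude and 18-bin orientation (assumed names; not code from the patent).
#include <cuda_runtime.h>
#include <math.h>

__global__ void gradientKernel(const float* gray, float* mag, int* bin,
                               int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x <= 0 || y <= 0 || x >= w - 1 || y >= h - 1) return;

    // [-1, 0, 1] horizontal operator and its transpose for the vertical one.
    float gx = gray[y * w + (x + 1)] - gray[y * w + (x - 1)];
    float gy = gray[(y + 1) * w + x] - gray[(y - 1) * w + x];

    float m = sqrtf(gx * gx + gy * gy);
    if (m == 0.0f) m = 1e-10f;          // avoid a zero magnitude
    mag[y * w + x] = m;

    // Orientation in [0, 2*pi), quantized into 18 equal intervals
    // (the patent uses a lookup table instead of atan2f).
    float o = atan2f(gy, gx);
    if (o < 0.0f) o += 2.0f * 3.14159265f;
    int b = (int)(o * 18.0f / (2.0f * 3.14159265f));
    bin[y * w + x] = b < 18 ? b : 17;
}

void launchGradient(const float* gray, float* mag, int* bin, int w, int h) {
    dim3 block(16, 16);                              // 16 x 16 threads per block
    dim3 grid((w + block.x - 1) / block.x,           // m = ceil(w/16)
              (h + block.y - 1) / block.y);          // n = ceil(h/16)
    gradientKernel<<<grid, block>>>(gray, mag, bin, w, h);
}
```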
(5b) Extract color-naming base-color features in parallel.
The original image is divided into blocks of size 16*16; each image block is processed by one thread block on the GPU, and each thread computes the color-naming feature of one pixel. The RGB image is converted with a lookup table into the 11-dimensional color-naming feature F2.
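A minimal CUDA sketch of the per-pixel color-naming lookup follows; the 32 × 32 × 32 table layout (each RGB channel quantized to 5 bits, as in the common color-naming implementation) is an assumption, since the patent only states that a lookup table maps RGB to the 11-dimensional feature.

```cpp
// Illustrative kernel for step (5b): one thread per pixel, one 16x16 thread
// block per 16x16 image block. The w2c table layout (32*32*32 x 11) is assumed.
#include <cuda_runtime.h>

__global__ void colorNamingKernel(const unsigned char* rgb,   // interleaved RGB
                                  const float* w2c,           // 32768 x 11 table
                                  float* cn,                  // 11 feature planes
                                  int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    int idx = (y * w + x) * 3;
    // Quantize each channel to 5 bits and form the lookup index.
    int r = rgb[idx + 0] >> 3;
    int g = rgb[idx + 1] >> 3;
    int b = rgb[idx + 2] >> 3;
    int lut = (r * 32 + g) * 32 + b;

    // Write the 11-dimensional color-naming feature F2 for this pixel.
    for (int k = 0; k < 11; ++k)
        cn[k * w * h + y * w + x] = w2c[lut * 11 + k];
}
```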
(5c) Extract HS hue-saturation features in parallel.
First step: normalize the image.
Second step: convert the image from RGB space to HSI space according to
θ = arccos( 0.5*((R-G)+(R-B)) / sqrt((R-G)^2 + (R-B)*(G-B)) ), H = θ if B ≤ G, otherwise H = 2π − θ, S = 1 − 3*min(R,G,B)/(R+G+B),
and concatenate the hue H and saturation S features to obtain the 2-dimensional hue-saturation feature.
A lookup table is used when computing the inverse cosine function; when computing the denominator sqrt((R-G)^2 + (R-B)*(G-B)), if the result is 0 it must be corrected to 1e-10f.
In the above processing steps, the parallelization scheme is as follows: each thread block extracts features in parallel using 16 × 16 threads; if the data block size is w × h, there are m × n thread blocks in total, where m = ⌈w/16⌉ and n = ⌈h/16⌉.
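A minimal CUDA sketch of this per-pixel hue-saturation computation follows; it uses the standard RGB-to-HSI relations reconstructed above, calls acosf directly where the patent uses a lookup table, and all names are illustrative rather than taken from the patent.

```cpp
// Illustrative kernel for step (5c): per-pixel hue and saturation from
// normalized RGB (values in [0,1]).
#include <cuda_runtime.h>
#include <math.h>

__global__ void hueSaturationKernel(const float* rgb,   // interleaved, normalized
                                    float* hs,          // 2 feature planes: H, S
                                    int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    int idx = (y * w + x) * 3;
    float r = rgb[idx + 0], g = rgb[idx + 1], b = rgb[idx + 2];

    float num = 0.5f * ((r - g) + (r - b));
    float den = sqrtf((r - g) * (r - g) + (r - b) * (g - b));
    if (den == 0.0f) den = 1e-10f;                 // guard against division by zero
    float theta = acosf(fminf(fmaxf(num / den, -1.0f), 1.0f));
    float H = (b <= g) ? theta : 2.0f * 3.14159265f - theta;

    float mn = fminf(r, fminf(g, b));
    float sum = r + g + b;
    float S = (sum > 0.0f) ? 1.0f - 3.0f * mn / sum : 0.0f;

    hs[0 * w * h + y * w + x] = H;                 // hue plane
    hs[1 * w * h + y * w + x] = S;                 // saturation plane
}
```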
(5d) Concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering.
Step 6. Compute the cross-correlation matrix and the maximum response.
First step: call the GPU library Fourier transform function cufftExecC2C to transform the extracted image feature z_f into the Fourier domain.
Second step: compute the cross-correlation matrix k^xz' with a Gaussian kernel:
k^xz' = exp( -(1/σ^2) * ( ||x||^2 + ||z||^2 − 2*F^{-1}( Σc conj(x_f,c) ⊙ z_f,c ) ) ),
where σ is the Gaussian kernel bandwidth, ⊙ denotes element-wise multiplication, the sum runs over the feature channels c, conj(·) is the complex conjugate and F^{-1} is the inverse Fourier transform. When computing the squared L2 norms ||x||^2 and ||z||^2 of x and z, shared memory is used and the summation is accelerated with a GPU reduction algorithm based on an alternating strategy.
Third step: compute the response from the cross-correlation matrix k^xz' and the model parameter α'. Call the GPU library Fourier transform function cufftExecC2C to transform the response back to the spatial domain and compute the maximum response f_zmax. When computing the maximum, shared memory is used and the maximum search is accelerated with a GPU reduction algorithm based on an alternating strategy.
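As a hedged sketch of the GPU pieces named in this step, the following shows a cufftExecC2C forward transform of one feature channel and a shared-memory max reduction of the kind used to find f_zmax; plan handling is simplified, buffer names are illustrative, and only the standard cuFFT calls (cufftPlan2d, cufftExecC2C, cufftDestroy) are used.

```cpp
// Illustrative sketch for step 6: Fourier transform of a feature channel with
// cuFFT and a shared-memory max reduction over the response map.
#include <cufft.h>
#include <cuda_runtime.h>
#include <float.h>

// Forward FFT of one w x h complex feature channel (in place).
void fftChannel(cufftComplex* d_data, int w, int h) {
    cufftHandle plan;
    cufftPlan2d(&plan, h, w, CUFFT_C2C);
    cufftExecC2C(plan, d_data, d_data, CUFFT_FORWARD);
    cufftDestroy(plan);
}

// Max reduction: launch with 256 threads per block; each block writes its
// partial maximum, and the partial results are reduced again (or on the host).
__global__ void maxReduce(const float* resp, float* blockMax, int n) {
    __shared__ float s[256];
    int tid = threadIdx.x;
    int i = blockIdx.x * blockDim.x + tid;
    s[tid] = (i < n) ? resp[i] : -FLT_MAX;
    __syncthreads();
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride) s[tid] = fmaxf(s[tid], s[tid + stride]);
        __syncthreads();
    }
    if (tid == 0) blockMax[blockIdx.x] = s[0];
}
```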
Step 7. Compute the maximum response over all scales:
(7a) take the maximum of the responses f_zmax of all 7 scales as the final maximum response f_max;
(7b) go to step 11.
Step 8. Extract the image in the search rectangle.
Extend the width and height of the target rectangle by a factor of 2.5 to obtain the search rectangle, and extract the tracked-target image in the search rectangle from the image to be detected; its width is 2.5*W_ori and its height is 2.5*H_ori.
Step 9. Extract FHOG and color features in parallel.
(9a) extract FHOG features in parallel;
(9b) extract color-naming base-color features in parallel;
(9c) extract HS hue-saturation features in parallel;
(9d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering.
Step 10. Compute the cross-correlation matrix and the maximum response.
Step 11. Update the target rectangle.
(11a) update the target rectangle of the tracked target with the coordinates of the maximum-response point, and extend the width and height of the target rectangle by a factor of 2.5 to obtain the search rectangle;
(11b) go to step 13.
Step 12. Initialize the target rectangle.
Select from the input image a rectangle containing the tracked target, and take the selected rectangle as the target rectangle of the tracked target.
Step 13. Extract the image in the search rectangle.
Extend the target rectangle to obtain the search rectangle, and extract the target image in the search rectangle from the image to be detected.
Step 14. Extract FHOG and color features in parallel.
(14a) extract FHOG features in parallel;
(14b) extract color-naming base-color features in parallel;
(14c) extract HS hue-saturation features in parallel;
(14d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering.
Step 15. Compute the autocorrelation matrix.
Call the GPU library Fourier transform function cufftExecC2C to transform the extracted image feature x_f into the Fourier domain and compute the autocorrelation matrix k^xx'.
Step 16. Update the tracking-model parameters.
First step: compute the tracking-model parameter α' obtained by training on the current frame according to
α' = y' / (k^xx' + λ),
where y' denotes the DFT of the regression values, k^xx' denotes the DFT of the autocorrelation matrix and λ denotes the regularization parameter.
Second step: update the tracking-model parameters α_model' and x_f_model' according to
α_model' = (1 − β1) * α_model' + β1 * α',
x_f_model' = (1 − β1) * x_f_model' + β1 * x_f',
where β1 is a weight coefficient; when adaptive scale transformation is not used, β1 = 0.02; when the adaptive scale transformation function is used, β1 = 0.01.
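To illustrate steps 15 and 16, here is a hedged CUDA sketch of the per-element model update in the Fourier domain: the new α' is y' / (k^xx' + λ) computed with complex division, and the stored model is blended with weight β1. The cuComplex helpers are the standard CUDA ones, while the kernel itself and its buffer names are only an interpretation of the formulas above, not code from the patent.

```cpp
// Illustrative per-element update for steps 15-16 (assumed buffer names).
#include <cuComplex.h>
#include <cuda_runtime.h>

__global__ void updateModelKernel(const cuComplex* y_hat,     // DFT of regression values
                                  const cuComplex* kxx_hat,   // DFT of autocorrelation
                                  const cuComplex* xf,        // current feature, Fourier domain
                                  cuComplex* alpha_model,     // running model alpha
                                  cuComplex* xf_model,        // running model features
                                  float lambda, float beta1, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // alpha' = y' / (kxx' + lambda)
    cuComplex denom = make_cuComplex(cuCrealf(kxx_hat[i]) + lambda, cuCimagf(kxx_hat[i]));
    cuComplex alpha = cuCdivf(y_hat[i], denom);

    // Linear interpolation with weight beta1 (0.02 or 0.01 in the patent).
    alpha_model[i] = make_cuComplex(
        (1.0f - beta1) * cuCrealf(alpha_model[i]) + beta1 * cuCrealf(alpha),
        (1.0f - beta1) * cuCimagf(alpha_model[i]) + beta1 * cuCimagf(alpha));
    xf_model[i] = make_cuComplex(
        (1.0f - beta1) * cuCrealf(xf_model[i]) + beta1 * cuCrealf(xf[i]),
        (1.0f - beta1) * cuCimagf(xf_model[i]) + beta1 * cuCimagf(xf[i]));
}
```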
Step 17. Judge whether the images of all frames have been loaded; if so, go to step 18; otherwise, go to step 1.
Step 18. End tracking.
Those skilled in the art can make various corresponding changes and modifications according to the technical solution and concept described above, and all such changes and modifications shall fall within the scope of protection of the claims of the present invention.

Claims (7)

1. A target tracking method based on FHOG and color features with GPU acceleration, characterized by comprising the following steps:
(1) acquire an image and transfer it to the GPU:
(1a) load one frame of the image sequence to be tracked into host memory;
(1b) copy the image loaded into host memory into GPU memory;
(2) judge whether the acquired image is the first frame of the image sequence to be tracked; if so, go to step (12); otherwise, go to step (3);
(3) judge whether adaptive scale transformation is used; if so, go to step (4); otherwise, go to step (8);
(4) multi-scale image extraction:
(4a) extract candidate-region images at 7 scale levels;
(4b) scale the images to a specified size to obtain the search rectangle images;
(5) extract FHOG and color features in parallel:
(5a) extract FHOG features in parallel;
(5b) extract color-naming base-color features in parallel;
(5c) extract HS hue-saturation features in parallel;
(5d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering;
(6) compute the cross-correlation matrix and the maximum response;
(7) compute the maximum response over all scales:
(7a) take the maximum of the responses f_zmax of all 7 scales as the final maximum response f_max;
(7b) go to step (11);
(8) extract the image in the search rectangle:
extend the target rectangle to obtain the search rectangle, and extract the target image in the search rectangle from the image to be detected;
(9) extract FHOG and color features in parallel:
(9a) extract FHOG features in parallel;
(9b) extract color-naming base-color features in parallel;
(9c) extract HS hue-saturation features in parallel;
(9d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering;
(10) compute the cross-correlation matrix and the maximum response;
(11) update the target rectangle:
(11a) update the target rectangle of the tracked target with the coordinates corresponding to the maximum response;
(11b) go to step (13);
(12) initialize the target rectangle:
select from the input image a rectangle containing the tracked target, and take the selected rectangle as the target rectangle of the tracked target;
(13) extract the image in the search rectangle:
extend the target rectangle to obtain the search rectangle, and extract the target image in the search rectangle from the image to be detected;
(14) extract FHOG and color features in parallel:
(14a) extract FHOG features in parallel;
(14b) extract color-naming base-color features in parallel;
(14c) extract HS hue-saturation features in parallel;
(14d) concatenate the three kinds of features into a 44-dimensional feature and apply Hanning-window filtering;
(15) compute the autocorrelation matrix;
(16) update the tracking-model parameters;
(17) judge whether the images of all frames have been loaded; if so, go to step (18); otherwise, go to step (1);
(18) end tracking.
2. The target tracking method based on FHOG and color features with GPU acceleration according to claim 1, characterized in that the scale coefficients of the 7 scale levels in step (4a) form an arithmetic progression with a common difference of 0.006: 0.982, 0.988, 0.994, 1.000, 1.006, 1.012, 1.018.
3. The target tracking method based on FHOG and color features with GPU acceleration according to claim 1, characterized in that the FHOG and color features in step (5) are the 31-dimensional FHOG feature, the 11-dimensional color-naming base-color feature and the 2-dimensional HS hue-saturation feature, concatenated into a 44-dimensional feature.
4. The target tracking method based on FHOG and color features with GPU acceleration according to claim 1, characterized in that the parallel extraction of FHOG features in step (5a) specifically comprises the following steps:
first step: convert the image into a gray-level image;
second step: compute gradients and accumulate the gradient histogram to obtain the 18-dimensional orientation-sensitive feature;
third step: normalize the features;
(a) compute the gradient energies: the 9-dimensional orientation-insensitive feature vector D(i,j) is obtained from the 18-dimensional orientation-sensitive feature vector as
Dk(i,j) = Ck(i,j) + Ck+9(i,j),
where Dk(i,j) is the k-th component of D(i,j), Ck(i,j) is the k-th component of the 18-dimensional orientation-sensitive feature vector C(i,j), and k = 0, 1, 2, ..., 8; the gradient energies are computed as
Nδ,γ(i,j) = sqrt( ||D(i,j)||^2 + ||D(i+δ,j)||^2 + ||D(i,j+γ)||^2 + ||D(i+δ,j+γ)||^2 + ε ),
where Nδ,γ(i,j) is the gradient energy and δ,γ ∈ {-1,1}, so each (i,j) corresponds to four gradient energies, N-1,-1(i,j), N+1,-1(i,j), N+1,+1(i,j) and N-1,+1(i,j), and ε is a very small number;
using GPU kernel fusion, the computation of the 9-dimensional orientation-insensitive feature vector Dk(i,j) and of the gradient energy Nδ,γ(i,j) is placed in the same GPU kernel, i.e. the energy is computed directly from C with the fused formula
Nδ,γ(i,j) = sqrt( Σk (Ck(i,j)+Ck+9(i,j))^2 + Σk (Ck(i+δ,j)+Ck+9(i+δ,j))^2 + Σk (Ck(i,j+γ)+Ck+9(i,j+γ))^2 + Σk (Ck(i+δ,j+γ)+Ck+9(i+δ,j+γ))^2 + ε ),
where each sum runs over k = 0, 1, 2, ..., 8;
(b) compute the normalized 18-dimensional orientation-sensitive feature vector R1 according to
R1k(i,j) = Σδ,γ∈{-1,1} Tα( Ck(i,j) / Nδ,γ(i,j) ),
where C(i,j) is the 18-dimensional orientation-sensitive feature vector, Tα is the truncation function, which assigns 0.2 to any value greater than 0.2, Nδ,γ(i,j) is the gradient energy, with δ,γ ∈ {-1,1}, and ε is a very small number;
(c) using GPU loop fusion, in the same loop that computes R1, compute the normalized 9-dimensional orientation-insensitive feature vector R2 with the expression
R2k(i,j) = Tα( R1k(i,j) + R1k+9(i,j) ),
where R1k(i,j) denotes the k-th component of R1(i,j), k = 0, 1, 2, ..., 8;
(d) in the same loop, while computing R1, accumulate the 18 components R1k(i,j), k = 0, 1, 2, ..., 17, to obtain the 4-dimensional texture feature vector R3;
fourth step: concatenate the normalized orientation-sensitive feature R1, the orientation-insensitive feature R2 and the texture feature vector R3 to obtain the 31-dimensional FHOG feature;
in the above processing steps, the parallelization scheme is as follows: each thread block extracts features in parallel using 16 × 16 threads; if the data block size is w × h, there are m × n thread blocks in total, where m = ⌈w/16⌉ and n = ⌈h/16⌉.
5. The target tracking method based on FHOG and color features with GPU acceleration according to claim 1, characterized in that the parallel extraction of the hue-saturation features in step (5c) specifically comprises the following steps:
first step: normalize the image;
second step: convert the image from RGB space to HSI space according to
θ = arccos( 0.5*((R-G)+(R-B)) / sqrt((R-G)^2 + (R-B)*(G-B)) ), H = θ if B ≤ G, otherwise H = 2π − θ, S = 1 − 3*min(R,G,B)/(R+G+B),
and concatenate the hue H and saturation S features to obtain the 2-dimensional hue-saturation feature;
a lookup table is used when computing the inverse cosine function; when computing the denominator sqrt((R-G)^2 + (R-B)*(G-B)), if the result is 0 it must be corrected to 1e-10f;
in the above processing steps, the parallelization scheme is as follows: each thread block extracts features in parallel using 16 × 16 threads; if the data block size is w × h, there are m × n thread blocks in total, where m = ⌈w/16⌉ and n = ⌈h/16⌉.
6. The target tracking method based on FHOG and color features with GPU acceleration according to claim 1, characterized in that computing the cross-correlation matrix and the maximum response in step (6) specifically comprises the following steps:
first step: call the GPU library Fourier transform function cufftExecC2C to transform the extracted image feature z_f into the Fourier domain;
second step: compute the cross-correlation matrix k^xz' with a Gaussian kernel:
k^xz' = exp( -(1/σ^2) * ( ||x||^2 + ||z||^2 − 2*F^{-1}( Σc conj(x_f,c) ⊙ z_f,c ) ) ),
where σ is the Gaussian kernel bandwidth, ⊙ denotes element-wise multiplication, the sum runs over the feature channels c, conj(·) is the complex conjugate and F^{-1} is the inverse Fourier transform; when computing the squared L2 norms ||x||^2 and ||z||^2 of x and z, shared memory is used and the summation is accelerated with a GPU reduction algorithm based on an alternating strategy;
third step: compute the response from the cross-correlation matrix k^xz' and the model parameter α'; call the GPU library Fourier transform function cufftExecC2C to transform the response back to the spatial domain and compute the maximum response f_zmax; when computing the maximum, shared memory is used and the maximum search is accelerated with a GPU reduction algorithm based on an alternating strategy.
7. The target tracking method based on FHOG and color features with GPU acceleration according to claim 1, characterized in that updating the tracking-model parameters in step (16) specifically comprises the following steps:
first step: compute the tracking-model parameter obtained by training on the current frame according to
α' = y' / (k^xx' + λ),
where y' denotes the DFT of the regression values, k^xx' denotes the DFT of the autocorrelation matrix and λ denotes the regularization parameter;
second step: update the tracking-model parameters α_model' and x_f_model' according to
α_model' = (1 − β1) * α_model' + β1 * α',
x_f_model' = (1 − β1) * x_f_model' + β1 * x_f',
where β1 is a weight coefficient; when adaptive scale transformation is not used, β1 = 0.02; when the adaptive scale transformation function is used, β1 = 0.01.
CN201710216523.3A 2017-04-05 2017-04-05 Target tracking method based on FHOG and color characteristics and GPU acceleration Active CN106991689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710216523.3A CN106991689B (en) 2017-04-05 2017-04-05 Target tracking method based on FHOG and color characteristics and GPU acceleration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710216523.3A CN106991689B (en) 2017-04-05 2017-04-05 Target tracking method based on FHOG and color characteristics and GPU acceleration

Publications (2)

Publication Number Publication Date
CN106991689A true CN106991689A (en) 2017-07-28
CN106991689B CN106991689B (en) 2019-12-31

Family

ID=59416287

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710216523.3A Active CN106991689B (en) 2017-04-05 2017-04-05 Target tracking method based on FHOG and color characteristics and GPU acceleration

Country Status (1)

Country Link
CN (1) CN106991689B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107491786A (en) * 2017-08-15 2017-12-19 电子科技大学 A kind of tobacco purchase repeats weigh behavior Automatic Visual Inspection and recognition methods
CN107918765A (en) * 2017-11-17 2018-04-17 中国矿业大学 A kind of Moving target detection and tracing system and its method
CN107977980A (en) * 2017-12-06 2018-05-01 北京飞搜科技有限公司 A kind of method for tracking target, equipment and computer-readable recording medium
CN108121945A (en) * 2017-11-14 2018-06-05 深圳市深网视界科技有限公司 A kind of multi-target detection tracking, electronic equipment and storage medium
CN108198192A (en) * 2018-01-15 2018-06-22 任俊芬 A kind of quick human body segmentation's method of high-precision based on deep learning
CN109034193A (en) * 2018-06-20 2018-12-18 上海理工大学 Multiple features fusion and dimension self-adaption nuclear phase close filter tracking method
CN109461170A (en) * 2018-09-20 2019-03-12 西安电子科技大学 Ultrahigh speed method for tracking target, computer vision system based on FPGA
CN110895820A (en) * 2019-03-14 2020-03-20 河南理工大学 KCF-based scale self-adaptive target tracking method
CN110895701A (en) * 2019-06-12 2020-03-20 河南理工大学 Forest fire online identification method and device based on CN and FHOG
CN111145217A (en) * 2019-12-27 2020-05-12 湖南华诺星空电子技术有限公司 KCF-based unmanned aerial vehicle tracking method
CN111862160A (en) * 2020-07-23 2020-10-30 中国兵器装备集团自动化研究所 Target tracking method, medium and system based on ARM platform
CN112396065A (en) * 2020-10-19 2021-02-23 北京理工大学 Scale-adaptive target tracking method and system based on correlation filtering

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020986A (en) * 2012-11-26 2013-04-03 哈尔滨工程大学 Method for tracking moving object
CN104200216A (en) * 2014-09-02 2014-12-10 武汉大学 High-speed moving target tracking algorithm for multi-feature extraction and step-wise refinement
CN104680558A (en) * 2015-03-14 2015-06-03 西安电子科技大学 Struck target tracking method using GPU hardware for acceleration

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020986A (en) * 2012-11-26 2013-04-03 哈尔滨工程大学 Method for tracking moving object
CN104200216A (en) * 2014-09-02 2014-12-10 武汉大学 High-speed moving target tracking algorithm for multi-feature extraction and step-wise refinement
CN104680558A (en) * 2015-03-14 2015-06-03 西安电子科技大学 Struck target tracking method using GPU hardware for acceleration

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JOAO F. HENRIQUES ET AL: "High-Speed Tracking with Kernelized Correlation Filters", IEEE Transactions on Pattern Analysis and Machine Intelligence *
JUN ZHANG ET AL: 2012 International Conference on Control Engineering and Communication Technology, 17 January 2013 *
SONG CHANGHE ET AL: "Multi-feature fusion target tracking method based on distribution fields", Journal of Xidian University (Natural Science Edition) *
HU YAN: "Research on fast target tracking algorithms based on GPU parallel computing", China Master's Theses Full-text Database, Information Science and Technology *
YUAN GUOWU ET AL: "A moving target tracking algorithm combining texture and color", Computer Applications and Software *
ZHAO SONG ET AL: "GPU parallel implementation of a multi-feature fusion particle filter target tracking algorithm", Microelectronics & Computer *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107491786B (en) * 2017-08-15 2020-10-20 电子科技大学 Automatic visual detection and identification method for repeated weighing behaviors of tobacco purchase
CN107491786A (en) * 2017-08-15 2017-12-19 电子科技大学 A kind of tobacco purchase repeats weigh behavior Automatic Visual Inspection and recognition methods
CN108121945A (en) * 2017-11-14 2018-06-05 深圳市深网视界科技有限公司 A kind of multi-target detection tracking, electronic equipment and storage medium
CN107918765A (en) * 2017-11-17 2018-04-17 中国矿业大学 A kind of Moving target detection and tracing system and its method
CN107977980A (en) * 2017-12-06 2018-05-01 北京飞搜科技有限公司 A kind of method for tracking target, equipment and computer-readable recording medium
CN107977980B (en) * 2017-12-06 2021-01-05 北京飞搜科技有限公司 Target tracking method, device and readable medium
CN108198192A (en) * 2018-01-15 2018-06-22 任俊芬 A kind of quick human body segmentation's method of high-precision based on deep learning
CN109034193A (en) * 2018-06-20 2018-12-18 上海理工大学 Multiple features fusion and dimension self-adaption nuclear phase close filter tracking method
CN109461170A (en) * 2018-09-20 2019-03-12 西安电子科技大学 Ultrahigh speed method for tracking target, computer vision system based on FPGA
CN109461170B (en) * 2018-09-20 2021-11-16 西安电子科技大学 Ultra-high-speed target tracking method based on FPGA and computer vision system
CN110895820A (en) * 2019-03-14 2020-03-20 河南理工大学 KCF-based scale self-adaptive target tracking method
CN110895820B (en) * 2019-03-14 2023-03-24 河南理工大学 KCF-based scale self-adaptive target tracking method
CN110895701A (en) * 2019-06-12 2020-03-20 河南理工大学 Forest fire online identification method and device based on CN and FHOG
CN110895701B (en) * 2019-06-12 2023-03-24 河南理工大学 Forest fire online identification method and device based on CN and FHOG
CN111145217A (en) * 2019-12-27 2020-05-12 湖南华诺星空电子技术有限公司 KCF-based unmanned aerial vehicle tracking method
CN111862160A (en) * 2020-07-23 2020-10-30 中国兵器装备集团自动化研究所 Target tracking method, medium and system based on ARM platform
CN111862160B (en) * 2020-07-23 2023-10-13 中国兵器装备集团自动化研究所有限公司 Target tracking method, medium and system based on ARM platform
CN112396065A (en) * 2020-10-19 2021-02-23 北京理工大学 Scale-adaptive target tracking method and system based on correlation filtering

Also Published As

Publication number Publication date
CN106991689B (en) 2019-12-31

Similar Documents

Publication Publication Date Title
CN106991689A (en) Method for tracking target and GPU based on FHOG and color characteristic accelerate
CN104573731B (en) Fast target detection method based on convolutional neural networks
Luo et al. Non-local deep features for salient object detection
Shen et al. Detection of stored-grain insects using deep learning
CN108038476B (en) A kind of facial expression recognition feature extracting method based on edge detection and SIFT
Flores et al. Application of convolutional neural networks for static hand gestures recognition under different invariant features
CN104835175B (en) Object detection method in a kind of nuclear environment of view-based access control model attention mechanism
CN104809731B (en) A kind of rotation Scale invariant scene matching method based on gradient binaryzation
CN110232387B (en) Different-source image matching method based on KAZE-HOG algorithm
CN111753828A (en) Natural scene horizontal character detection method based on deep convolutional neural network
CN110569782A (en) Target detection method based on deep learning
CN110929748A (en) Motion blur image feature matching method based on deep learning
CN110334762A (en) A kind of feature matching method combining ORB and SIFT based on quaternary tree
CN104123554A (en) SIFT image characteristic extraction method based on MMTD
CN107862680A (en) A kind of target following optimization method based on correlation filter
CN113011253B (en) Facial expression recognition method, device, equipment and storage medium based on ResNeXt network
CN110827312A (en) Learning method based on cooperative visual attention neural network
CN109858494A (en) Conspicuousness object detection method and device in a kind of soft image
CN115082551A (en) Multi-target detection method based on unmanned aerial vehicle aerial video
CN112801092B (en) Method for detecting character elements in natural scene image
US20230386023A1 (en) Method for detecting medical images, electronic device, and storage medium
CN112348767A (en) Wood counting model based on object edge detection and feature matching
CN106846366B (en) TLD video moving object tracking method using GPU hardware
CN106033550B (en) Method for tracking target and device
Estrada et al. Appearance-based keypoint clustering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant