CN106991689B - Target tracking method based on FHOG and color characteristics and GPU acceleration - Google Patents


Info

Publication number
CN106991689B
CN106991689B (application CN201710216523.3A)
Authority
CN
China
Prior art date
Legal status
Active
Application number
CN201710216523.3A
Other languages
Chinese (zh)
Other versions
CN106991689A (en)
Inventor
Li Yunsong
Liu Jinhua
Current Assignee
Xian University of Electronic Science and Technology
Original Assignee
Xian University of Electronic Science and Technology
Priority date
Filing date
Publication date
Application filed by Xian University of Electronic Science and Technology
Priority to CN201710216523.3A
Publication of CN106991689A
Application granted
Publication of CN106991689B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/28 Indexing scheme for image data processing or generation, in general, involving image processing hardware
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Abstract

The invention discloses a target tracking method based on FHOG and color features with GPU acceleration, which achieves high-speed, accurate tracking of a target in a video. The method extracts the combined FHOG, color-naming basic color and chroma-saturation features, improving target tracking accuracy; it uses a 7-level adaptive scale transformation with a common difference of 0.006, improving tracking accuracy in scenes where the target changes scale; and it uses a graphics processing unit (GPU) to accelerate the improved KCF target tracking algorithm in parallel, greatly increasing tracking speed.

Description

Target tracking method based on FHOG and color characteristics and GPU acceleration
Technical Field
The invention belongs to the technical field of computers, and further relates to a target tracking method based on FHOG and color features with GPU acceleration in the technical field of computer video target tracking, which is mainly applied to real-time, accurate tracking of video targets.
Background
A high-performance target tracking method is a core technology in the field of computer vision. Current target tracking methods fall into two categories. The first is tracking based on feature matching: features that can represent the target are constructed, and the target position is determined from the degree of matching among the features. The second is tracking based on separating the target from the background: a machine learning method, usually trained online, learns a classifier that separates the target from the background, and the target position is determined by the learned classifier. The former is computationally simple, but it does not handle variations in illumination, occlusion, scale and similar factors. The latter solves these problems to a certain extent and is more robust, but its computational complexity is higher. Tracking methods based on separating the target from the background are currently the mainstream.
A GPU hardware-accelerated Struck target tracking method is disclosed in the patent application "Target tracking method of Struck accelerated by using GPU" (application date: March 14, 2015; application number: 201510112791.1; publication number: CN 104680558 A) filed by Xian University of Electronic Science and Technology. The method uses a structured support vector machine (SVM) to learn a classifier that distinguishes the target from the background, determines the target position with the learned classifier, and uses a graphics processing unit (GPU) for parallel computation to improve the tracking speed. However, the method has the defect that, because it adopts the structured SVM model, its tracking accuracy is not high and its tracking speed remains relatively slow.
In the patent application "A real-time video tracking method" (application date: May 13, 2016; application number: 201610314297.8; publication number: CN 106023248 A) filed by Shanghai Baohong Software Co., Ltd., the image features are compressed by dividing the tracking target into sub-blocks, and the correlation between feature vectors is calculated with the kernelized correlation filter (KCF) algorithm to achieve video tracking. The method has high performance and meets real-time requirements in common scenes. However, it still has defects: it combines only gray-level histogram and chrominance histogram features, so its tracking accuracy is not high; and it extracts features, trains the model and detects the target serially, so its processing speed is low.
In 2014, Henriques, J.F., Caseiro, R., Martins, P., and Batista, J. proposed the KCF algorithm in the paper "High-Speed Tracking with Kernelized Correlation Filters" (IEEE Transactions on Pattern Analysis and Machine Intelligence). The algorithm extracts FHOG features, constructs the classifier's training samples by cyclic shifts, and uses the properties of circulant matrices to move the solution of the problem into the Fourier domain, reducing the algorithm's complexity and accelerating tracking to a certain extent. However, the algorithm still has defects: training and detection are computed serially, so the tracking speed is not high enough; only FHOG features are used, so the tracking accuracy is not high; and it cannot adapt to target scale changes.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a target tracking method based on FHOG and color features with GPU acceleration, which can achieve high-speed, accurate tracking of a target in a video.
According to the invention, the combined FHOG, color-naming basic color and chroma-saturation features are extracted, improving target tracking accuracy; a 7-level adaptive scale transformation with a common difference of 0.006 is used, improving tracking accuracy in scenes where the target changes scale; and a graphics processing unit (GPU) is used to accelerate the improved KCF target tracking algorithm in parallel, greatly increasing the tracking speed.
To achieve the above object, the steps of the present invention basically include the following:
1. A target tracking method based on FHOG and color characteristics and GPU acceleration comprises the following steps:
(1) acquiring an image, and transmitting the image into a GPU:
(1a) loading a frame of image in an image sequence to be tracked into a memory of a computer host;
(1b) copying the image loaded into the memory of the computer into the memory of the GPU;
(2) judging whether the obtained image is the 1 st frame image in the image sequence to be tracked; if so, executing the step (12), otherwise, executing the step (3);
(3) judging whether the self-adaptive scale transformation is used, if so, executing the step (4), otherwise, executing the step (8);
(4) multi-scale image extraction:
(4a) extracting candidate region images using a 7-level scale;
(4b) zooming the image to a specific size to obtain a search rectangular frame image;
(5) extracting FHOG and color characteristics in parallel;
(5a) extracting FHOG characteristics in parallel;
(5b) extracting color-naming basic color characteristics in parallel;
(5c) extracting HS chroma-saturation characteristics in parallel;
(5d) connecting the three characteristics in series to obtain a 44-dimensional new characteristic, and adding a Hanning window for filtering;
(6) calculating a cross-correlation matrix and a maximum response value;
(7) calculate the maximum response in all scales:
(7a) taking the maximum among the response values f_zmax of all 7 scales as the final maximum response value f_max;
(7b) Executing the step (11);
(8) extracting an image at a search rectangular frame;
expanding the target rectangular frame to obtain a search rectangular frame, and extracting a target image at the search rectangular frame from the image to be detected;
(9) extracting FHOG and color characteristics in parallel;
(9a) extracting FHOG characteristics in parallel;
(9b) extracting color-naming basic color characteristics in parallel;
(9c) extracting HS chroma-saturation characteristics in parallel;
(9d) connecting the three characteristics in series to obtain a 44-dimensional new characteristic, and adding a Hanning window for filtering;
(10) Calculating a cross-correlation matrix and a maximum response value;
(11) updating the target rectangular frame;
(11a) and updating the target rectangular frame of the tracking target by using the coordinate corresponding to the maximum response value.
(11b) Executing the step (13);
(12) initializing a target rectangular frame;
selecting a rectangular frame containing a tracking target from an input image, and taking the selected rectangular frame as a target rectangular frame of the tracking target;
(13) extracting an image at a search rectangular frame;
expanding the target rectangular frame to obtain a search rectangular frame, and extracting a target image at the search rectangular frame from the image to be detected;
(14) extracting FHOG and color characteristics in parallel;
(14a) extracting FHOG characteristics in parallel;
(14b) extracting color-naming basic color characteristics in parallel;
(14c) extracting HS chroma-saturation characteristics in parallel;
(14d) connecting the three characteristics in series to obtain a 44-dimensional new characteristic, and adding a Hanning window for filtering;
(15) calculating an autocorrelation matrix;
(16) updating tracking model parameters;
(17) judging whether the images of all frames are loaded or not;
judging whether all the images are loaded or not; if so, executing the step (18), otherwise, executing the step (1);
(18) and ending the tracking.
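The control flow of steps (1) to (18) can be sketched as follows. This is a minimal, CPU-only illustration under stated assumptions: every helper function is a trivial stand-in for the corresponding GPU stage, so only the branching structure mirrors the method, not the patent's implementation.

```python
# Minimal CPU-only sketch of the control flow in steps (1)-(18).  Every
# helper below is a trivial stand-in for the corresponding GPU stage
# (detection, feature extraction, model update); only the branching
# structure mirrors the method described above.
def detect(frame, box, model, scales):       # stands in for steps (4)-(11)
    return box                               # a real tracker would relocate the box

def extract_features(frame, box):            # stands in for FHOG + color features
    return box

def update_model(model, feat):               # stands in for steps (15)-(16)
    return feat

def track(frames, init_box, use_adaptive_scale=True):
    model, box, results = None, None, []
    for i, frame in enumerate(frames):       # step (1): load frame; step (17): loop
        if i == 0:                           # step (2): first frame?
            box = init_box                   # step (12): initialize target box
        elif use_adaptive_scale:             # step (3): adaptive scale?
            box = detect(frame, box, model, scales=7)   # steps (4)-(7), (11)
        else:
            box = detect(frame, box, model, scales=1)   # steps (8)-(11)
        feat = extract_features(frame, box)  # steps (13)-(14)
        model = update_model(model, feat)    # steps (15)-(16)
        results.append(box)
    return results                           # step (18): tracking ends
```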
Compared with the prior art, the invention has the following advantages:
Firstly, the combined FHOG, color-naming basic color and chroma-saturation features of the tracked target image are extracted in steps (5), (9) and (14) and accelerated in parallel on the GPU, which overcomes the low tracking accuracy and slow feature extraction of the prior art and improves both tracking accuracy and tracking speed.
Secondly, the invention uses a 7-level adaptive scale transformation method taking 0.006 as an equal difference value through the steps (4), (5), (6) and (7), and uses GPU for parallel acceleration, thereby improving the tracking accuracy and the tracking speed.
Thirdly, the invention uses a computer graphics processing unit (GPU) to accelerate the improved target tracking algorithm in parallel, so the tracking speed is greatly improved compared with CPU processing.
Drawings
FIG. 1 is a flow chart of the present invention;
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings.
The implementation steps of the present invention will be described in detail with reference to fig. 1.
Step 1, acquiring an image, and transmitting the image into a GPU:
loading a frame of image in an image sequence to be tracked into a memory of a computer host, and then copying the image loaded into the memory of the computer into a GPU memory;
step 2, judging whether the obtained image is the 1 st frame image in the image sequence to be tracked; if so, executing the step (12), otherwise, executing the step (3);
and 3, judging whether the self-adaptive scale transformation is used, if so, executing the step (4), and otherwise, executing the step (8).
Step 4, multi-scale image extraction:
First, candidate region images are extracted using 7 scale levels; the scale coefficients corresponding to the 7 levels form an arithmetic series with a common difference of 0.006, namely 0.982, 0.988, 0.994, 1.000, 1.006, 1.012 and 1.018.
Then, the image is zoomed to a specific size: the width and height of the target rectangular frame are each expanded by 2.5 times to form the zoom search rectangular frame, whose width is 2.5*Lp*Wori and height is 2.5*Lp*Hori, where Lp is the scale coefficient of the p-th level, Wori is the width of the target rectangular frame and Hori is its height. The tracking target image at the zoom search rectangular frame is then extracted from the image frame to be detected.
Finally, the tracking target image is scaled by linear interpolation to the search rectangular frame of width 2.5*Wori and height 2.5*Hori.
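The scale series and search-window sizes of step 4 can be sketched as follows; only the 7 levels, the 0.006 common difference and the 2.5x padding come from the description, and the function names are illustrative.

```python
import numpy as np

# Sketch of step 4: 7 scale coefficients in an arithmetic series with common
# difference 0.006 centred on 1.0, and the 2.5x-padded search-window size for
# each level.  Function names are illustrative, not the patent's.
def scale_levels(n=7, step=0.006):
    offsets = (np.arange(n) - n // 2) * step          # -3..3 times the step
    return 1.0 + offsets   # 0.982, 0.988, 0.994, 1.000, 1.006, 1.012, 1.018

def search_window_sizes(w_ori, h_ori, padding=2.5):
    # candidate search-window size per level: 2.5 * L_p * (W_ori, H_ori)
    return [(padding * s * w_ori, padding * s * h_ori) for s in scale_levels()]
```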
And 5, extracting FHOG and color characteristics in parallel.
The FHOG and color characteristics refer to a 31-dimensional FHOG characteristic, an 11-dimensional color-naming basic color characteristic and a 2-dimensional HS chroma-saturation characteristic, and are connected in series to form a 44-dimensional new characteristic.
(5a) Extracting FHOG characteristics in parallel;
first, an image is converted into a grayscale image.
And secondly, calculating gradient, and counting a gradient histogram to obtain 18-dimensional direction sensitive characteristics.
(1) Gradients are calculated in parallel using the horizontal gradient operator [-1, 0, 1] and the vertical gradient operator [-1, 0, 1]^T, giving the horizontal gradient Gx and the vertical gradient Gy. The gradient magnitude is computed according to the formula Gm = sqrt(Gx^2 + Gy^2); if the magnitude equals 0, it is corrected to 1e-10f, otherwise it is kept unchanged. The gradient direction Go is then obtained with a table lookup method, using the result of Gy/Gx as the index value.
(2) The gradient directions are discretized in parallel. The range 0 to 2π is divided into 18 equal intervals of width π/9, labeled interval 1, interval 2, ..., interval 18, and each gradient direction Go is assigned to the nearest interval.
(3) The gradient direction histograms are computed in parallel. The gradient image edges are expanded, each thread independently computes a gradient histogram, and the histograms are divided into 18 layers by gradient direction, giving (Wb+2)*(Hb+2)*18 gradient histogram values. Cutting the histogram edges yields an 18-interval direction-sensitive feature vector C of total size Wb*Hb*18, where Wb = ⌊W/c⌋ and Hb = ⌊H/c⌋ for image width W, height H and cell size c, and the symbol ⌊·⌋ indicates the rounding-down operation.
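Sub-steps (1) to (3) can be sketched as below. Several details are assumptions rather than the patent's implementation: np.arctan2 replaces the table lookup on Gy/Gx, floor binning replaces nearest-interval assignment, and the cell size of 4 is borrowed from common FHOG implementations.

```python
import numpy as np

# Sketch of sub-steps (1)-(3) of (5a): [-1,0,1] gradients, the zero-magnitude
# floor, direction quantization into 18 bins over [0, 2*pi), and per-cell
# magnitude histograms.  arctan2, floor binning and cell size 4 are
# assumptions standing in for the patent's lookup tables.
def gradient_histograms(gray, cell=4, bins=18):
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]      # horizontal operator [-1, 0, 1]
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]      # vertical operator [-1, 0, 1]^T
    gm = np.sqrt(gx ** 2 + gy ** 2)               # gradient magnitude Gm
    gm[gm == 0] = 1e-10                           # correct zero magnitudes
    go = np.arctan2(gy, gx) % (2 * np.pi)         # gradient direction Go
    bin_idx = (go // (2 * np.pi / bins)).astype(int) % bins
    hb, wb = gray.shape[0] // cell, gray.shape[1] // cell
    hist = np.zeros((hb, wb, bins))
    for i in range(hb * cell):                    # one GPU thread per histogram
        for j in range(wb * cell):                # in the real method
            hist[i // cell, j // cell, bin_idx[i, j]] += gm[i, j]
    return hist                                   # direction-sensitive histograms C
```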
And thirdly, normalizing the characteristics.
(1) The gradient energy is calculated. The calculation formula is:
N_{δ,γ}(i,j) = ||D(i,j)||^2 + ||D(i+δ,j)||^2 + ||D(i,j+γ)||^2 + ||D(i+δ,j+γ)||^2
wherein N_{δ,γ}(i,j) is the gradient energy and δ, γ ∈ {-1, 1}, so each (i,j) has four gradient energies, namely N_{-1,-1}(i,j), N_{+1,-1}(i,j), N_{+1,+1}(i,j) and N_{-1,+1}(i,j). D(i,j) is a 9-dimensional direction-insensitive feature vector obtained from the 18-dimensional direction-sensitive feature vector by:
D_k(i,j) = C_k(i,j) + C_{k+9}(i,j)
wherein D_k(i,j) is the k-th dimension of D(i,j) and C_k(i,j) is the k-th dimension of the 18-dimensional direction-sensitive feature vector C(i,j), k = 0, 1, 2, ..., 8.
Using a GPU kernel-fusion acceleration method, the computation of the 9-dimensional direction-insensitive feature vector D_k(i,j) and the computation of the gradient energy N_{δ,γ}(i,j) are placed in the same GPU kernel, i.e. computed with the fused formula:
N_{δ,γ}(i,j) = Σ_{k=0..8} [ (C_k(i,j) + C_{k+9}(i,j))^2 + (C_k(i+δ,j) + C_{k+9}(i+δ,j))^2 + (C_k(i,j+γ) + C_{k+9}(i,j+γ))^2 + (C_k(i+δ,j+γ) + C_{k+9}(i+δ,j+γ))^2 ]
wherein C_k(i,j) is the k-th dimension of the 18-dimensional direction-sensitive feature vector C(i,j), k = 0, 1, 2, ..., 8.
(2) The normalized 18-dimensional direction-sensitive feature vector R1 is calculated according to the formula:
R1(i,j) = (1/2) * Σ_{δ,γ ∈ {-1,1}} T_α( C(i,j) / sqrt( N_{δ,γ}(i,j) + ε ) )
wherein C(i,j) is the 18-dimensional direction-sensitive feature vector; T_α is a truncation function, which assigns the value 0.2 to any component greater than 0.2; N_{δ,γ}(i,j) is the gradient energy, δ, γ ∈ {-1, 1}; and ε is a very small number.
(3) Using a GPU loop-fusion method, R2 is computed in the same loop that computes R1, using the derived expression:
R2_k(i,j) = T_α( R1_k(i,j) + R1_{k+9}(i,j) )
which gives the normalized 9-dimensional direction-insensitive feature vector R2, where k indexes the k-th dimension of R1(i,j), k = 0, 1, 2, ..., 8.
(4) In the same loop that computes R1, the 18 dimensions R1_k(i,j), k = 0, 1, 2, ..., 17, are also accumulated for each of the four normalizations, yielding a 4-dimensional texture feature vector R3.
And fourthly, connecting the normalized direction sensitive feature R1, the direction insensitive feature R2 and the texture feature vector R3 in series to obtain the 31-dimensional FHOG feature.
In the above processing steps, the parallel method is as follows: features are extracted in parallel using 16 × 16 threads per thread block; assuming the data block size is w × h, there are m × n thread blocks in total, where m and n are calculated as:
m = ⌈w/16⌉, n = ⌈h/16⌉
(⌈·⌉ denotes rounding up).
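The normalization above can be sketched as follows. The layout follows the stated dimensions (18 + 9 + 4 = 31); the 0.5 weighting of the four normalizations and the edge padding are assumptions taken from common FHOG implementations, not stated in the text.

```python
import numpy as np

# Sketch of the normalization in step (5a): from the 18-bin cell histograms
# C, build the insensitive sums D, the four gradient energies N, the
# truncated normalized features R1/R2 and the 4-dim texture vector R3,
# concatenated into 31 dims per cell.  The 0.5 weighting and edge padding
# are assumptions from common FHOG implementations.
def fhog_from_histograms(C, alpha=0.2, eps=1e-8):
    Hb, Wb, _ = C.shape
    D = C[:, :, :9] + C[:, :, 9:]                 # D_k = C_k + C_{k+9}
    E = (D ** 2).sum(axis=2)                      # per-cell gradient energy
    Ep = np.pad(E, 1, mode='edge')                # pad so border cells have neighbors
    feat = np.zeros((Hb, Wb, 31))
    for i in range(Hb):
        for j in range(Wb):
            r1, r3 = np.zeros(18), []
            for di in (-1, 1):
                for dj in (-1, 1):                # the four (delta, gamma) energies
                    n = np.sqrt(Ep[i + 1, j + 1] + Ep[i + 1 + di, j + 1]
                                + Ep[i + 1, j + 1 + dj]
                                + Ep[i + 1 + di, j + 1 + dj] + eps)
                    t = np.minimum(C[i, j] / n, alpha)   # truncation T_alpha
                    r1 += 0.5 * t
                    r3.append(t.sum())            # texture energy per normalization
            r2 = np.minimum(r1[:9] + r1[9:], alpha)      # insensitive R2, truncated
            feat[i, j] = np.concatenate([r1, r2, np.array(r3)])
    return feat                                   # (Hb, Wb, 31) FHOG map
```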
(5b) extracting color-naming basic color characteristics in parallel;
the original image is divided into 16 x 16 size tiles, each tile is computed using one thread block in the GPU, each thread computing the color-naming feature of one pixel. The RGB image was converted into an 11-dimensional color-naming feature using a table lookup method F2.
(5c) Extracting HS chroma-saturation characteristics in parallel;
first, image normalization.
And secondly, the image is converted from RGB space to HSI space according to the following formulas, and the chroma H and saturation S are connected in series to obtain the 2-dimensional chroma-saturation feature:
θ = arccos( ((R - G) + (R - B)) / (2 * sqrt((R - G)^2 + (R - B)(G - B))) )
H = θ, if B ≤ G; H = 2π - θ, otherwise
S = 1 - 3 * min(R, G, B) / (R + G + B)
When the inverse cosine function is calculated, a table lookup method is used; when computing sqrt((R - G)^2 + (R - B)(G - B)), if the result is 0 it is corrected to 1e-10f.
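A minimal sketch of the conversion, using the standard RGB to HSI chroma/saturation formulas with the zero-denominator correction; np.arccos replaces the table-lookup arccos of the text, which is an assumption for simplicity.

```python
import numpy as np

# Sketch of step (5c): standard RGB -> HSI chroma/saturation with the
# zero-denominator correction to 1e-10.  Input: float array in [0, 1].
def hs_features(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b))
    den = np.where(den == 0, 1e-10, den)          # zero-denominator correction
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2 * np.pi - theta)          # chroma H
    ssum = np.where(r + g + b == 0, 1e-10, r + g + b)
    s = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / ssum  # saturation S
    return np.stack([h, s], axis=-1)              # 2-dim chroma-saturation feature
```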
In the above processing steps, the parallel method is as follows: features are extracted in parallel using 16 × 16 threads per thread block; assuming the data block size is w × h, there are m × n thread blocks in total, where m = ⌈w/16⌉ and n = ⌈h/16⌉ (⌈·⌉ denotes rounding up).
(5d) the three features are connected in series to obtain a 44-dimensional new feature, and Hanning window filtering is added.
And 6, calculating a cross-correlation matrix and a maximum response value.
Firstly, the GPU library Fourier transform function cufftExecC2C is called to transform the extracted image feature z to the Fourier domain, obtaining z_f.
Second, the cross-correlation matrix k^{xz'} is calculated using a Gaussian kernel, with the formula:
k^{xz'} = exp( -(1/σ^2) * ( ||x||^2 + ||z||^2 - 2 * F^{-1}( Σ_c conj(x̂_c) ⊙ ẑ_c ) ) )
where σ is the Gaussian kernel bandwidth, x̂ and ẑ denote the Fourier transforms of x and z, F^{-1} denotes the inverse Fourier transform, ⊙ denotes element-wise multiplication, and c indexes the feature channels.
calculating the L2-norm sum of squares | | x | | of x and z2And z non calculation2And meanwhile, the shared memory is utilized, and the GPU is used for accelerating summation operation based on a reduction algorithm of an alternative strategy.
Third, the response values are calculated using the cross-correlation matrix k^{xz'} and the model parameter α'. The GPU library Fourier transform function cufftExecC2C is called to transform the response values back to the spatial domain, and the maximum response value f_zmax is calculated. When calculating the maximum value, the maximum-finding operation is accelerated using shared memory and a GPU reduction algorithm based on an alternating strategy.
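A single-channel NumPy sketch of this step: the Gaussian cross-correlation kernel computed through FFTs, then the response map obtained by correlating it with the learned model. This is a CPU stand-in for the cufftExecC2C and reduction kernels; sigma is an illustrative value.

```python
import numpy as np

# Sketch of step 6: Gaussian cross-correlation kernel k^{xz'} via FFTs, then
# the response map and its maximum.  Single-channel CPU stand-in for the
# cufftExecC2C / shared-memory reduction kernels; sigma is illustrative.
def gaussian_correlation(x, z, sigma=0.5):
    xf, zf = np.fft.fft2(x), np.fft.fft2(z)
    corr = np.real(np.fft.ifft2(np.conj(xf) * zf))     # circular cross-correlation
    d2 = (x ** 2).sum() + (z ** 2).sum() - 2.0 * corr  # ||x||^2 + ||z||^2 - 2<x,z>
    return np.exp(-np.maximum(d2, 0) / (sigma ** 2 * x.size))

def detect(alphaf, x_model, z, sigma=0.5):
    kzf = np.fft.fft2(gaussian_correlation(x_model, z, sigma))
    response = np.real(np.fft.ifft2(alphaf * kzf))     # response map
    loc = np.unravel_index(response.argmax(), response.shape)
    return response[loc], loc                          # f_zmax and its coordinates
```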
Step 7, calculating the maximum response in all scales:
(7a) taking the maximum among the response values f_zmax of all 7 scales as the final maximum response value f_max.
(7b) Step 11 is performed.
And 8, extracting the image at the rectangular frame of the search.
The length and width of the target rectangular frame are each expanded by 2.5 times to obtain the search rectangular frame, and the tracking target image at the search rectangular frame, of width 2.5*Wori and height 2.5*Hori, is extracted from the image to be detected.
And 9, extracting FHOG and color characteristics in parallel.
(9a) Extracting FHOG characteristics in parallel;
(9b) extracting color-naming basic color characteristics in parallel;
(9c) extracting HS chroma-saturation characteristics in parallel;
(9d) The three features are connected in series to obtain a 44-dimensional new feature, and Hanning window filtering is added.
And step 10, calculating a cross-correlation matrix and a maximum response value.
And step 11, updating the target rectangular frame.
(11a) The target rectangular frame of the tracking target is updated using the coordinates of the maximum response point, and its length and width are each expanded by 2.5 times to form the search rectangular frame.
(11b) step 13 is performed.
Step 12, initializing a target rectangular frame.
And selecting a rectangular frame containing the tracking target from the input image, and taking the selected rectangular frame as a target rectangular frame of the tracking target.
And step 13, extracting the image at the rectangular frame of the search.
And expanding the target rectangular frame to obtain a search rectangular frame, and extracting a target image at the search rectangular frame from the image to be detected.
And step 14, extracting FHOG and color characteristics in parallel.
(14a) Extracting FHOG characteristics in parallel;
(14b) extracting color-naming basic color characteristics in parallel;
(14c) extracting HS chroma-saturation characteristics in parallel;
(14d) the three features are connected in series to obtain a 44-dimensional new feature, and Hanning window filtering is added.
Step 15, calculating an autocorrelation matrix.
The GPU library Fourier transform function cufftExecC2C is called to transform the extracted image feature x to the Fourier domain, obtaining x_f, and the autocorrelation matrix k^{xx'} is calculated.
And step 16, updating the tracking model parameters.
Firstly, the tracking model parameter α' obtained by training on the current frame is calculated according to the following formula:
α̂' = ŷ / ( k̂^{xx'} + λ )
where ŷ denotes the DFT of the regression values, k̂^{xx'} denotes the DFT of the autocorrelation matrix, and λ denotes the regularization parameter.
Second, the tracking model parameter α_model' is updated according to the following formula:
α_model' = (1 - β1) * α_model' + β1 * α'
wherein β1 is a weighting coefficient; β1 = 0.02 when adaptive scale transformation is not used, and β1 = 0.01 when it is used.
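Steps 15 and 16 can be sketched as follows: Fourier-domain ridge-regression training of α' followed by the β1-weighted linear interpolation update. The kernel below is a trivial stand-in for the Gaussian autocorrelation of step 6, and lam is an illustrative value.

```python
import numpy as np

# Sketch of steps 15-16: train alpha'_hat = y_hat / (k_hat^{xx'} + lambda)
# in the Fourier domain, then blend it into the model with weight beta1
# (0.02 without adaptive scaling, 0.01 with it, per the text).  toy_kernel
# and lam are illustrative stand-ins.
def toy_kernel(x, z):
    return np.exp(-np.abs(x - z))                 # stand-in for k^{xx'}

def train_alpha(x, y, kernel, lam=1e-4):
    kf = np.fft.fft2(kernel(x, x))                # DFT of the autocorrelation
    return np.fft.fft2(y) / (kf + lam)            # alpha'_hat in the Fourier domain

def update_model(alpha_model, alpha_new, beta1=0.02):
    if alpha_model is None:                       # first frame: adopt directly
        return alpha_new
    return (1 - beta1) * alpha_model + beta1 * alpha_new
```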
Step 17, judging whether the loading of the images of all the frames is finished;
judging whether all the images are loaded or not; if yes, executing step 18, otherwise, executing step 1;
and step 18, ending the tracking.
Various other changes and modifications to the above-described embodiments and concepts will become apparent to those skilled in the art from the above description, and all such changes and modifications are intended to be included within the scope of the present invention as defined in the appended claims.

Claims (7)

1. A target tracking and GPU acceleration method based on FHOG and color features is characterized by comprising the following steps:
(1) acquiring an image, and transmitting the image into a GPU:
(1a) loading a frame of image in an image sequence to be tracked into a memory of a computer host;
(1b) copying the image loaded into the memory of the computer into the memory of the GPU;
(2) judging whether the obtained image is the 1 st frame image in the image sequence to be tracked; if so, executing the step (12), otherwise, executing the step (3);
(3) judging whether the self-adaptive scale transformation is used, if so, executing the step (4), otherwise, executing the step (8);
(4) multi-scale image extraction:
(4a) extracting candidate region images using a 7-level scale;
(4b) zooming the image to a specific size to obtain a search rectangular frame image;
(5) extracting FHOG and color characteristics in parallel;
(5a) extracting FHOG characteristics in parallel;
(5b) extracting color-naming basic color characteristics in parallel;
(5c) extracting HS chroma-saturation characteristics in parallel;
(5d) connecting the three characteristics in series to obtain a 44-dimensional new characteristic, and adding a Hanning window for filtering;
(6) calculating a cross-correlation matrix and a maximum response value;
(7) calculate the maximum response in all scales:
(7a) taking the maximum among the response values f_zmax of all 7 scales as the final maximum response value f_max;
(7b) Executing the step (11);
(8) extracting an image at a search rectangular frame;
expanding the target rectangular frame to obtain a search rectangular frame, and extracting a target image at the search rectangular frame from the image to be detected;
(9) extracting FHOG and color characteristics in parallel;
(9a) extracting FHOG characteristics in parallel;
(9b) extracting color-naming basic color characteristics in parallel;
(9c) extracting HS chroma-saturation characteristics in parallel;
(9d) connecting the three characteristics in series to obtain a 44-dimensional new characteristic, and adding a Hanning window for filtering;
(10) calculating a cross-correlation matrix and a maximum response value;
(11) updating the target rectangular frame;
(11a) updating a target rectangular frame of the tracking target by using the coordinate corresponding to the maximum response value;
(11b) executing the step (13);
(12) initializing a target rectangular frame;
selecting a rectangular frame containing a tracking target from an input image, and taking the selected rectangular frame as a target rectangular frame of the tracking target;
(13) extracting an image at a search rectangular frame;
expanding the target rectangular frame to obtain a search rectangular frame, and extracting a target image at the search rectangular frame from the image to be detected;
(14) extracting FHOG and color characteristics in parallel;
(14a) extracting FHOG characteristics in parallel;
(14b) extracting color-naming basic color characteristics in parallel;
(14c) extracting HS chroma-saturation characteristics in parallel;
(14d) connecting the three characteristics in series to obtain a 44-dimensional new characteristic, and adding a Hanning window for filtering;
(15) calculating an autocorrelation matrix;
(16) updating tracking model parameters;
(17) judging whether the images of all frames are loaded or not;
judging whether all the images are loaded or not; if so, executing the step (18), otherwise, executing the step (1);
(18) and ending the tracking.
2. The FHOG and color feature based target tracking and GPU acceleration method of claim 1, wherein the scale coefficients corresponding to the 7-level scales in step (4a) form an arithmetic series with a common difference of 0.006, namely 0.982, 0.988, 0.994, 1.00, 1.006, 1.012 and 1.018.
3. The FHOG and color feature-based target tracking and GPU acceleration method of claim 1, wherein the FHOG and color feature in step (5) is a 44-dimensional new feature composed of a 31-dimensional FHOG feature, an 11-dimensional color-naming base color feature and a 2-dimensional HS chroma-saturation feature in series.
4. The FHOG and color feature based target tracking and GPU acceleration method of claim 1, wherein the step (5a) of extracting FHOG features in parallel comprises the following specific steps:
firstly, converting an image into a gray image;
secondly, calculating gradient, and counting a gradient histogram to obtain 18-dimensional direction sensitive characteristics;
thirdly, normalizing the characteristics;
(a) calculating gradient energy; the calculation formula is:
N_{δ,γ}(i,j) = ||D(i,j)||^2 + ||D(i+δ,j)||^2 + ||D(i,j+γ)||^2 + ||D(i+δ,j+γ)||^2
wherein N_{δ,γ}(i,j) is the gradient energy and δ, γ ∈ {-1, 1}, so each (i,j) has four gradient energies, namely N_{-1,-1}(i,j), N_{+1,-1}(i,j), N_{+1,+1}(i,j) and N_{-1,+1}(i,j); D(i,j) is a 9-dimensional direction-insensitive feature vector obtained from the 18-dimensional direction-sensitive feature vector by:
D_k(i,j) = C_k(i,j) + C_{k+9}(i,j)
wherein D_k(i,j) is the k-th dimension of D(i,j) and C_k(i,j) is the k-th dimension of the 18-dimensional direction-sensitive feature vector C(i,j), k = 0, 1, 2, ..., 8;
using a GPU kernel-fusion acceleration method, the computation of the 9-dimensional direction-insensitive feature vector D_k(i,j) and the computation of the gradient energy N_{δ,γ}(i,j) are placed in the same GPU kernel, i.e. computed with the fused formula:
N_{δ,γ}(i,j) = Σ_{k=0..8} [ (C_k(i,j) + C_{k+9}(i,j))^2 + (C_k(i+δ,j) + C_{k+9}(i+δ,j))^2 + (C_k(i,j+γ) + C_{k+9}(i,j+γ))^2 + (C_k(i+δ,j+γ) + C_{k+9}(i+δ,j+γ))^2 ]
wherein C_k(i,j) is the k-th dimension of the 18-dimensional direction-sensitive feature vector C(i,j), k = 0, 1, 2, ..., 8;
(b) the normalized 18-dimensional direction-sensitive feature vector R1 is calculated according to the formula:
R1(i,j) = (1/2) * Σ_{δ,γ ∈ {-1,1}} T_α( C(i,j) / sqrt( N_{δ,γ}(i,j) + ε ) )
wherein C(i,j) is the 18-dimensional direction-sensitive feature vector; T_α is a truncation function, which assigns the value 0.2 to any component greater than 0.2; N_{δ,γ}(i,j) is the gradient energy, δ, γ ∈ {-1, 1}; and ε is a very small number;
(c) using a GPU loop-fusion method, R2 is computed in the same loop that computes R1, using the derived expression:
R2_k(i,j) = T_α( R1_k(i,j) + R1_{k+9}(i,j) )
which gives the normalized 9-dimensional direction-insensitive feature vector R2, where k indexes the k-th dimension of R1(i,j), k = 0, 1, 2, ..., 8;
(d) in the same loop that computes R1, the 18 dimensions R1_k(i,j), k = 0, 1, 2, ..., 17, are also accumulated for each of the four normalizations, yielding a 4-dimensional texture feature vector R3;
fourthly, connecting the normalized direction sensitive feature R1, the direction insensitive feature R2 and the texture feature vector R3 in series to obtain a 31-dimensional FHOG feature;
in the above processing steps, the parallel method is as follows: features are extracted in parallel using 16 × 16 threads per thread block; assuming the data block size is w × h, there are m × n thread blocks in total, where m and n are calculated according to the following formula:
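The m, n formula is omitted in this extract; the usual CUDA convention for covering a w × h block with 16 × 16 thread blocks is ceiling division, which is an assumption here, not a quotation of the patent's formula:

```python
import math

def grid_dims(w, h, block=16):
    """Number of thread blocks (m, n) needed to cover a w x h data
    block with block x block threads, assuming ceiling division
    (the patent's exact formula is not shown in this extract)."""
    m = math.ceil(w / block)
    n = math.ceil(h / block)
    return m, n
```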
5. the FHOG and color feature-based target tracking and GPU acceleration method of claim 1, wherein the parallel extraction of chroma-saturation features in step (5c) comprises the following specific steps:
firstly, normalizing an image;
secondly, the image is converted from RGB space to HSI space according to the following formula, and the chroma H and saturation S characteristics are connected in series to obtain the 2-dimensional chroma-saturation feature;
when the inverse cosine function is calculated, a table lookup method is used; in the calculation, if the result is 0, it is corrected to 1e-10f;
in the above processing steps, the parallel method is as follows: features are extracted in parallel using 16 × 16 threads per thread block; assuming the data block size is w × h, there are m × n thread blocks in total, where m and n are calculated according to the following formula:
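The RGB-to-HSI conversion formula is omitted in this extract; a sketch using the standard textbook RGB→HSI equations (which the claim appears to follow, though this is an assumption), with a small epsilon guarding the divisions in the spirit of the zero-correction above:

```python
import numpy as np

def chroma_saturation(rgb):
    """2-dimensional chroma-saturation feature from a normalized RGB
    image (h x w x 3, values in [0, 1]), via the standard RGB->HSI
    formulas. The epsilon guard mirrors the patent's zero-correction;
    the exact constant and hue scaling are illustrative assumptions."""
    eps = 1e-10
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    # hue: theta if B <= G, else 2*pi - theta; scaled to [0, 1]
    h = np.where(b <= g, theta, 2 * np.pi - theta) / (2 * np.pi)
    s = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / (r + g + b + eps)
    return np.stack([h, s], axis=-1)
```

The claim's table-lookup acceleration of arccos is a GPU optimization not reproduced in this sketch.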
6. the FHOG and color feature based target tracking and GPU acceleration method of claim 1, wherein the step (6) of calculating the cross-correlation matrix and the maximum response value comprises the following steps:
firstly, the GPU library Fourier transform function cufftExecC2C is called to transform the extracted image feature z_f into the Fourier domain;
secondly, the cross-correlation matrix k_xz' is calculated using a Gaussian kernel, according to the following formula:
while the squared L2-norms ||x||² of x and ||z||² of z are calculated, the shared memory is utilized and a GPU reduction algorithm based on an interleaved (alternating) strategy is used to accelerate the summation operation;
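A host-side sketch of a tree-style pairwise reduction of the kind the claim accelerates with shared memory (the interleaved-stride interpretation of the "alternating strategy" is an assumption; on the GPU each stride step is one synchronized pass over a thread block):

```python
import numpy as np

def reduce_sum(data):
    """Pairwise tree reduction mimicking a shared-memory GPU sum:
    the stride halves each step and element i accumulates element
    i + stride. Input is zero-padded to a power of two."""
    buf = np.asarray(data, dtype=np.float64)
    n = 1
    while n < buf.size:
        n *= 2
    buf = np.concatenate([buf, np.zeros(n - buf.size)])
    stride = n // 2
    while stride > 0:
        buf[:stride] += buf[stride:2 * stride]
        stride //= 2
    return buf[0]
```

The same reduction shape, with `+=` replaced by a maximum, serves the max-response search in the third step.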
thirdly, the response value is calculated using the cross-correlation matrix k_xz' and the model parameter α'; the GPU library Fourier transform function cufftExecC2C is called to transform the response value to the spatial domain, and the maximum response value f_zmax is calculated; when the maximum value is calculated, the shared memory is utilized and a GPU reduction algorithm based on an interleaved (alternating) strategy is used to accelerate the maximum-finding computation.
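The Gaussian-kernel cross-correlation and response computation follow the KCF formulation (Henriques et al., cited below); a NumPy sketch under that assumption, with σ and the normalization chosen for illustration rather than taken from the patent:

```python
import numpy as np

def gaussian_correlation(xf, zf, sigma=0.5):
    """KCF-style Gaussian kernel cross-correlation in the Fourier
    domain: k_xz = exp(-(||x||^2 + ||z||^2 - 2 F^{-1}(xf . conj(zf)))
    / (sigma^2 N)). xf, zf are 2-D FFTs of the features."""
    n = xf.size
    x_sq = np.sum(np.abs(xf) ** 2) / n   # ||x||^2 via Parseval
    z_sq = np.sum(np.abs(zf) ** 2) / n
    corr = np.real(np.fft.ifft2(xf * np.conj(zf)))
    return np.exp(-np.maximum(x_sq + z_sq - 2 * corr, 0.0)
                  / (sigma ** 2 * n))

def max_response(kxz_f, alpha_f):
    """Response map f_z = F^{-1}(kxz_f . alpha_f); returns its maximum,
    which on the GPU is found with the shared-memory reduction."""
    fz = np.real(np.fft.ifft2(kxz_f * alpha_f))
    return fz.max()
```

When z equals x, the kernel peaks at 1 at zero shift, which is a quick sanity check on the formula.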
7. The FHOG and color feature based target tracking and GPU acceleration method of claim 1, wherein the step (16) of updating the tracking model parameters comprises the following steps:
firstly, the tracking model parameter obtained by training on the current frame is calculated according to the following formula:
wherein y' denotes the DFT of the regression values, k_xx' denotes the DFT of the autocorrelation matrix, and λ denotes the regularization parameter;
secondly, the tracking model parameter α_model' is updated according to the following formula:
α_model' = (1 - β_1) * α_model' + β_1 * α'
wherein β_1 is a weighting coefficient; when the adaptive scaling function is not used, β_1 = 0.02; when the adaptive scaling function is used, β_1 = 0.01.
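The linear-interpolation update above, with the two β_1 values stated in the claim, can be sketched as:

```python
def update_model(alpha_model, alpha_new, adaptive_scale=False):
    """Linear interpolation update of the tracking model parameter:
    alpha_model' = (1 - beta1) * alpha_model' + beta1 * alpha',
    with beta1 = 0.02 without adaptive scaling and 0.01 with it."""
    beta1 = 0.01 if adaptive_scale else 0.02
    return (1 - beta1) * alpha_model + beta1 * alpha_new
```

In practice alpha_model and alpha_new are complex Fourier-domain arrays; scalars are used here only to show the arithmetic.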
CN201710216523.3A 2017-04-05 2017-04-05 Target tracking method based on FHOG and color characteristics and GPU acceleration Active CN106991689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710216523.3A CN106991689B (en) 2017-04-05 2017-04-05 Target tracking method based on FHOG and color characteristics and GPU acceleration

Publications (2)

Publication Number Publication Date
CN106991689A CN106991689A (en) 2017-07-28
CN106991689B true CN106991689B (en) 2019-12-31

Family

ID=59416287

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710216523.3A Active CN106991689B (en) 2017-04-05 2017-04-05 Target tracking method based on FHOG and color characteristics and GPU acceleration

Country Status (1)

Country Link
CN (1) CN106991689B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107491786B (en) * 2017-08-15 2020-10-20 电子科技大学 Automatic visual detection and identification method for repeated weighing behaviors of tobacco purchase
CN108121945A (en) * 2017-11-14 2018-06-05 深圳市深网视界科技有限公司 A kind of multi-target detection tracking, electronic equipment and storage medium
CN107918765A (en) * 2017-11-17 2018-04-17 中国矿业大学 A kind of Moving target detection and tracing system and its method
CN107977980B (en) * 2017-12-06 2021-01-05 北京飞搜科技有限公司 Target tracking method, device and readable medium
CN108198192A (en) * 2018-01-15 2018-06-22 任俊芬 A kind of quick human body segmentation's method of high-precision based on deep learning
CN109034193A (en) * 2018-06-20 2018-12-18 上海理工大学 Multiple features fusion and dimension self-adaption nuclear phase close filter tracking method
CN109461170B (en) * 2018-09-20 2021-11-16 西安电子科技大学 Ultra-high-speed target tracking method based on FPGA and computer vision system
CN110895820B (en) * 2019-03-14 2023-03-24 河南理工大学 KCF-based scale self-adaptive target tracking method
CN110895701B (en) * 2019-06-12 2023-03-24 河南理工大学 Forest fire online identification method and device based on CN and FHOG
CN111145217A (en) * 2019-12-27 2020-05-12 湖南华诺星空电子技术有限公司 KCF-based unmanned aerial vehicle tracking method
CN111862160B (en) * 2020-07-23 2023-10-13 中国兵器装备集团自动化研究所有限公司 Target tracking method, medium and system based on ARM platform
CN112396065A (en) * 2020-10-19 2021-02-23 北京理工大学 Scale-adaptive target tracking method and system based on correlation filtering

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020986A (en) * 2012-11-26 2013-04-03 哈尔滨工程大学 Method for tracking moving object
CN104200216A (en) * 2014-09-02 2014-12-10 武汉大学 High-speed moving target tracking algorithm for multi-feature extraction and step-wise refinement
CN104680558A (en) * 2015-03-14 2015-06-03 西安电子科技大学 Struck target tracking method using GPU hardware for acceleration

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"GPU Parallel Implementation of a Multi-Feature Fusion Particle Filter Target Tracking Algorithm"; Zhao Song et al.; Microelectronics & Computer; Sep. 30, 2015; vol. 32, no. 9; pp. 153-156 *
"High-Speed Tracking with Kernelized Correlation Filters"; Joao F. Henriques et al.; IEEE Transactions on Pattern Analysis and Machine Intelligence; Mar. 31, 2015; vol. 37, no. 3; pp. 583-596 *
"A Moving Target Tracking Algorithm Combining Texture and Color"; Yuan Guowu et al.; Computer Applications and Software; Nov. 30, 2011; vol. 28, no. 11; pp. 81-84 *
"A Multi-Feature Fusion Target Tracking Method Based on Distribution Fields"; Song Changhe et al.; Journal of Xidian University (Natural Science Edition); Aug. 31, 2015; vol. 42, no. 4; pp. 1-7 *
"Research on Fast Target Tracking Algorithms Based on GPU Parallel Computing"; Hu Yan; China Masters' Theses Full-text Database, Information Science and Technology; Oct. 15, 2012; vol. 2012, no. 10; I138-2662 *

Also Published As

Publication number Publication date
CN106991689A (en) 2017-07-28

Similar Documents

Publication Publication Date Title
CN106991689B (en) Target tracking method based on FHOG and color characteristics and GPU acceleration
CN106960451B (en) Method for increasing number of feature points of image weak texture area
WO2020134478A1 (en) Face recognition method, feature extraction model training method and device thereof
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
CN108647694B (en) Context-aware and adaptive response-based related filtering target tracking method
CN111860494B (en) Optimization method and device for image target detection, electronic equipment and storage medium
CN110569782A (en) Target detection method based on deep learning
WO2023082784A1 (en) Person re-identification method and apparatus based on local feature attention
Xin et al. A self-adaptive optical flow method for the moving object detection in the video sequences
CN107862680B (en) Target tracking optimization method based on correlation filter
CN104123554A (en) SIFT image characteristic extraction method based on MMTD
US20190236345A1 (en) Hand detection method and system, image detection method and system, hand segmentation method, storage medium, and device
WO2017193414A1 (en) Image corner detection method based on turning radius
CN106600613B (en) Improvement LBP infrared target detection method based on embedded gpu
WO2017070923A1 (en) Human face recognition method and apparatus
CN107527348B (en) Significance detection method based on multi-scale segmentation
CN108319961B (en) Image ROI rapid detection method based on local feature points
CN110991547A (en) Image significance detection method based on multi-feature optimal fusion
Zhou et al. Scale adaptive kernelized correlation filter tracker with feature fusion
CN103324753A (en) Image retrieval method based on symbiotic sparse histogram
CN111968154A (en) HOG-LBP and KCF fused pedestrian tracking method
CN108241869A (en) A kind of images steganalysis method based on quick deformable model and machine learning
Wo et al. A saliency detection model using aggregation degree of color and texture
Zhou et al. On contrast combinations for visual saliency detection
CN115170854A (en) End-to-end PCANetV 2-based image classification method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant