CN112184762A - Gray wolf optimization particle filter target tracking algorithm based on feature fusion - Google Patents

Info

Publication number
CN112184762A
CN112184762A (application CN202010924590.2A)
Authority
CN
China
Prior art keywords
target
particle
template
weight
particles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010924590.2A
Other languages
Chinese (zh)
Inventor
任红格
史涛
吴启隆
赵坚
杜静娟
戈文琪
梁晨
胡鸿长
王东辉
崔胤
Current Assignee
Tianjin Chengjian University
Original Assignee
Tianjin Chengjian University
Priority date
Filing date
Publication date
Application filed by Tianjin Chengjian University filed Critical Tianjin Chengjian University
Priority to CN202010924590.2A priority Critical patent/CN112184762A/en
Publication of CN112184762A publication Critical patent/CN112184762A/en
Pending legal-status Critical Current

Classifications

    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06N3/006 Artificial life, i.e. computing arrangements simulating life, based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G06T7/269 Analysis of motion using gradient-based methods
    • G06T7/90 Determination of colour characteristics

Abstract

The invention relates to a grey wolf optimized particle filter target tracking algorithm based on feature fusion. Within a particle filter tracking framework, the particles are guided according to the grey wolf position-updating mechanism of the grey wolf algorithm, optimizing the traditional particle filter; a target feature fusion model is established to describe the target template, the particle weights are adaptively adjusted, and the target template is selected by a template updating method. By weighted fusion of the target's HSV color feature and HOG feature, the method improves the algorithm's feature description of the target and avoids the poor adaptability of a single feature to complex environmental conditions. The GWO algorithm optimizes the particle distribution so that each generation of particles approaches the posterior probability of the true system state, preventing the distribution from becoming overly concentrated or dispersed; at the same time, adaptive weight adjustment is applied to the particles before partial resampling, improving particle diversity and solving the particle degeneracy problem of the traditional particle filter.

Description

Gray wolf optimization particle filter target tracking algorithm based on feature fusion
Technical field:
the invention relates to a gray wolf optimized particle filter target tracking algorithm based on feature fusion, and belongs to the technical field of computer vision target tracking.
Technical background:
Target tracking is a challenging research task in the field of computer vision, with wide application value in intelligent video surveillance, city security, human-computer interaction, robot navigation, and other fields. With the development of computer technology, researchers have proposed many excellent tracking methods. However, in practical application scenarios, few algorithms can extract and track targets effectively under complex environmental changes, background clutter, and interference from uncertain factors. It is therefore necessary to devise a highly applicable, long-term stable moving target tracking algorithm.
Target tracking means predicting the position of a target in subsequent frames according to a tracking algorithm, given the target's position in the first frame image. Current tracking methods are mainly divided into generative and discriminative classes. Common tracking algorithms based on filtering theory include the Kalman filter, the extended Kalman filter, and the particle filter. The particle filter is based on the Monte Carlo method and the importance sampling framework, and adapts well to nonlinear, non-Gaussian systems; compared with algorithms such as Kalman filtering and mean shift, it is therefore more widely applied. The traditional particle filter, however, suffers from severe particle degeneracy caused by the resampling process, describes the target with a single feature, and does not update the target template in time, which reduces tracking accuracy and yields poor robustness and stability.
The appearance model of the tracked target consists of a target description and a statistical model. The traditional particle filter tracking algorithm uses a single-feature RGB color histogram as its probability model, so the appearance model's robustness is poor. In practical applications, factors such as background colors close to those of the target, target occlusion, and illumination changes can cause target drift and tracking failure. Therefore, under complex tracking environments, selecting and fusing several representative features to describe the target can effectively improve tracking accuracy and robustness.
Among related patents, the application with publication No. CN109670410A discloses a long-term moving target tracking method based on multi-feature fusion, which trains a position filter with several features and determines the predicted target position by calculating confidence values of the next frame and the current frame. The application with publication No. CN109376369A discloses a particle filtering method with classified evolutionary resampling, in which the high-weight particle population is split, a differential evolution method then optimizes it together with the low-weight population, and a new particle population is generated through mutation, crossover, and selection, ensuring the diversity of the particle sample. The patent with publication No. CN108182447B discloses an adaptive particle filter target tracking method based on deep learning, which trains a shallow deep-learning network model offline with the SGD algorithm, calculates the current target state, judges the degree of state change, and updates the observation model in real time. None of the above patents, however, involves a grey wolf optimized particle filtering method based on feature fusion.
Summary of the invention:
1. In order to optimize the particle filter structure and improve filtering precision with the grey wolf algorithm under complex environmental backgrounds, the invention provides a grey wolf optimized particle filter target tracking algorithm based on feature fusion. It addresses problems such as target pose change, occlusion, and illumination change during tracking, achieving robust and stable tracking of a moving target. Within the particle filter tracking framework, particle movement is guided by the grey wolf position-updating mechanism of the grey wolf algorithm, optimizing the traditional particle filter; a target feature fusion model is established to describe the target template; particle weights are adaptively adjusted to increase particle diversity; and the target template is selected by a template updating method. The algorithm proceeds as follows:
step 1, in an initialization stage, manually selecting a tracking rectangular frame on an initial frame image to determine a tracked moving target, establishing a target template model, initializing a particle state, setting the number of particle samples, and calculating an HSV (hue, saturation, value) color histogram and an HOG (histogram of oriented gradient) histogram for the target template;
step 2, predicting according to a state transition equation, updating the state of each particle, obtaining a new predicted particle set, establishing a candidate target template, and calculating an HSV color histogram and an HOG direction gradient histogram of the candidate template;
step 3, fusing HSV color features and HOG directional gradient features, firstly adopting Bhattacharyya coefficient to measure the similarity between a target template and a candidate template, calculating the similarity coefficient of the HSV color histogram and the HOG directional gradient histogram of each particle in the step 2, and then carrying out a weighting fusion strategy according to different characteristic weight values to obtain a feature fusion observation model;
step 4, introducing a grayish wolf algorithm to carry out iterative optimization on the particles of the observation model obtained in the step 3, selecting the particles with high fitness to guide other particles to move towards the direction of the posterior probability high-likelihood region, stopping optimization when the iteration process reaches the maximum times, and otherwise, entering the step 3;
step 5, calculating the weight of the particles according to the latest observation information by the target state, carrying out normalization processing, carrying out a self-adaptive adjustment strategy on the weight of the particles, and estimating the target position and state according to the weight of the particles;
step 6, partial resampling, namely performing partial resampling operation on the high-weight particles and the low-weight particles obtained by calculation in the step 5 to obtain a new particle set;
step 7, updating the target template: an adaptive template selection scheme chooses between the initial template F_0 and a real-time updated motion template F_c, deciding whether to update the template by calculating the distance between the color histograms of the candidate region, the initial template, and the motion template;
and 8, repeating the steps 2 to 7, and continuously tracking the moving target of the next frame.
Compared with the prior art, the invention adopting the technical scheme has the outstanding characteristics that:
By weighted fusion of the target's HSV color feature and HOG feature, the feature description of the target is improved, avoiding the poor adaptability of a single feature to complex environmental conditions. The GWO algorithm optimizes the particle distribution so that each generation of particles approaches the posterior probability of the true system state, preventing the distribution from becoming overly concentrated or dispersed; at the same time, adaptive weight adjustment is applied to the particles before partial resampling, improving particle diversity and solving the particle degeneracy problem of the traditional particle filter. A new target template updating method improves the effectiveness of the target template, and the improved algorithm can still track accurately under illumination changes, pose displacement of the moving target, occlusion, and similar conditions.
Preferably, the further technical scheme of the invention is as follows:
preferably, the initialization phase in step 1 is performed according to the following steps:
step 1-1, manually selecting a tracking frame T = (x, y, W, H) to determine the target position, where x and y are the center coordinates of the rectangular frame and W and H are its width and height; setting the total number of particles N and initializing the particle states;
step 1-2, dividing the three HSV components into L = 16 × 4 × 4 levels, counting the pixels falling into each bin of each component, and finally obtaining the HSV color histogram of the target template, whose m-th bin p_m is expressed as:

p_m = C · Σ_{i=1}^{N} k(‖(x_0 − x_i)/h‖²) · δ[b(x_i) − m]

where N is the total number of pixels in the region; x_0 is the target center pixel; x_i is the coordinate of the i-th pixel; k(·) is the kernel profile with bandwidth h = √(H_x² + H_y²) (H_x and H_y are the half width and half height of the target rectangle); δ(·) is the Dirac delta function; the normalization coefficient is

C = 1 / Σ_{i=1}^{N} k(‖(x_0 − x_i)/h‖²);

b(x_i) is the bin index value corresponding to pixel x_i; and m ∈ [1, 256] indexes the histogram bins.
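As an illustration of this step, the kernel-weighted histogram can be sketched in Python (a minimal sketch rather than the MATLAB implementation used in the experiments; the Epanechnikov-style profile k(r) = max(1 − r, 0) is an assumption, since the patent does not specify the kernel):

```python
import numpy as np

def kernel_histogram(bin_idx, n_bins, center, half_w, half_h):
    """Kernel-weighted histogram p_m = C * sum_i k(||(x0 - xi)/h||^2) * delta[b(xi) - m].

    bin_idx: 2-D integer array holding the bin index b(x_i) of every pixel in the region.
    """
    H, W = bin_idx.shape
    ys, xs = np.mgrid[0:H, 0:W]
    h = np.hypot(half_w, half_h)                      # bandwidth h = sqrt(Hx^2 + Hy^2)
    r2 = ((xs - center[0]) ** 2 + (ys - center[1]) ** 2) / h ** 2
    k = np.maximum(1.0 - r2, 0.0)                     # assumed Epanechnikov-style profile
    p = np.bincount(bin_idx.ravel(), weights=k.ravel(), minlength=n_bins)
    return p / p.sum()                                # normalization coefficient C
```

Pixels near the target center thus contribute more to the histogram than pixels near the border, which suppresses background contamination at the edges of the tracking frame.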
Step 1-3: because the HOG operates on local grid cells of the image, it remains largely invariant to geometric and photometric deformations; when the target appearance model is described by HOG, it is insensitive to illumination change and resists interference from target translation and twisting. The gradients are computed as:

G_x(x, y) = H(x+1, y) − H(x−1, y)
G_y(x, y) = H(x, y+1) − H(x, y−1);

where H(x, y) is the pixel value at pixel (x, y), and G_x and G_y are the horizontal and vertical gradients, respectively. The gradient magnitude at pixel (x, y) is:

G(x, y) = √(G_x(x, y)² + G_y(x, y)²)

and the gradient direction is:

α(x, y) = arctan(G_y(x, y) / G_x(x, y))
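A minimal sketch of this gradient computation (central differences, then magnitude and orientation), assuming a grayscale image array; it illustrates the formulas above rather than reproducing the patent's implementation:

```python
import numpy as np

def image_gradients(img):
    """Central-difference gradients: Gx(x,y) = H(x+1,y) - H(x-1,y), Gy(x,y) = H(x,y+1) - H(x,y-1)."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # horizontal gradient (border left at zero)
    gy[1:-1, :] = img[2:, :] - img[:-2, :]   # vertical gradient (border left at zero)
    mag = np.hypot(gx, gy)                   # G = sqrt(Gx^2 + Gy^2)
    ang = np.arctan2(gy, gx)                 # gradient direction
    return gx, gy, mag, ang
```

In a full HOG pipeline the orientations would then be binned per cell and the cell histograms block-normalized; only the per-pixel step named in the text is shown here.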
preferably, step 3 is specifically performed according to the following steps:
step 3-1, the Bhattacharyya coefficient is:

ρ(p, q) = Σ_{u=1}^{n} √(p_u · q_u)

where n is the dimension of the histogram; ρ measures the similarity of the two histograms, and the larger its value, the closer the target model is to the candidate model;

step 3-2, using the formula of step 3-1 to calculate, for each particle, the similarity coefficient ρ_HSV^(i) of the HSV color histograms and ρ_HOG^(i) of the HOG directional gradient histograms, then obtaining the similarity weight of each particle by weighted fusion:

ω^(i) = α · ρ_HSV^(i) + β · ρ_HOG^(i)

where α and β are the weights of the color feature and the directional gradient feature, respectively; α is set to 1 and β to 0.75.
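The two operations of step 3 can be sketched as follows, using the α = 1, β = 0.75 values given in the text (the linear weighted combination is the natural reading of "weighted fusion"; the patent reproduces its exact formula as an image):

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient rho = sum_u sqrt(p_u * q_u) between two normalized histograms."""
    return float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))

def fused_similarity(rho_hsv, rho_hog, alpha=1.0, beta=0.75):
    """Weighted fusion of the HSV and HOG similarity coefficients of one particle."""
    return alpha * rho_hsv + beta * rho_hog
```

Identical normalized histograms give ρ = 1, and ρ decreases toward 0 as the histograms diverge, so the fused value directly serves as an (unnormalized) particle weight.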
Preferably, step 4 is specifically performed according to the following steps:
step 4-1, initializing a random wolf population in a search area;
step 4-2, in the iterative process, calculating the fitness value of each grey wolf, where the fitness is defined as:

f = |z_act − z_pred|

where z_act is the actual observed value of the system and z_pred is the predicted value of the filter;

step 4-3, saving the three grey wolves with the best fitness as α, β and δ; after the wolves determine the position of the prey, the α wolf leads the β and δ wolves to guide the pack to approach and surround the prey, the distance between an individual and the prey being:

D = |C · X_p(t) − X(t)|

and the grey wolf position updating formula being:

X(t+1) = X_p(t) − A · D

where X_p(t) is the position of the prey, X(t) is the position of the grey wolf, and A and C are coefficient vectors;

step 4-4, updating the positions of the other wolves (ω) through the hunting mathematical model:

D_α = |C_1 · X_α − X|,  D_β = |C_2 · X_β − X|,  D_δ = |C_3 · X_δ − X|
X_1 = X_α − A_1 · D_α,  X_2 = X_β − A_2 · D_β,  X_3 = X_δ − A_3 · D_δ

after the iteration, the final position vector of the wolf is:

X(t+1) = (X_1 + X_2 + X_3) / 3

step 4-5, updating the parameters A, C and a:

A = 2a · r_1 − a,  C = 2 · r_2

where r_1 and r_2 are random vectors in [0, 1], and the control parameter a decreases linearly from 2 to 0 over the iterations.
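The update rules above follow the standard grey wolf optimizer. A minimal runnable sketch (minimizing a simple test function in place of the filter residual |z_act − z_pred|, which is an illustrative substitution) is:

```python
import numpy as np

def gwo_step(wolves, fitness, a, rng):
    """One GWO iteration: rank the pack, then move each wolf toward the alpha/beta/delta leaders."""
    order = np.argsort([fitness(w) for w in wolves])      # smaller fitness is better
    x_a, x_b, x_d = wolves[order[0]], wolves[order[1]], wolves[order[2]]
    new = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        cand = []
        for leader in (x_a, x_b, x_d):
            A = 2 * a * rng.random(x.shape) - a           # A = 2a*r1 - a
            C = 2 * rng.random(x.shape)                   # C = 2*r2
            D = np.abs(C * leader - x)                    # D = |C*X_leader - X|
            cand.append(leader - A * D)                   # X_k = X_leader - A_k*D_k
        new[i] = np.mean(cand, axis=0)                    # X(t+1) = (X1 + X2 + X3)/3
    return new

def gwo_minimize(fitness, dim=2, n_wolves=12, iters=60, seed=0):
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(-5, 5, size=(n_wolves, dim))
    for t in range(iters):
        a = 2.0 * (1 - t / iters)                         # a decreases linearly from 2 to 0
        wolves = gwo_step(wolves, fitness, a, rng)
    return min(wolves, key=fitness)
```

In the tracker, each "wolf" is a particle state and the fitness is the observation residual, so the pack contracts toward the high-likelihood region of the posterior.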
Preferably, step 5 is specifically performed by the following steps:
step 5-1, setting a high weight threshold ω_H and a low weight threshold ω_L, and dividing all the particles into high, medium, and low classes according to their weights; the proportions of high-weight and low-weight particles in resampling are then adjusted to ensure particle diversity. Let the particle set undergoing weight adjustment be {x_t^(i), ω_t^(i)}, i = 1, …, N_s, where N_s is the total number of particles under weight adjustment; the average weight is calculated as:

ω̄ = (1/N_s) · Σ_{i=1}^{N_s} ω_t^(i)
the weight of the particles is adjusted as follows:
Figure BDA0002667906700000045
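The specific adjustment rule is given by the formula reproduced above as an image in the patent; the surrounding classification step and the average weight it relies on can be sketched as follows (the threshold values are free parameters of the method, not specified in the text):

```python
import numpy as np

def classify_weights(weights, w_high, w_low):
    """Split particles into high / medium / low classes by weight and return the mean weight."""
    w = np.asarray(weights, dtype=float)
    high = np.flatnonzero(w >= w_high)
    low = np.flatnonzero(w <= w_low)
    mid = np.flatnonzero((w > w_low) & (w < w_high))
    return high, mid, low, w.mean()          # mean weight = (1/Ns) * sum_i w_i
```

The high and low index sets feed the partial resampling of step 6, while the medium-weight particles pass through unchanged, which preserves diversity.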
Preferably, the partial resampling in step 6 performs a partial resampling operation on the high-weight and low-weight particles obtained in step 5; the new particle set is obtained by the following formula:
Figure BDA0002667906700000046
preferably, in step 7, the update rule of the template is:
Figure BDA0002667906700000047
The template F_c is updated by: F_c = γ·F_t + (1 − γ)·F_{t−1}
where γ is the template updating coefficient, set to 0.2 in the experiments; F_{t−1} and F_t are the templates of the previous and current frames, respectively.
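The exponential blending of the motion template can be sketched directly; the array stands in for whatever template representation (e.g. a fused histogram) is used:

```python
import numpy as np

def update_template(f_t, f_prev, gamma=0.2):
    """Template update F_c = gamma * F_t + (1 - gamma) * F_{t-1} (gamma = 0.2 per the patent)."""
    return gamma * np.asarray(f_t, dtype=float) + (1.0 - gamma) * np.asarray(f_prev, dtype=float)
```

A small γ makes the motion template change slowly, so a briefly occluded or distorted frame cannot corrupt it, which is the point of choosing adaptively between F_0 and F_c.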
Description of the drawings:
FIG. 1 is a flow chart of a method according to the present invention;
FIG. 2 shows the tracking effect of the designed method (PFGWO), the traditional particle filter algorithm, and the CMT algorithm on a moving target in the video sequence Tiger;
FIG. 3 shows the tracking effect of the designed method (PFGWO), the traditional particle filter algorithm, and the CMT algorithm on a moving target in the video sequence Girl;
FIG. 4 is a graph of the tracking precision of the algorithm of the invention versus the traditional particle filter algorithm and the CMT algorithm on the Tiger video sequence;
FIG. 5 is a graph of the tracking precision of the algorithm of the invention versus the traditional particle filter algorithm and the CMT algorithm on the Girl video sequence;
FIG. 6 compares the number of successfully tracked frames and the single-frame processing time of the designed method, the traditional particle filter method, and CMT over a tracking sequence of 120 frames in total.
Specific embodiments:
The invention is further illustrated by the following examples, which are intended only to aid understanding of the invention and therefore do not limit its scope.
The invention tracks the target according to the step flow of fig. 1, including the following steps:
step 1, an initialization stage, namely manually selecting a tracking rectangular frame on an initial frame image to determine a tracked moving target, establishing a target template model, initializing a particle state and setting the number of particle samples. The HSV color histogram and HOG directional gradient histogram are computed for the target template.
And 2, predicting according to a state transition equation, updating the state of each particle, obtaining a new predicted particle set, and establishing a candidate target template. And calculating HSV color histograms and HOG direction gradient histograms of the candidate templates.
And 3, fusing the HSV color feature and the HOG directional gradient feature, measuring the similarity of the target template and the candidate template by adopting a Bhattacharyya coefficient, and calculating the similarity coefficient of the HSV color histogram and the HOG directional gradient histogram of each particle in the step 2. And then carrying out a weighting fusion strategy according to different characteristic weight values to obtain a characteristic fusion observation model.
And 4, introducing a gray wolf algorithm to carry out iterative optimization on the particles of the observation model obtained in the step 3, and selecting the particles with high fitness to guide other particles to move towards the direction of the high-likelihood area with the posterior probability. And when the iteration process reaches the end of the maximum times, stopping optimization, and otherwise, entering the step 3.
And 5, calculating the weight of the particles according to the latest observation information in the target state, carrying out normalization processing, carrying out a self-adaptive adjustment strategy on the weight of the particles, and estimating the target position and state according to the weight of the particles.
And 6, performing partial resampling, namely performing partial resampling operation on the high-weight particles and the low-weight particles obtained by calculation in the step 5 to obtain a new particle set.
Step 7, updating the target template: an adaptive template selection scheme chooses between the initial template F_0 and the real-time updated motion template F_c, deciding whether to update the template by calculating the distance between the color histograms of the candidate region, the initial template, and the motion template.
And 8, repeating the steps 2 to 7, and continuously tracking the moving target of the next frame.
In the above steps, the algorithm of the invention was run in MATLAB-2014b on a desktop computer with a 2.6 GHz Intel Core i5-4210M CPU, 4 GB of memory, and a Windows 10 system; tracking experiments were performed on moving targets in video sequences, the Tiger and Girl sequence images being 640 × 480 and 128 × 96 pixels, respectively.
Preferably, the initialization phase in step 1 is performed according to the following steps:
step 1-1, manually selecting a tracking frame T ═ x, y, W, H, and determining the target position. Wherein x and y represent the central coordinate of the rectangular frame, W and H represent the width and height of the rectangular frame, set the total number N of particles, initialize the particle state;
step 1-2, dividing three HSV components into L-16 × 4 × 4 levels, counting the number of each pixel in each component, finally counting a histogram to obtain HSV color characteristics of a target template, and calculating the histogram of the target templatepExpressed as:
Figure BDA0002667906700000051
in the formula, N is the total number of the area pixels; x is the number of0Is a target center point pixel; x is the number ofiIs the coordinate of the ith pixel;
Figure BDA0002667906700000052
(Hxand HyHalf width and half height, respectively, of the target rectangle); is a dirac function; normalized coefficient of norm is
Figure BDA0002667906700000053
b(xi) Corresponds to a pixel xiA pixel index value of; m is an element of [1,256 ]]The range is indexed for the number of histogram segments.
Step 1-3, because the HOG is operated on the local grid unit of the image, the geometric and optical deformation of the image can be kept well unchanged, when the target appearance model is described by the HOG, the HOG is not influenced by illumination change, and the interference of target translation and torsion can be avoided, and the calculation formula is as follows:
Gx(x,y)=H(x+1,y)-H(x-1,y)
Gy(x,y)=H(x,y+1)-H(x,y-1);
in the formula, H (x, y) is a pixel value at the pixel point (x, y); gx,GyThe gradient in the horizontal direction and the gradient in the vertical direction are respectively, and the amplitude value at the pixel point (x, y) is as follows:
Figure BDA0002667906700000054
the gradient direction calculation formula is as follows:
Figure BDA0002667906700000055
preferably, step 3 is specifically performed according to the following steps:
step 3-1, the Bhattacharyya coefficient formula is as follows:
Figure BDA0002667906700000061
wherein n is the dimension of the histogram; the m value represents the similarity of the two histograms, and the larger the value is, the closer the target model is to the candidate model is;
step 3-2, calculating similarity coefficients of the HSV color histogram and the HOG direction gradient histogram of each particle by using the formula in the step 3-1
Figure BDA0002667906700000062
And
Figure BDA0002667906700000063
then, the similarity weight of each particle is obtained through weighted fusion, namely:
Figure BDA0002667906700000064
in the formula, alpha and beta are respectively the weight of the color characteristic and the direction gradient characteristic, the value of alpha is 1, and the value of beta is 0.75.
3. The sirius optimized particle filter target tracking algorithm based on feature fusion as claimed in claim 1, wherein step 4 is specifically performed according to the following steps:
step 4-1, initializing a random wolf population in a search area;
step 4-2, in the iterative process, calculating the fitness value of each gray wolf, wherein the gray wolf fitness is defined as:
f=|zact-zpred|;
in the formula, zactIs the actual observed value of the system, zpredIs the predicted value of the filter;
and 4-3, selecting three gray wolves with the best fitness to store as alpha and beta, wherein after the gray wolves determine the hunting position, the alpha gray wolve leader beta and the gray wolves guide a wolve group to approach and surround the hunting objects, and the distance between each individual and the hunting objects is as follows:
Figure BDA0002667906700000065
the grey wolf position updating formula is
Figure BDA0002667906700000066
4-4, updating the positions of other wolfs (omega) through the wolf hunting mathematical model;
Figure BDA0002667906700000067
Figure BDA0002667906700000068
after the iteration is finished, the final position vector of the wolf is as follows:
Figure BDA0002667906700000069
step 4-5, update the magnitude of the parameter A, C, a value
Figure BDA00026679067000000610
Preferably, the step 5 is specifically performed by the following steps:
step 5-1, setting a high weight threshold omegaHAnd a low weight threshold ωLDividing all the particles into three types of high, medium and low according to the weight value, then adjusting the proportion of the high-weight particles and the low-weight particles in resampling to ensure the diversity of the particles, and assuming that the particle set with weight value adjustment is
Figure BDA00026679067000000611
NsFor the total number of particles for weight adjustment, the average weight is calculated as:
Figure BDA0002667906700000071
the weight of the particles is adjusted as follows:
Figure BDA0002667906700000072
preferably, the partial resampling in step 6 is to perform partial resampling on the high-weight particles and the low-weight particles obtained by calculation in step 5, and a calculation formula for obtaining a new particle set is as follows:
Figure BDA0002667906700000073
preferably, in step 7, the update rule of the template is:
Figure BDA0002667906700000074
template FcThe update formula of (2) is: fc=γFt+(1-γ)Ft-1
In the formula, gamma represents a template updating coefficient, and the value of gamma is 0.2 in an experiment; ft-1,FtRepresenting the templates of the previous and current frames, respectively.
An example of applying the invention to moving target tracking in a video sequence is given below. Target tracking precision is obtained by measuring the Euclidean distance between the tracked target position and the ground-truth center position and plotting the precision curve of each algorithm. A video frame whose overlap rate (the overlapping pixel area of the actual tracking box and the sequence's ground-truth box, divided by the pixel area of the ground-truth box) exceeds the threshold of 0.8 is defined as a tracking success, from which the tracking success rate is calculated. These two evaluation indices are used to analyze the algorithm of the invention quantitatively, as shown in FIGS. 2-6.
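The two evaluation indices can be sketched as follows (axis-aligned boxes given as (x, y, w, h) tuples; the normalization by the ground-truth box area follows the definition in the text):

```python
import math

def center_error(box, gt):
    """Euclidean distance between the centers of the tracked box and the ground-truth box."""
    cx, cy = box[0] + box[2] / 2, box[1] + box[3] / 2
    gx, gy = gt[0] + gt[2] / 2, gt[1] + gt[3] / 2
    return math.hypot(cx - gx, cy - gy)

def overlap_rate(box, gt):
    """Overlapping pixel area of the two boxes divided by the ground-truth box area."""
    ix = max(0.0, min(box[0] + box[2], gt[0] + gt[2]) - max(box[0], gt[0]))
    iy = max(0.0, min(box[1] + box[3], gt[1] + gt[3]) - max(box[1], gt[1]))
    return (ix * iy) / (gt[2] * gt[3])

def success_rate(boxes, gts, threshold=0.8):
    """Fraction of frames whose overlap rate reaches the threshold (a tracking success)."""
    wins = sum(overlap_rate(b, g) >= threshold for b, g in zip(boxes, gts))
    return wins / len(gts)
```

Note this overlap rate normalizes by the ground-truth area only, as the text defines it, rather than by the union area as in the usual IoU measure.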
Example 1: the algorithm of the invention, the traditional particle filter algorithm, and the CMT algorithm performed a target tracking experiment on the first 120 frames of the Tiger video sequence. In the Tiger sequence, the target doll is partially occluded by a green plant at frames 24 and 50; at frame 91 the tracked target moves under a desk lamp, where reflected light makes the doll's face intensely illuminated and bright, and the color features disappear; from frame 24 to frame 120, the scale of the tracked target changes as its distance from the camera changes. As the tracking comparison experiments of FIGS. 2 and 3 show: when the target is occluded, the traditional particle filter drifts because its target template is not updated, and at frame 91, because it relies on a single RGB feature, the target is easily lost when the features change or are not distinctive; although the CMT algorithm handles scale change and rotation better than the traditional particle filter, it still does not solve occlusion. The algorithm of the invention still tracks the target position accurately from frame 1 to the end of the 120-frame sequence, reaching a tracking accuracy of 97.5% with a single-frame processing time of 24.6 ms, and can cope with complex environmental change, occlusion, rotation, and similar influences.
Example 2: the algorithm of the invention, the traditional particle filter algorithm, and the CMT algorithm performed a target tracking experiment on the first 120 frames of the Girl video sequence. Starting at frame 54 of the Girl sequence, the target person turns, the pose changes, and the facial features gradually disappear; at frame 90 only the side of the girl's face is visible and the facial features have completely disappeared; from frames 112 to 120 the features gradually recover. As the experimental results shown in FIGS. 3 and 4 indicate, all three algorithms track the face effectively over the first 54 frames of motion; at frame 90 of the sequence, when the head is turned and the facial features are lost, the traditional PF algorithm and the CMT algorithm cannot track effectively; the GWOPF algorithm remains unaffected throughout the sequence, reaching a target tracking success rate of 95% with a single-frame processing time of 18.4 ms, which gives it a clear advantage over the other two algorithms.
The method takes the traditional particle filter tracking algorithm as its framework and introduces the grey wolf algorithm to optimize the particle distribution, moving the particle probability distribution toward the high-likelihood region by simulating the predation behavior of a wolf pack. A feature-fusion grey wolf optimized particle filter tracking algorithm is proposed and used to track moving targets in challenging video sequences. The HSV color feature and the HOG oriented-gradient feature of the moving target in the initial frame of the video sequence are fused by weighting to establish the target model, improving the robustness of target tracking; the grey wolf algorithm guides particle movement, improving the rationality of the particle distribution, while the particle weights are adaptively adjusted during resampling to improve particle diversity; and a new target-template update mode solves the problem that the template cannot be matched to the target in time when the target is occluded.
By weight-fusing the target's HSV color feature and HOG feature, the method improves the algorithm's description of the target and avoids the poor adaptation of any single feature to complex environmental conditions. The GWO algorithm optimizes the particle distribution so that each generation of particles approaches the posterior probability of the true system state, preventing the particles from becoming overly concentrated or dispersed; the weights are adaptively adjusted before partial resampling, which improves particle diversity and alleviates the particle-degeneracy problem of the traditional particle filter; and a new target-template update method improves the validity of the target template, so that the improved algorithm still tracks accurately under illumination change, pose displacement of the moving target, occlusion and similar conditions.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the scope of the present invention, which is defined in the appended claims.

Claims (7)

1. A grey wolf optimized particle filter target tracking algorithm based on feature fusion, characterized in that, within a particle filter tracking framework, particle movement is guided by the grey wolf position-update mechanism of the grey wolf algorithm to optimize the traditional particle filter; a target feature-fusion model is established to describe the target template; the particle weights are adaptively adjusted to increase particle diversity; and a template update method selects the target template; the method is carried out according to the following steps:
step 1, in an initialization stage, manually selecting a tracking rectangular frame on an initial frame image to determine a tracked moving target, establishing a target template model, initializing a particle state, setting the number of particle samples, and calculating an HSV (hue, saturation, value) color histogram and an HOG (histogram of oriented gradient) histogram for the target template;
step 2, predicting according to a state transition equation, updating the state of each particle, obtaining a new predicted particle set, establishing a candidate target template, and calculating an HSV color histogram and an HOG direction gradient histogram of the candidate template;
step 3, fusing the HSV color feature and the HOG oriented-gradient feature: measuring the similarity between the target template and each candidate template with the Bhattacharyya coefficient, calculating the similarity coefficients of the HSV color histogram and the HOG oriented-gradient histogram for each particle obtained in step 2, and then applying a weighted fusion with different feature weights to obtain the feature-fusion observation model;
step 4, introducing the grey wolf algorithm to iteratively optimize the particles of the observation model obtained in step 3, selecting the particles with the highest fitness to guide the remaining particles toward the high-likelihood region of the posterior probability; stopping the optimization when the iteration count reaches the maximum, otherwise returning to step 3;
step 5, calculating the particle weights from the latest observation of the target state, normalizing them, applying an adaptive adjustment strategy to the particle weights, and estimating the target position and state from the particle weights;
step 6, partial resampling, namely performing partial resampling operation on the high-weight particles and the low-weight particles obtained by calculation in the step 5 to obtain a new particle set;
step 7, updating the target template: an adaptive template-selection update mode uses the initial template F0 and a real-time updated motion template Fc, choosing whether to update the template by computing the distances between the color histogram of the candidate region and those of the initial template and the motion template;
step 8, repeating steps 2 to 7 to continuously track the moving target in the next frame.
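The steps above can be sketched as a minimal particle filter loop. This is a hedged illustration, not the claimed implementation: a random-walk transition model and a toy Gaussian-in-distance likelihood stand in for the fused HSV+HOG score, and the GWO refinement and template update of steps 4 and 7 are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, sigma=2.0):
    """Step 2: propagate each particle through a random-walk transition model."""
    return particles + rng.normal(0.0, sigma, particles.shape)

def likelihood(particles, target):
    """Step 3 stand-in: a Gaussian in distance replaces the fused HSV+HOG score."""
    d = np.linalg.norm(particles - target, axis=1)
    return np.exp(-0.5 * (d / 5.0) ** 2)

def estimate(particles, weights):
    """Step 5: the weighted mean of the particles gives the state estimate."""
    return weights @ particles / weights.sum()

def resample(particles, weights):
    """Step 6 simplified: full systematic resampling instead of partial."""
    idx = rng.choice(len(particles), size=len(particles), p=weights / weights.sum())
    return particles[idx]

target = np.array([50.0, 50.0])                  # toy object centre
particles = rng.normal(target, 10.0, (200, 2))   # step 1: initialise N particles

for _ in range(10):                              # step 8: repeat per frame
    particles = predict(particles)
    w = likelihood(particles, target)
    est = estimate(particles, w)
    particles = resample(particles, w)

print(np.round(est, 1))                          # estimate settles near (50, 50)
```

In the claimed method the likelihood would come from the fused histogram similarity of step 3, and the particle set would be refined by the GWO iteration before weighting.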
2. The grey wolf optimized particle filter target tracking algorithm based on feature fusion as claimed in claim 1, wherein the initialization phase of step 1 is performed according to the following steps:
step 1-1, manually selecting a tracking frame T = (x, y, W, H) to determine the target position, where x and y are the center coordinates of the rectangular frame and W and H are its width and height; setting the total number of particles N and initializing the particle states;
step 1-2, dividing the three HSV components into L = 16 × 4 × 4 = 256 levels, counting the number of pixels falling in each level, and accumulating the histogram to obtain the HSV color feature of the target template; the histogram p of the target template is expressed as:

p(u) = C Σ_{i=1..N} k(||(x0 − xi)/H||²) δ[b(xi) − u]

where N is the total number of pixels in the region; x0 is the pixel at the target center; xi is the coordinate of the i-th pixel; k(·) is the kernel profile; H = (Hx, Hy), with Hx and Hy the half-width and half-height of the target rectangle; δ(·) is the Dirac delta function; the normalization coefficient is

C = 1 / Σ_{i=1..N} k(||(x0 − xi)/H||²)

b(xi) is the histogram bin index of pixel xi; and u ∈ [1, 256] ranges over the histogram bins.
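The kernel-weighted histogram of step 1-2 can be sketched as below. The Epanechnikov-style profile k(r) = 1 − r and the toy bin assignment are assumptions for illustration, since the claim gives the kernel only as an image.

```python
import numpy as np

def kernel_histogram(bin_idx, coords, center, half_size, n_bins=256):
    """p(u) = C * sum_i k(||(x0 - xi)/H||^2) * delta[b(xi) - u],
    with the assumed profile k(r) = 1 - r for r < 1, else 0."""
    r2 = np.sum(((coords - center) / half_size) ** 2, axis=1)
    k = np.clip(1.0 - r2, 0.0, None)            # pixels outside the ellipse get weight 0
    p = np.bincount(bin_idx, weights=k, minlength=n_bins)
    return p / p.sum()                          # normalisation constant C

# toy 8x8 patch: hypothetical quantised HSV bin per pixel, plus coordinates
ys, xs = np.mgrid[0:8, 0:8]
coords = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
bins = (xs.ravel() * 8 + ys.ravel()) % 256
p = kernel_histogram(bins, coords, center=np.array([3.5, 3.5]),
                     half_size=np.array([4.0, 4.0]))
print(p.sum())  # 1.0 up to floating point: a valid probability histogram
```

The kernel down-weights pixels far from the target center, so background pixels near the rectangle's corners contribute little to the template.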
step 1-3, because the HOG is computed on local cells of the image, it remains largely invariant to geometric and photometric deformation; when the target appearance model is described with HOG, it is insensitive to illumination change and resists interference from target translation and twisting; the gradient calculation formulas are:
Gx(x, y) = H(x+1, y) − H(x−1, y)
Gy(x, y) = H(x, y+1) − H(x, y−1)

where H(x, y) is the pixel value at pixel (x, y), and Gx and Gy are the horizontal and vertical gradients, respectively; the gradient magnitude at pixel (x, y) is:

G(x, y) = sqrt(Gx(x, y)² + Gy(x, y)²)

and the gradient direction is:

θ(x, y) = arctan(Gy(x, y) / Gx(x, y))
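A minimal sketch of these gradient formulas on a toy image; boundary pixels are simply left at zero, which is a simplification, not part of the claim.

```python
import numpy as np

def gradients(img):
    """Central differences: Gx(x,y) = H(x+1,y) - H(x-1,y),
    Gy(x,y) = H(x,y+1) - H(x,y-1), then magnitude and direction."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    magnitude = np.hypot(gx, gy)      # sqrt(Gx^2 + Gy^2)
    orientation = np.arctan2(gy, gx)  # gradient direction theta(x, y)
    return magnitude, orientation

# a vertical step edge: the gradient should point horizontally
img = np.zeros((5, 5))
img[:, 3:] = 10.0
mag, ori = gradients(img)
print(mag[2, 2], ori[2, 2])  # 10.0 0.0
```

In a full HOG descriptor these orientations would then be binned per cell and block-normalized; the claim uses only the raw gradient formulas above.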
3. The grey wolf optimized particle filter target tracking algorithm based on feature fusion as claimed in claim 1, wherein step 3 is specifically performed according to the following steps:
step 3-1, the Bhattacharyya coefficient formula is:

m(p, q) = Σ_{u=1..n} sqrt(p(u) q(u))

where n is the dimension of the histogram; m measures the similarity of the two histograms, and the larger its value, the closer the target model is to the candidate model;

step 3-2, using the formula of step 3-1 to calculate, for each particle, the similarity coefficients m_HSV of the HSV color histograms and m_HOG of the HOG oriented-gradient histograms, and then obtaining the similarity weight of each particle by weighted fusion:

ω_i = α · m_HSV + β · m_HOG

where α and β are the weights of the color feature and the oriented-gradient feature, respectively, with α = 1 and β = 0.75.
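The Bhattacharyya similarity and the weighted fusion of step 3 can be sketched directly, using the α = 1, β = 0.75 values stated in the claim:

```python
import numpy as np

def bhattacharyya(p, q):
    """m(p, q) = sum_u sqrt(p(u) * q(u)); equals 1.0 for identical
    normalised histograms and approaches 0 for disjoint ones."""
    return float(np.sum(np.sqrt(p * q)))

def fused_weight(m_hsv, m_hog, alpha=1.0, beta=0.75):
    """omega_i = alpha * m_HSV + beta * m_HOG (claim 3 fusion)."""
    return alpha * m_hsv + beta * m_hog

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.5, 0.3, 0.2])
print(bhattacharyya(p, q))     # 1.0 for identical histograms
print(fused_weight(1.0, 1.0))  # 1.75 at perfect similarity on both cues
```

Because α > β, the color cue dominates the fused weight, while the gradient cue keeps the score informative when color becomes unreliable (as in the frame-91 reflection case of Example 1).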
4. The grey wolf optimized particle filter target tracking algorithm based on feature fusion as claimed in claim 1, wherein step 4 is specifically performed according to the following steps:
step 4-1, initializing a random wolf population in a search area;
step 4-2, in each iteration, calculating the fitness value of every grey wolf, the fitness being defined as:

f = |z_act − z_pred|

where z_act is the actual observation of the system and z_pred is the predicted value of the filter;

step 4-3, saving the three grey wolves with the best fitness as α, β and δ; once the prey position is determined, the α wolf leads the β and δ wolves in guiding the pack to approach and encircle the prey, the distance between an individual and the prey being:

D = |C · X_p(t) − X(t)|

and the grey wolf position update formula being:

X(t+1) = X_p(t) − A · D

where X_p(t) is the position of the prey and X(t) the position of a grey wolf at iteration t;

step 4-4, updating the positions of the remaining wolves (ω) through the hunting mathematical model:

D_α = |C1 · X_α − X|, D_β = |C2 · X_β − X|, D_δ = |C3 · X_δ − X|
X1 = X_α − A1 · D_α, X2 = X_β − A2 · D_β, X3 = X_δ − A3 · D_δ

so that after the iteration the final position vector of a wolf is:

X(t+1) = (X1 + X2 + X3) / 3

step 4-5, updating the parameters A, C and a:

A = 2a · r1 − a, C = 2 · r2, a = 2(1 − t/t_max)

where r1 and r2 are random vectors in [0, 1] and a decreases linearly from 2 to 0 over the iterations.
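The update equations of steps 4-3 to 4-5 follow the standard grey wolf optimizer. A 1-D sketch under assumptions: the fitness here is a toy distance-to-optimum rather than the filter residual of step 4-2, and the population size and iteration count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def gwo_step(wolves, fitness, a):
    """One GWO iteration: every wolf moves toward the alpha, beta and delta
    leaders via D_k = |C_k*X_k - X|, X_k' = X_k - A_k*D_k, then averages."""
    order = np.argsort(fitness(wolves))
    leaders = wolves[order[:3]]             # alpha, beta, delta (best fitness)
    new = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        cand = []
        for lead in leaders:
            r1, r2 = rng.random(2)
            A = 2 * a * r1 - a              # A = 2a*r1 - a
            C = 2 * r2                      # C = 2*r2
            D = abs(C * lead - x)           # distance to the leader
            cand.append(lead - A * D)       # encircling move
        new[i] = np.mean(cand)              # X(t+1) = (X1 + X2 + X3)/3
    return new

f = lambda w: np.abs(w - 3.0)               # toy fitness: distance to optimum 3.0
wolves = rng.uniform(-10, 10, 20)
t_max = 30
for t in range(t_max):
    a = 2.0 * (1 - t / t_max)               # a decays linearly from 2 to 0
    wolves = gwo_step(wolves, f, a)
print(round(float(wolves.mean()), 2))       # the pack clusters near the optimum
```

In the claimed tracker each "wolf" is a particle state, and the decaying a shifts the search from exploration (|A| > 1) to exploitation (|A| < 1), which is what drives the particles into the high-likelihood region.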
5. The grey wolf optimized particle filter target tracking algorithm based on feature fusion as claimed in claim 1, wherein step 5 is specifically implemented by the following steps:
step 5-1, setting a high weight threshold ωH and a low weight threshold ωL, dividing all particles into high-, medium- and low-weight classes by weight, and then adjusting the proportion of high-weight and low-weight particles taking part in resampling so as to preserve particle diversity; denoting the weight-adjusted particle set by {x_i, ω_i}, i = 1, …, Ns, with Ns the total number of weight-adjusted particles, the average weight is calculated as:

ω̄ = (1/Ns) Σ_{i=1..Ns} ω_i

and the particle weights are adjusted as follows: [adjustment formula given as an image in the original filing].
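The exact adjustment rule is given only as an image in the original filing, so the rule below, which pulls weights above ωH and below ωL toward the average weight and renormalizes, is an assumption that merely illustrates the three-class split of step 5-1:

```python
import numpy as np

def adjust_weights(w, w_high, w_low, pull=0.5):
    """Assumed rule: shrink high weights toward, and raise low weights toward,
    the mean weight (1/Ns)*sum(w); medium weights are untouched."""
    mean = w.mean()
    out = w.copy()
    hi, lo = w > w_high, w < w_low
    out[hi] = w[hi] - pull * (w[hi] - mean)
    out[lo] = w[lo] + pull * (mean - w[lo])
    return out / out.sum()                   # renormalise to sum to 1

w = np.array([0.60, 0.20, 0.10, 0.06, 0.04])
adj = adjust_weights(w, w_high=0.4, w_low=0.05)
print(adj.round(3))   # spread is compressed; weights still sum to 1
```

Compressing the weight spread this way keeps a few dominant particles from monopolizing the resampling step, which is the diversity effect the claim describes.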
6. The grey wolf optimized particle filter target tracking algorithm based on feature fusion as claimed in claim 1, wherein the partial resampling of step 6 performs a partial resampling operation on the high-weight and low-weight particles obtained in step 5, the calculation formula for obtaining the new particle set being: [formula given as an image in the original filing].
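Since the claim-6 formula is an image in the original filing, the copy-and-replace scheme below is an assumption: it resamples only the extreme-weight particles, leaving the medium-weight ones untouched, which is the stated intent of partial resampling.

```python
import numpy as np

def partial_resample(particles, w, w_high, w_low):
    """Assumed partial resampling: replace each low-weight particle with a
    copy of a randomly chosen high-weight one; keep medium-weight particles."""
    hi = np.flatnonzero(w > w_high)
    lo = np.flatnonzero(w < w_low)
    out = particles.copy()
    if hi.size and lo.size:
        rng = np.random.default_rng(0)
        out[lo] = particles[rng.choice(hi, size=lo.size)]
    return out

pts = np.arange(5, dtype=float)
w = np.array([0.60, 0.20, 0.10, 0.06, 0.04])
new = partial_resample(pts, w, w_high=0.4, w_low=0.05)
print(new)  # [0. 1. 2. 3. 0.] - the low-weight particle is replaced
```

Leaving the medium-weight particles alone is what distinguishes this from full resampling and preserves the diversity gained in step 5.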
7. The grey wolf optimized particle filter target tracking algorithm based on feature fusion as claimed in claim 1, wherein in step 7 the template update rule is: [rule given as an image in the original filing]

the motion template Fc is updated as: Fc = γFt + (1 − γ)Ft−1

where γ is the template update coefficient, taken as 0.2 in the experiments, and Ft−1 and Ft represent the templates of the previous frame and the current frame, respectively.
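The running-template blend Fc = γFt + (1 − γ)Ft−1 with γ = 0.2 can be sketched on toy histogram templates; the selection rule between F0 and Fc is an image in the original and is not reproduced here.

```python
import numpy as np

def update_template(f_t, f_prev, gamma=0.2):
    """Fc = gamma*Ft + (1 - gamma)*Ft-1: blend the current-frame template
    into the motion template with learning rate gamma."""
    return gamma * f_t + (1.0 - gamma) * f_prev

f_prev = np.array([0.5, 0.5])  # template from the previous frame
f_t = np.array([1.0, 0.0])     # template observed in the current frame
f_c = update_template(f_t, f_prev)
print(f_c)  # [0.6 0.4]
```

The small γ means a suddenly occluded or corrupted observation only mildly perturbs the template, which is how the update mode avoids locking onto an occluder.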
CN202010924590.2A 2020-09-05 2020-09-05 Gray wolf optimization particle filter target tracking algorithm based on feature fusion Pending CN112184762A (en)


Publications (1)

Publication Number Publication Date
CN112184762A true CN112184762A (en) 2021-01-05



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210105