CN110276383B - Kernel correlation filtering target positioning method based on multi-channel memory model - Google Patents

Kernel correlation filtering target positioning method based on multi-channel memory model

Info

Publication number
CN110276383B
CN110276383B (application number CN201910471284.5A)
Authority
CN
China
Prior art keywords
memory space
feature
gray
gray scale
kcf
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910471284.5A
Other languages
Chinese (zh)
Other versions
CN110276383A (en)
Inventor
宫琳
莫振冲
唐圣
陈西
谢剑
刘壮
林颖捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201910471284.5A
Publication of CN110276383A
Application granted
Publication of CN110276383B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis

Abstract

The invention provides a kernel correlation filtering target positioning method based on a multi-channel memory model. First, an update model based on multi-channel memory is established: a control channel is used to memorize the gray-scale features of the target, and two execution channels are used to memorize the gradient direction histogram features and the KCF classifier parameters. The established multi-channel memory model is then introduced into the KCF classifier, so that the model memorizes the gray-scale features, gradient direction histogram features and KCF classifier parameters corresponding to the target in previously observed image scenes. When the target is occluded, disturbed by similar objects or undergoes pose changes, the memorized information related to the target, namely the gradient direction histogram features and the KCF classifier parameters, is retrieved by gray-scale feature matching, so that the position of the target in the current frame image can be located accurately and quickly.

Description

Kernel correlation filtering target positioning method based on multi-channel memory model
Technical Field
The invention belongs to the technical field of moving target tracking based on computer vision, and particularly relates to a kernel correlation filtering target positioning method based on a multi-channel memory model.
Background
Moving target tracking is an important research direction in the field of computer vision, with wide applications in intelligent security, visual surveillance, human-computer interaction and other fields. The main interference factors in current target tracking technology are: 1) changes caused by the target's surroundings, including occlusion of the target, illumination changes and interference from similar objects; 2) changes caused by the target itself, including geometric deformation, rotation and pose changes. These factors make accurate and robust target tracking in complex scenes still very challenging.
Target tracking methods are mainly divided into generative methods and discriminative methods. Generative methods represent the target by learning an appearance model, search within the image region, and take the region most similar to the appearance model as the target. Discriminative methods train a binary classifier on existing samples, classify the search region with this classifier, and take the point that maximizes the classifier's confidence value as the target position, thereby separating the target from the background. Discriminative tracking methods do not need to learn a complex appearance model, have a small computational cost and a relatively high tracking speed, and have therefore become the mainstream tracking approach.
As a novel discriminative target tracking method, the Kernel Correlation Filter (KCF) algorithm constructs the training samples of the classifier by cyclic shifts, so that the data matrix becomes a circulant matrix. Based on the properties of circulant matrices, the solution of the problem is then transferred to the Fourier domain, which avoids matrix inversion. The method therefore has high computational efficiency, handles moving target tracking well under nonlinear conditions, and offers both high tracking speed and good robustness. However, because the kernel correlation filter's update and learning model is built only from the current frame, the influence of previous frames is ignored; when the target undergoes abrupt pose changes, severe occlusion, interference from similar targets and the like, the model loses the target's feature values, so the tracking accuracy drops and tracking may even fail.
To address these problems, researchers have proposed a scale- and rotation-adaptive tracking algorithm based on kernel correlation filtering and designed an occlusion detection module, which reduces the influence of occlusion on tracking accuracy. However, this algorithm estimates scale and rotation changes from feature point matching, and when the target region contains few feature points or is small, tracking accuracy is still hard to guarantee. Other researchers have proposed part-based correlation tracking methods, which segment the tracked target into different parts, learn features and track each part separately, and then fuse the parts to obtain the final target position. Although such algorithms can track the target robustly in real time, the partitioning process severely reduces the tracking speed.
Disclosure of Invention
To solve these problems, the invention provides a kernel correlation filtering target positioning method based on a multi-channel memory model, which can accurately and quickly locate the target in the current frame image when the target is occluded, disturbed by similar objects or undergoes pose changes.
A kernel correlation filtering target positioning method based on a multi-channel memory model comprises the following steps:
S1: acquiring, for at least five frame images preceding the current frame, the target area, the gray-scale features of the target area and the gradient direction histogram features corresponding to the gray-scale features, wherein the gradient direction histogram features are used to represent the position of the target area in the image;
S2: sequentially taking the target area corresponding to each frame image as input and the gray-scale features of the target area as output, and training a KCF classifier to obtain the KCF classifier parameters corresponding to each frame image;
S3: storing the gray-scale features, the gradient direction histogram features and the KCF classifier parameters corresponding to each frame image in a multi-channel memory model in one-to-one correspondence, wherein the multi-channel memory model comprises a control channel, a first execution channel and a second execution channel, each of which is divided into a transient memory space, a short-term memory space and a long-term memory space;
wherein the transient memory space of the control channel is used to store the gray-scale feature q_t of the target area in the current frame image, the short-term memory space and the long-term memory space are both used to store the gray-scale features of the target area in previous frame images, and in the initial state the transient memory space of the control channel is empty;
the transient memory space of the first execution channel is used to store the KCF classifier parameters used when acquiring the gray-scale feature q_t of the current frame image, the short-term memory space and the long-term memory space are both used to store the KCF classifier parameters obtained by training on the gray-scale features of previous frame images, and in the initial state the transient memory space of the first execution channel is empty;
the transient memory space of the second execution channel is used to store the gradient direction histogram feature of the current frame image, the short-term memory space and the long-term memory space are both used to store the gradient direction histogram features of previous frame images, and in the initial state the transient memory space of the second execution channel is empty;
S4: acquiring a new frame of image as the current frame image, and classifying the current frame image using the KCF classifier parameters stored at the first position of the short-term memory space of the first execution channel to obtain the gray-scale feature q_t of the current frame image; then storing the gray-scale feature q_t in the transient memory space of the control channel, storing the KCF classifier parameters at the first position of the short-term memory space of the first execution channel in the transient memory space of the first execution channel, and storing the gradient direction histogram feature corresponding to the gray-scale feature q_t in the transient memory space of the second execution channel;
S5: sequentially computing, in the storage order of the gray-scale features in the short-term memory space of the control channel, the similarity ρ_d between the gray-scale feature q_t in the transient memory space and each gray-scale feature in the short-term memory space, until a similarity ρ_d is greater than a first set threshold T_d; the gray-scale feature q_t and that gray-scale feature q_td are then successfully matched, and the gradient direction histogram feature corresponding to the gray-scale feature q_td is obtained, thereby locating the target in the current frame image; if the gray-scale feature q_t fails to match all gray-scale features in the short-term memory space, proceeding to step S6;
wherein, if the gray-scale feature q_t and the gray-scale feature q_td are successfully matched, the gray-scale feature q_td is moved to and stored at the first position of the short-term memory space and the storage positions of the other gray-scale features in the short-term memory space are moved back in turn; while the gray-scale features in the control channel are moved, the storage positions of the KCF classifier parameters in the first execution channel and of the gradient direction histogram features in the second execution channel are moved in the same way;
S6: sequentially computing, in the storage order of the gray-scale features in the long-term memory space of the control channel, the similarity ρ_c between the gray-scale feature q_t in the transient memory space and each gray-scale feature in the long-term memory space, until a similarity ρ_c is greater than a second set threshold T_c; the gray-scale feature q_t and that gray-scale feature q_tc are then successfully matched, and the gradient direction histogram feature corresponding to the gray-scale feature q_tc is obtained, thereby locating the target in the current frame image; if the gray-scale feature q_t fails to match all gray-scale features in the long-term memory space, the position, in the current frame image, of the target area represented by the gradient direction histogram feature corresponding to the gray-scale feature q_t is used to locate the target;
wherein, if the gray-scale feature q_t and the gray-scale feature q_tc are successfully matched, the gray-scale feature q_tc is moved to and stored at the first position of the short-term memory space, the storage positions of the gray-scale features arranged before the gray-scale feature q_tc in the long-term memory space are moved back in turn, and the storage positions of the other gray-scale features remain unchanged; while the gray-scale features in the control channel are moved, the storage positions of the KCF classifier parameters in the first execution channel and of the gradient direction histogram features in the second execution channel are moved in the same way, and the KCF classifier parameters corresponding to the gray-scale feature q_tc are used to acquire the gray-scale feature of the next frame image.
Further, in step S6, after the target area represented by the gradient direction histogram feature corresponding to the gray-scale feature q_t has been located in the current frame image, the gray-scale feature q_t is stored at the first position of the short-term memory space, and it is judged whether the number of successful matches of the gray-scale feature q_N at the last position of the short-term memory space is greater than a set threshold T_M; if not, the gray-scale feature q_N is discarded; if so, the gray-scale feature q_N is stored at the first position of the long-term memory space, the gray-scale feature at the last position of the long-term memory space is discarded, and the storage positions of the other gray-scale features in the short-term and long-term memory spaces are moved back in turn. When the gray-scale features in the control channel are discarded and moved, the KCF classifier parameters in the first execution channel and the gradient direction histogram features in the second execution channel are discarded and moved in the same way.
Further, if the gray-scale feature q_t and the gray-scale feature q_td are successfully matched, the gray-scale feature stored at the first position of the short-term memory space in the control channel is updated according to a first set rule, the KCF classifier parameters stored at the first position of the short-term memory space in the first execution channel are updated according to a second set rule, and the gradient direction histogram feature stored at the first position of the short-term memory space in the second execution channel is updated according to a third set rule; the updated gray-scale feature, KCF classifier parameters and gradient direction histogram feature are used for target positioning in the next frame image;
the first set rule is:
q_th = (1 - ε) q_tb + ε q_ta
where q_th is the updated gray-scale feature, q_ta is the gray-scale feature successfully matched with the gray-scale feature q_t, q_tb is the gray-scale feature stored at the first position of the short-term memory space in the control channel, and ε is the set gray-scale feature update rate;
the second set rule is:
α_th = (1 - β) α_tb + β α_ta
where α_th is the updated KCF classifier parameter, α_tb is the KCF classifier parameter corresponding to the gray-scale feature successfully matched with the gray-scale feature q_t, α_ta is the KCF classifier parameter stored at the first position of the short-term memory space in the first execution channel, and β is the set KCF classifier parameter update rate;
the third set rule is:
x_th = (1 - β) x_tb + β x_ta
where x_th is the updated gradient direction histogram feature, x_tb is the gradient direction histogram feature corresponding to the gray-scale feature successfully matched with the gray-scale feature q_t, and x_ta is the gradient direction histogram feature stored at the first position of the short-term memory space in the second execution channel.
Advantageous effects:
the invention provides a nuclear correlation filtering target positioning method based on a multi-channel memory model, which comprises the steps of firstly, establishing an updating model based on the multi-channel memory, adopting a gray scale feature of a control channel memory target, adopting two execution channels to memorize a gradient direction histogram feature and a KCF classifier parameter; and then, introducing the established multi-channel memory model into a KCF classifier, so that the multi-channel memory model can memorize the gray scale characteristics, the gradient direction histogram characteristics and the KCF classifier parameters corresponding to the target in the image scene appearing before, and when the target is shielded, subjected to similar interference and subjected to posture transformation, the memory information related to the target, namely the gradient direction histogram characteristics and the KCF classifier parameters are extracted in a gray scale characteristic matching mode to accurately and quickly locate the position of the target in the current frame image.
Drawings
FIG. 1 is a flowchart of the kernel correlation filtering target positioning method based on a multi-channel memory model according to the present invention;
FIG. 2 is a diagram of a multi-channel memory model according to the present invention.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Memory is the process by which the human brain encodes, stores and retrieves information from the outside world, and can be divided into three stages: transient memory, short-term memory and long-term memory. The human memory system can be described as a multi-dimensional, multi-channel memory system (MEM) composed of basic cognitive elements. Memory is also a key cognitive process of the brain: when a new object is recognized, the memorized information related to that object can be retrieved, which accelerates recognition and helps adaptation to new environments. The memory mechanism of the Human Visual System (HVS) therefore has important applications in target detection, recognition and tracking, and combining a memory mechanism with a discriminative target tracking method can address abrupt pose changes, severe occlusion, interference from similar targets and other problems in the tracking process. Based on this understanding, the present embodiment establishes a multi-channel memory model to deal with these problems.
Referring to FIG. 1, which is the flowchart of the kernel correlation filtering target positioning method based on a multi-channel memory model according to this embodiment, the method comprises the following steps:
S1: acquire, for at least five frame images preceding the current frame, the target area, the gray-scale features of the target area and the gradient direction histogram features corresponding to the gray-scale features, where the gradient direction histogram features represent the position of the target area in the image.
S2: sequentially take the target area corresponding to each frame image as input and the gray-scale features of the target area as output, and train the KCF classifier to obtain the KCF classifier parameters corresponding to each frame image.
It should be noted that, in this embodiment, since the KCF classifier is trained on the pixels of each frame image, each frame image corresponds to one set of KCF classifier parameters.
Optionally, the training process of the KCF classifier is as follows:
First, the training samples of the classifier are constructed by cyclic shifts, so that the data matrix becomes a circulant matrix. Let the tracking window of the initial frame, i.e. the base sample, be x = [x_1, x_2, …, x_n]. The base sample x is cyclically shifted using the permutation matrix P to form the circulant matrix X = [x, Px, …, P^(n-1)x]^T.
A classifier f(x) is learned from the training samples, where the samples are represented by HOG features. The training process is shown in formula (1):
min_w Σ_i (f(x_i) - y_i)^2 + λ||w||^2    (1)
where λ is the regularization parameter that prevents overfitting, w is the classifier parameter, and y_i is the corresponding classification label.
In the KCF tracker, f(x) is a nonlinear classifier whose solution w can be expressed as a linear combination of the samples mapped into a high-dimensional space, as shown in equation (2):
w = Σ_i α_i φ(x_i)    (2)
where α_i is the coefficient of the corresponding training sample x_i and φ(·) is the mapping function that maps x into the high-dimensional space. The correlation between x and x' after mapping into the high-dimensional space can be represented by a Gaussian kernel function κ:
κ(x, x') = exp(-||x - x'||^2 / σ^2)
the regularized least square based kernel can be used to obtain a closed-form solution of ridge regression, and the calculation process of α is shown in formula (3):
α=(K+λI)-1Y (3)
where K is the kernel matrix and the vector Y ═ Y0,y1,…,yn-1]TAnd a vector.
Further, the discrete Fourier transform expression of α can be obtained as shown in equation (4):
F(α) = F(y) / (F(k^xx) + λ)    (4)
where F(·) denotes the discrete Fourier transform, k^xx is the first row of the kernel matrix K, and the division is element-wise; in this way solving for w is converted into solving for α.
Thus, for a single test sample z, the response of the classifier, i.e. the magnitude of its probability of being the target, can be expressed as shown in equation (5):
f(z) = Σ_i α_i κ(z, x_i)    (5)
After the nonlinear classifier is trained, it is used to locate the target, and the circulant-matrix structure is also applied in detection to accelerate the whole process.
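For illustration only, the Fourier-domain training step above can be sketched in Python/NumPy as follows; the function names, the kernel bandwidth sigma and the regularization weight lam are assumptions of this sketch rather than values taken from the patent, and the Gaussian kernel correlation over all cyclic shifts follows the form used in equation (4).

import numpy as np

def gaussian_kernel_correlation(x1, x2, sigma=0.5):
    # Gaussian kernel correlation of all cyclic shifts of x1 and x2,
    # evaluated efficiently in the Fourier domain.
    cross = np.fft.ifft2(np.fft.fft2(x1) * np.conj(np.fft.fft2(x2)))
    dist = np.sum(x1 ** 2) + np.sum(x2 ** 2) - 2.0 * np.real(cross)
    return np.exp(-np.maximum(dist, 0.0) / (sigma ** 2 * x1.size))

def train_kcf(x, y, sigma=0.5, lam=1e-4):
    # Ridge regression in the Fourier domain, cf. equation (4):
    # F(alpha) = F(y) / (F(k^xx) + lam).
    # x is the base sample (target patch), y the Gaussian-shaped label map.
    k_xx = gaussian_kernel_correlation(x, x, sigma)
    alpha_hat = np.fft.fft2(y) / (np.fft.fft2(k_xx) + lam)
    return alpha_hat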
S3: store the gray-scale features, the gradient direction histogram features and the KCF classifier parameters corresponding to each frame image in a multi-channel memory model in one-to-one correspondence. The multi-channel memory model comprises a control channel, a first execution channel and a second execution channel, each of which is divided into a transient memory space, a short-term memory space and a long-term memory space, as shown in FIG. 2.
The transient memory space of the control channel is used to store the gray-scale feature q_t of the target area in the current frame image; the short-term memory space and the long-term memory space are both used to store the gray-scale features of the target area in previous frame images, and in the initial state the transient memory space of the control channel is empty.
The transient memory space of the first execution channel is used to store the KCF classifier parameters used when acquiring the gray-scale feature q_t of the current frame image; the short-term memory space and the long-term memory space are both used to store the KCF classifier parameters obtained by training on the gray-scale features of previous frame images, and in the initial state the transient memory space of the first execution channel is empty.
The transient memory space of the second execution channel is used to store the Histogram of Oriented Gradients (HOG) feature of the current frame image, i.e. the target appearance model; the short-term memory space and the long-term memory space are both used to store the gradient direction histogram features of previous frame images, and in the initial state the transient memory space of the second execution channel is empty.
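As a minimal sketch (not part of the patent text), the three channels and their three memory spaces can be represented with the data structure below; the class names and the capacities of the short-term and long-term memory spaces are assumptions chosen for illustration.

from collections import deque

class MemoryChannel:
    # One channel of the multi-channel memory model: a transient slot plus
    # bounded short-term and long-term memory spaces (capacities assumed).
    def __init__(self, short_capacity=5, long_capacity=10):
        self.transient = None                         # empty in the initial state
        self.short = deque(maxlen=short_capacity)     # short-term memory space
        self.long = deque(maxlen=long_capacity)       # long-term memory space

class MultiChannelMemory:
    # Control channel C stores gray-scale features, execution channel R-1 stores
    # KCF classifier parameters, and execution channel R-2 stores gradient
    # direction histogram (HOG) features, in one-to-one correspondence.
    def __init__(self):
        self.control = MemoryChannel()   # channel C
        self.exec1 = MemoryChannel()     # channel R-1
        self.exec2 = MemoryChannel()     # channel R-2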
S4: acquire a new frame of image as the current frame image, and classify the current frame image using the KCF classifier parameters stored at the first position of the short-term memory space of the first execution channel to obtain the gray-scale feature q_t of the current frame image; then store the gray-scale feature q_t in the transient memory space of the control channel, store the KCF classifier parameters at the first position of the short-term memory space of the first execution channel in the transient memory space of the first execution channel, and store the gradient direction histogram feature corresponding to the gray-scale feature q_t in the transient memory space of the second execution channel.
Optionally, the calculation process for classifying the current frame image by using the trained KCF classifier parameters is as follows:
when a current frame image z is input, z is circularly shifted to form a circular matrix, namely zi=Piz. Let fiIs ziResponse of (2), xt-1For the updated target template of the previous frame, the response of the classifier obtained according to equation (5) is shown in equation (6):
Figure BDA0002080911410000101
definition KzMatrix: kz=κ(Piz,Pixt-1) The matrix is a circulant matrix. Further, a Fourier transform expression of f (z) is obtained
Figure BDA0002080911410000102
Figure BDA0002080911410000103
Wherein the content of the first and second substances,
Figure BDA0002080911410000104
is a fourier transform of a and is,
Figure BDA0002080911410000105
is KzThe first row of (2).
Will be provided with
Figure BDA0002080911410000111
Transform back to time domain to obtain f (z). The element value in the vector f (z) is the probability that each candidate region in the current frame input image z becomes the tracking target, and the region corresponding to the maximum value is regarded as the position of the target.
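Continuing the earlier sketch (again an illustration, not the patent's reference implementation), the detection step of equation (6) can be written as below; gaussian_kernel_correlation is the assumed helper defined above, and the peak of the response map gives the target position.

def detect_kcf(alpha_hat, x_prev, z, sigma=0.5):
    # Response over all cyclic shifts of the search patch z, cf. equation (6):
    # F(f(z)) = F(k^(x_prev z)) element-wise times F(alpha), then inverse FFT.
    k_xz = gaussian_kernel_correlation(x_prev, z, sigma)
    response = np.real(np.fft.ifft2(np.fft.fft2(k_xz) * alpha_hat))
    # The shift with the maximum response is taken as the target position.
    dy, dx = np.unravel_index(np.argmax(response), response.shape)
    return response, (dy, dx)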
S5: sequentially compute, in the storage order of the gray-scale features in the short-term memory space of the control channel, the similarity ρ_d between the gray-scale feature q_t in the transient memory space and each gray-scale feature in the short-term memory space, until a similarity ρ_d is greater than a first set threshold T_d; the gray-scale feature q_t and that gray-scale feature q_td are then successfully matched, and the gradient direction histogram feature corresponding to the gray-scale feature q_td is obtained, thereby locating the target in the current frame image. If the gray-scale feature q_t fails to match all gray-scale features in the short-term memory space, the process proceeds to step S6.
If the gray-scale feature q_t and the gray-scale feature q_td are successfully matched, the gray-scale feature q_td is moved to and stored at the first position of the short-term memory space and the storage positions of the other gray-scale features in the short-term memory space are moved back in turn; while the gray-scale features in the control channel are moved, the storage positions of the KCF classifier parameters in the first execution channel and of the gradient direction histogram features in the second execution channel are moved in the same way.
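The patent does not specify how the similarity ρ_d (or ρ_c in step S6) is computed; a common choice, shown below purely as an assumption and continuing the earlier NumPy sketch, is the normalized cross-correlation (cosine similarity) between the two gray-scale feature vectors.

def gray_similarity(q_t, q_mem):
    # Assumed similarity measure between two gray-scale features:
    # normalized cross-correlation of the flattened feature vectors.
    a = np.asarray(q_t, dtype=float).ravel()
    b = np.asarray(q_mem, dtype=float).ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))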
Further, if the gray-scale feature q_t and the gray-scale feature q_td are successfully matched, the gray-scale feature stored at the first position of the short-term memory space in the control channel is updated according to a first set rule, the KCF classifier parameters stored at the first position of the short-term memory space in the first execution channel are updated according to a second set rule, and the gradient direction histogram feature stored at the first position of the short-term memory space in the second execution channel is updated according to a third set rule; the updated gray-scale feature, KCF classifier parameters and gradient direction histogram feature are used for target positioning in the next frame image.
The first set rule is:
q_th = (1 - ε) q_tb + ε q_ta
where q_th is the updated gray-scale feature, q_ta is the gray-scale feature successfully matched with the gray-scale feature q_t, q_tb is the gray-scale feature stored at the first position of the short-term memory space in the control channel, and ε is the set gray-scale feature update rate.
The second set rule is:
α_th = (1 - β) α_tb + β α_ta
where α_th is the updated KCF classifier parameter, α_tb is the KCF classifier parameter corresponding to the gray-scale feature successfully matched with the gray-scale feature q_t, α_ta is the KCF classifier parameter stored at the first position of the short-term memory space in the first execution channel, and β is the set KCF classifier parameter update rate.
The third set rule is:
x_th = (1 - β) x_tb + β x_ta
where x_th is the updated gradient direction histogram feature, x_tb is the gradient direction histogram feature corresponding to the gray-scale feature successfully matched with the gray-scale feature q_t, and x_ta is the gradient direction histogram feature stored at the first position of the short-term memory space in the second execution channel.
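The three set rules can be applied together as in the sketch below; the data structure is the MultiChannelMemory assumed earlier, the short-term memory spaces are assumed to be already populated, and the update rates eps and beta are illustrative values, not values prescribed by the patent.

def apply_set_rules(mem, q_matched, alpha_matched, hog_matched, eps=0.2, beta=0.2):
    # First set rule : blend the head of the control channel's short-term memory
    #                  (q_tb) with the matched gray-scale feature (q_ta).
    mem.control.short[0] = (1 - eps) * mem.control.short[0] + eps * q_matched
    # Second set rule: blend the parameter corresponding to the matched feature
    #                  (alpha_tb) with the head of R-1's short-term memory (alpha_ta).
    mem.exec1.short[0] = (1 - beta) * alpha_matched + beta * mem.exec1.short[0]
    # Third set rule : the same blending for the gradient direction histogram feature.
    mem.exec2.short[0] = (1 - beta) * hog_matched + beta * mem.exec2.short[0]
    return mem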
Thus, the multi-channel memory model updates the contents of the memory spaces in the two execution channels mainly according to how well the gray-scale features stored in the memory spaces of the control channel match the gray-scale feature of the current frame image, and the update rule of each channel differs. Specifically, in the control channel, the gray-scale feature obtained from the tracking result of the current frame is stored in the transient memory space and matched in turn against the gray-scale features in the short-term and long-term memory spaces. If the match succeeds, the gray-scale features in the memory spaces of the control channel are updated according to the control channel's update rule, and the contents of the memory spaces of the first execution channel R-1 and the second execution channel R-2 are updated according to their respective update rules; if the match fails, the gray-scale feature in the transient memory space of the control channel C is memorized into the short-term memory space, and the two execution channels R-1 and R-2 perform the same operation.
S6: sequentially compute, in the storage order of the gray-scale features in the long-term memory space of the control channel, the similarity ρ_c between the gray-scale feature q_t in the transient memory space of the control channel and each gray-scale feature in the long-term memory space, until a similarity ρ_c is greater than a second set threshold T_c; the gray-scale feature q_t and that gray-scale feature q_tc are then successfully matched, and the gradient direction histogram feature corresponding to the gray-scale feature q_tc is obtained, thereby locating the target in the current frame image. If the gray-scale feature q_t fails to match all gray-scale features in the long-term memory space, the position, in the current frame image, of the target area represented by the gradient direction histogram feature corresponding to the gray-scale feature q_t is used to locate the target.
If the gray-scale feature q_t and the gray-scale feature q_tc are successfully matched, the gray-scale feature q_tc is moved to and stored at the first position of the short-term memory space, the storage positions of the gray-scale features arranged before the gray-scale feature q_tc in the long-term memory space are moved back in turn, and the storage positions of the other gray-scale features remain unchanged; while the gray-scale features in the control channel are moved, the storage positions of the KCF classifier parameters in the first execution channel and of the gradient direction histogram features in the second execution channel are moved in the same way, and the KCF classifier parameters corresponding to the gray-scale feature q_tc are used to acquire the gray-scale feature of the next frame image.
Further, in step S6, after the target area represented by the gradient direction histogram feature corresponding to the gray-scale feature q_t has been located in the current frame image, the gray-scale feature q_t is stored at the first position of the short-term memory space, and it is judged whether the number of successful matches of the gray-scale feature q_N at the last position of the short-term memory space is greater than a set threshold T_M; if not, the gray-scale feature q_N is discarded; if so, the gray-scale feature q_N is stored at the first position of the long-term memory space, the gray-scale feature at the last position of the long-term memory space is discarded, and the storage positions of the other gray-scale features in the short-term and long-term memory spaces are moved back in turn. When the gray-scale features in the control channel are discarded and moved, the KCF classifier parameters in the first execution channel and the gradient direction histogram features in the second execution channel are discarded and moved in the same way.
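When q_t matches nothing in either memory space, the bookkeeping described above can be sketched as follows; the match-count bookkeeping and the default threshold T_M are assumptions of this illustration, built on the MultiChannelMemory structure assumed earlier.

def store_unmatched(mem, q_t, alpha_t, hog_t, evicted_match_count, T_M=3):
    # q_t matched nothing, so it (and the corresponding classifier parameters and
    # gradient direction histogram feature) enters the first position of the
    # short-term memory space of every channel. The item pushed out of the last
    # short-term position is promoted to the head of the long-term memory space
    # only if its past number of successful matches exceeds T_M; otherwise it is
    # discarded. A full long-term deque silently drops its last item on promotion.
    promote = evicted_match_count > T_M
    for channel, item in ((mem.control, q_t), (mem.exec1, alpha_t), (mem.exec2, hog_t)):
        if len(channel.short) == channel.short.maxlen:
            evicted = channel.short.pop()          # last position of short-term memory
            if promote:
                channel.long.appendleft(evicted)
        channel.short.appendleft(item)             # new item at the first position
    return mem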
The present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof, and it will be understood by those skilled in the art that various changes and modifications may be made herein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (2)

1. A kernel correlation filtering target positioning method based on a multi-channel memory model, characterized by comprising the following steps:
S1: acquiring, for at least five frame images preceding the current frame, the target area, the gray-scale features of the target area and the gradient direction histogram features corresponding to the gray-scale features, wherein the gradient direction histogram features are used to represent the position of the target area in the image;
S2: sequentially taking the target area corresponding to each frame image as input and the gray-scale features of the target area as output, and training a KCF classifier to obtain the KCF classifier parameters corresponding to each frame image;
S3: storing the gray-scale features, the gradient direction histogram features and the KCF classifier parameters corresponding to each frame image in a multi-channel memory model in one-to-one correspondence, wherein the multi-channel memory model comprises a control channel, a first execution channel and a second execution channel, each of which is divided into a transient memory space, a short-term memory space and a long-term memory space;
wherein the transient memory space of the control channel is used to store the gray-scale feature q_t of the target area in the current frame image, the short-term memory space and the long-term memory space are both used to store the gray-scale features of the target area in previous frame images, and in the initial state the transient memory space of the control channel is empty;
the transient memory space of the first execution channel is used to store the KCF classifier parameters used when acquiring the gray-scale feature q_t of the current frame image, the short-term memory space and the long-term memory space are both used to store the KCF classifier parameters obtained by training on the gray-scale features of previous frame images, and in the initial state the transient memory space of the first execution channel is empty;
the transient memory space of the second execution channel is used to store the gradient direction histogram feature of the current frame image, the short-term memory space and the long-term memory space are both used to store the gradient direction histogram features of previous frame images, and in the initial state the transient memory space of the second execution channel is empty;
S4: acquiring a new frame of image as the current frame image, and classifying the current frame image using the KCF classifier parameters stored at the first position of the short-term memory space of the first execution channel to obtain the gray-scale feature q_t of the current frame image; then storing the gray-scale feature q_t in the transient memory space of the control channel, storing the KCF classifier parameters at the first position of the short-term memory space of the first execution channel in the transient memory space of the first execution channel, and storing the gradient direction histogram feature corresponding to the gray-scale feature q_t in the transient memory space of the second execution channel;
S5: sequentially computing, in the storage order of the gray-scale features in the short-term memory space of the control channel, the similarity ρ_d between the gray-scale feature q_t in the transient memory space and each gray-scale feature in the short-term memory space, until a similarity ρ_d is greater than a first set threshold T_d, wherein the first gray-scale feature in the short-term memory space whose similarity ρ_d with the gray-scale feature q_t is greater than the first set threshold T_d is denoted as the gray-scale feature q_td; the gray-scale feature q_t and the gray-scale feature q_td are then successfully matched, and the gradient direction histogram feature corresponding to the gray-scale feature q_td is obtained, thereby locating the target in the current frame image; if the gray-scale feature q_t fails to match all gray-scale features in the short-term memory space, proceeding to step S6;
wherein, if the gray-scale feature q_t and the gray-scale feature q_td are successfully matched, the gray-scale feature q_td is moved to and stored at the first position of the short-term memory space and the storage positions of the other gray-scale features in the short-term memory space are moved back in turn; while the gray-scale features in the control channel are moved, the storage positions of the KCF classifier parameters in the first execution channel and of the gradient direction histogram features in the second execution channel are moved in the same way;
S6: sequentially computing, in the storage order of the gray-scale features in the long-term memory space of the control channel, the similarity ρ_c between the gray-scale feature q_t in the transient memory space and each gray-scale feature in the long-term memory space, until a similarity ρ_c is greater than a second set threshold T_c, wherein the first gray-scale feature in the long-term memory space whose similarity ρ_c with the gray-scale feature q_t is greater than the second set threshold T_c is denoted as the gray-scale feature q_tc; the gray-scale feature q_t and the gray-scale feature q_tc are then successfully matched, and the gradient direction histogram feature corresponding to the gray-scale feature q_tc is obtained, thereby locating the target in the current frame image; if the gray-scale feature q_t fails to match all gray-scale features in the long-term memory space, the position, in the current frame image, of the target area represented by the gradient direction histogram feature corresponding to the gray-scale feature q_t is used to locate the target;
wherein, if the gray-scale feature q_t and the gray-scale feature q_tc are successfully matched, the gray-scale feature q_tc is moved to and stored at the first position of the short-term memory space, the storage positions of the gray-scale features arranged before the gray-scale feature q_tc in the long-term memory space are moved back in turn, and the storage positions of the other gray-scale features remain unchanged; while the gray-scale features in the control channel are moved, the storage positions of the KCF classifier parameters in the first execution channel and of the gradient direction histogram features in the second execution channel are moved in the same way, and the KCF classifier parameters corresponding to the gray-scale feature q_tc are used to acquire the gray-scale feature of the next frame image;
in step S6, after the target area represented by the gradient direction histogram feature corresponding to the gray-scale feature q_t has been located in the current frame image, the gray-scale feature q_t is stored at the first position of the short-term memory space, and it is judged whether the number of successful matches of the gray-scale feature q_N at the last position of the short-term memory space is greater than a set threshold T_M; if not, the gray-scale feature q_N is discarded; if so, the gray-scale feature q_N is stored at the first position of the long-term memory space, the gray-scale feature at the last position of the long-term memory space is discarded, and the storage positions of the other gray-scale features in the short-term and long-term memory spaces are moved back in turn; when the gray-scale features in the control channel are discarded and moved, the KCF classifier parameters in the first execution channel and the gradient direction histogram features in the second execution channel are discarded and moved in the same way.
2. The kernel correlation filtering target positioning method based on a multi-channel memory model as claimed in claim 1, characterized in that, if the gray-scale feature q_t and the gray-scale feature q_td are successfully matched, the gray-scale feature stored at the first position of the short-term memory space in the control channel is updated according to a first set rule, the KCF classifier parameters stored at the first position of the short-term memory space in the first execution channel are updated according to a second set rule, and the gradient direction histogram feature stored at the first position of the short-term memory space in the second execution channel is updated according to a third set rule; the updated gray-scale feature, KCF classifier parameters and gradient direction histogram feature are used for target positioning in the next frame image;
the first set rule is:
q_th = (1 - ε) q_tb + ε q_ta
where q_th is the updated gray-scale feature, q_ta is the gray-scale feature successfully matched with the gray-scale feature q_t, q_tb is the gray-scale feature stored at the first position of the short-term memory space in the control channel, and ε is the set gray-scale feature update rate;
the second set rule is:
α_th = (1 - β) α_tb + β α_ta
where α_th is the updated KCF classifier parameter, α_tb is the KCF classifier parameter corresponding to the gray-scale feature successfully matched with the gray-scale feature q_t, α_ta is the KCF classifier parameter stored at the first position of the short-term memory space in the first execution channel, and β is the set KCF classifier parameter update rate;
the third set rule is:
x_th = (1 - β) x_tb + β x_ta
where x_th is the updated gradient direction histogram feature, x_tb is the gradient direction histogram feature corresponding to the gray-scale feature successfully matched with the gray-scale feature q_t, and x_ta is the gradient direction histogram feature stored at the first position of the short-term memory space in the second execution channel.
CN201910471284.5A 2019-05-31 2019-05-31 Kernel correlation filtering target positioning method based on multi-channel memory model Active CN110276383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910471284.5A CN110276383B (en) Kernel correlation filtering target positioning method based on multi-channel memory model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910471284.5A CN110276383B (en) Kernel correlation filtering target positioning method based on multi-channel memory model

Publications (2)

Publication Number Publication Date
CN110276383A CN110276383A (en) 2019-09-24
CN110276383B (en) 2021-05-14

Family

ID=67961252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910471284.5A Active CN110276383B (en) Kernel correlation filtering target positioning method based on multi-channel memory model

Country Status (1)

Country Link
CN (1) CN110276383B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115115992B (en) * 2022-07-26 2022-11-15 中国科学院长春光学精密机械与物理研究所 Multi-platform photoelectric auto-disturbance rejection tracking system and method based on brain map control right decision

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9613273B2 (en) * 2015-05-19 2017-04-04 Toyota Motor Engineering & Manufacturing North America, Inc. Apparatus and method for object tracking
CN106683110A (en) * 2015-11-09 2017-05-17 展讯通信(天津)有限公司 User terminal and object tracking method and device thereof
CN107368802B (en) * 2017-07-14 2021-06-01 北京理工大学 Moving target tracking method based on KCF and human brain memory mechanism
CN109753846A (en) * 2017-11-03 2019-05-14 北京深鉴智能科技有限公司 Target following system for implementing hardware and method
CN108288062B (en) * 2017-12-29 2022-03-01 中国电子科技集团公司第二十七研究所 Target tracking method based on kernel correlation filtering
CN109785366B (en) * 2019-01-21 2020-12-25 中国科学技术大学 Related filtering target tracking method for shielding

Also Published As

Publication number Publication date
CN110276383A (en) 2019-09-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant