CN108846851B - Moving target tracking method and terminal equipment - Google Patents

Info

Publication number
CN108846851B
CN108846851B
Authority
CN
China
Prior art keywords
target
size
determining
response value
image block
Prior art date
Legal status
Active
Application number
CN201810381273.3A
Other languages
Chinese (zh)
Other versions
CN108846851A (en)
Inventor
韩提文
王丽佳
张莉
张惠荣
梁海军
Current Assignee
Hebei College of Industry and Technology
Original Assignee
Hebei College of Industry and Technology
Priority date
Filing date
Publication date
Application filed by Hebei College of Industry and Technology
Priority to CN201810381273.3A
Publication of CN108846851A
Application granted
Publication of CN108846851B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/223 - Analysis of motion using block-matching
    • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20048 - Transform domain processing
    • G06T 2207/20056 - Discrete and fast Fourier transform [DFT, FFT]
    • G06T 2207/20081 - Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention is applicable to the technical field of computer vision and pattern recognition, and provides a moving target tracking method and terminal device. The method determines a target search image block of the current frame according to the position and size of the target to be tracked in the previous frame; determines a first response value of the target search image block according to a state transition filter; determines a first final response value from the first response value and, from it, the tracking position of the target to be tracked; extracts multi-size training samples of the target to be tracked at the tracking position; determines a second response value of the multi-size training samples according to a size filter; and determines a second final response value from the second response value and, from it, the tracking size of the target to be tracked. This scheme overcomes pose changes, illumination changes and occlusion during tracking, and improves the accuracy of target tracking when the target's appearance changes due to illumination, complex background, occlusion and the like.

Description

Moving target tracking method and terminal equipment
Technical Field
The invention belongs to the technical field of computer vision and pattern recognition, and particularly relates to a moving target tracking method and terminal equipment.
Background
Target tracking means tracking a target of interest in each frame of a video sequence; it has important application value in fields such as human-computer interaction, video surveillance and vehicle navigation.
In recent years, kernel functions have been used to convert inner-product computations in a high-dimensional feature space into function evaluations in a low-dimensional space, which simplifies the computational complexity of the algorithm. Introducing kernel theory into correlation filtering yields the kernelized correlation filter algorithm, which improves the operating efficiency of the algorithm to a certain extent and enables efficient target tracking.
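For readers unfamiliar with kernelized correlation filtering, the sketch below illustrates the core trick this paragraph refers to: a Gaussian kernel evaluated against every cyclic shift of a patch at once through the FFT. It is a generic single-channel illustration, not code from the patent; the function name and the normalization are assumptions.

```python
import numpy as np

def gaussian_correlation(x, z, sigma):
    """Gaussian kernel k^{xz} between x and every cyclic shift of z, via the FFT.

    x, z : 2-D arrays of the same shape (single-channel feature patches).
    sigma: kernel width.
    """
    # <x, shifted z> for all shifts at once (convolution theorem), O(N log N).
    xz = np.real(np.fft.ifft2(np.fft.fft2(x) * np.conj(np.fft.fft2(z))))
    # ||x - z_shift||^2 = ||x||^2 + ||z||^2 - 2 <x, z_shift>, clipped for safety.
    dist2 = np.maximum((x ** 2).sum() + (z ** 2).sum() - 2.0 * xz, 0.0) / x.size
    return np.exp(-dist2 / (sigma ** 2))
```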
Disclosure of Invention
In view of this, embodiments of the present invention provide a moving target tracking method and a terminal device, so as to solve the problems in the prior art that the target is easily affected by illumination changes, is easily lost when partial or full occlusion occurs, and is tracked poorly when moving fast.
A first aspect of an embodiment of the present invention provides a moving target tracking method, including:
determining a target search image block of a current frame according to the position and the size of a previous frame of the current frame of a target to be tracked;
determining a first response value of the target search image block according to a state transition filter;
determining a first final response value according to the first response value, and determining the tracking position of the target to be tracked according to the first final response value;
extracting a target multi-size training sample of the target to be tracked at the tracking position;
determining a second response value of the target multi-sized training sample according to a size filter;
and determining a second final response value according to the second response value, and determining the tracking size of the target to be tracked according to the second final response value.
As a further technical solution, the method further comprises:
and determining the movement speed of the target to be tracked according to the position of the target to be tracked of the preset frame number in front of the current frame, and then changing the size of the target search image block according to the movement speed.
As a further technical solution, the method further comprises:
according to the expression [equation image BDA0001641067880000021], determining the initial rectangular frame size [equation image BDA0001641067880000022] of the current frame of the target to be tracked, wherein x_0, y_0 respectively represent the abscissa and ordinate of the initial rectangular frame, and w_0, h_0 respectively represent the width and height of the initial rectangular frame;
according to the expression [equation image BDA0001641067880000023], determining the position and size S_patch of the image block of the training sample, wherein padding represents a preset adjustment parameter used when extracting the position and size of the image block of the training sample, and [equation image BDA0001641067880000024] represents the initial rectangular frame size of the current frame of the target to be tracked;
according to the expression x = F(GetFhog(p_Spatch)), extracting the fHOG features of the training sample and performing a fast Fourier transform on the extracted fHOG features, wherein p_Spatch represents the image block of the training sample, S_patch represents the position and size of that image block, GetFhog(·) represents extracting the fHOG features of the image block of the training sample, and F(·) represents the fast Fourier transform;
according to the expression [equation image BDA0001641067880000025], obtaining a Gaussian-shaped regression target from the initial target, wherein rs = [1, …, w/2], cs = [1, …, h/2], w and h respectively denote the width and height of the training-sample image block p_Spatch, and the width parameter of the function is given by [equation image BDA0001641067880000031];
according to the expression [equation image BDA0001641067880000032], training a state transition filter a, wherein [equation image BDA0001641067880000033] denotes the autocorrelation of x, F^{-1}(·) represents the inverse fast Fourier transform, λ represents a tuning parameter, and σ_k represents a preset width parameter.
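A minimal sketch of this training step, assuming the standard kernelized-correlation-filter closed form behind the equation images above (a Gaussian regression target y peaked at the patch centre, a Gaussian autocorrelation kernel k^{xx}, and a ridge parameter λ). The plain 2-D array stands in for the fHOG features, and all names are illustrative; the function takes spatial-domain features and performs the transforms internally.

```python
import numpy as np

def gaussian_target(h, w, sigma):
    # Gaussian-shaped regression target y, peaked at the centre of the h x w patch.
    rs = np.arange(h) - h // 2
    cs = np.arange(w) - w // 2
    C, R = np.meshgrid(cs, rs)
    return np.exp(-(R ** 2 + C ** 2) / (2.0 * sigma ** 2))

def train_state_filter(feat, y, sigma_k, lam):
    """Return the filter a in the Fourier domain: a = F(y) / (F(k^{xx}) + lambda)."""
    xf = np.fft.fft2(feat)
    # Gaussian autocorrelation kernel k^{xx} over all cyclic shifts of feat.
    xx = np.real(np.fft.ifft2(xf * np.conj(xf)))
    dist2 = np.maximum(2.0 * (feat ** 2).sum() - 2.0 * xx, 0.0) / feat.size
    kxx = np.exp(-dist2 / (sigma_k ** 2))
    return np.fft.fft2(y) / (np.fft.fft2(kxx) + lam)
```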
As a further technical solution, the method further comprises:
according to the expression x_s = α^n·w_0 × α^n·h_0, extracting multi-size training samples at the initial rectangular frame, wherein α represents a size factor, [equation image BDA0001641067880000034] represents the size level of the training samples, S represents the number of multi-size training samples, h_0 represents the height of the initial rectangular frame, and w_0 represents the width of the initial rectangular frame;
according to the expression [equation image BDA0001641067880000035], obtaining a Gaussian-shaped regression target size y_s from the initial target sizes of the multi-size training samples, wherein σ_s represents a preset width parameter;
according to the expression [equation image BDA0001641067880000036], determining the autocorrelation of the initial target sizes of the multi-size training samples, wherein [equation image BDA0001641067880000037] denotes the autocorrelation of x_s, [equation image BDA0001641067880000038] denotes the complex conjugate of x_s, σ_sk represents a preset width parameter, and F^{-1}(·) denotes the inverse fast Fourier transform;
according to the expression [equation image BDA0001641067880000039], training the size filter a_s, wherein F(·) represents the fast Fourier transform, y_s represents the Gaussian-shaped regression target size, x_s represents the multi-size training samples, and λ_s represents a tuning parameter.
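The scale side can be sketched in the same spirit. Because the patent's exact kernelized expressions sit behind the equation images, the sketch below substitutes a DSST-style linear scale filter over S size levels, with the sample sizes built from α^n·w_0 × α^n·h_0; the feature matrix, the 1-D Gaussian target y_s and all names are assumptions.

```python
import numpy as np

def scale_levels(S):
    # n = -(S-1)/2, ..., (S-1)/2, so the middle level keeps the current size.
    return np.arange(S) - (S - 1) / 2.0

def scale_sizes(w0, h0, alpha, S):
    # Sizes alpha^n * w0 x alpha^n * h0 of the multi-size training samples.
    return [(alpha ** n * w0, alpha ** n * h0) for n in scale_levels(S)]

def train_size_filter(scale_feats, sigma_s, lam_s):
    """scale_feats: (S, D) array, one feature row per scaled sample.

    Returns the numerator and denominator of a correlation filter over the scale axis."""
    S = scale_feats.shape[0]
    ys = np.exp(-(scale_levels(S) ** 2) / (2.0 * sigma_s ** 2))  # 1-D Gaussian target
    F = np.fft.fft(scale_feats, axis=0)                          # FFT along the scale axis
    num = np.fft.fft(ys)[:, None] * np.conj(F)
    den = (F * np.conj(F)).real.sum(axis=1) + lam_s
    return num, den
```

Detection would then evaluate np.fft.ifft((num * Z).sum(axis=1) / den) for the scale-axis FFT Z of the test sample's features and keep the size level with the largest real response, as sketched further below.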
As a further technical solution, the method further comprises:
determining a response value range according to a maximum response value of the state transition filter for the tracking result of the current frame of the target to be tracked and a preset threshold value;
determining different learning rates according to the response value range;
and updating the state transition filter and the size filter according to the learning rate respectively.
As a further technical solution, the updating the state transition filter and the size filter according to the learning rate respectively includes:
according to the expression [equation image BDA0001641067880000041], updating the state transition filter a and updating the fHOG feature of the target image block, wherein λ_l represents the learning rate, a_i represents the state transition filter of the current frame, x represents the fast Fourier transform of the fHOG feature of the target image block, and x_i represents the fast Fourier transform of the fHOG feature of the target image block tracked in the current frame;
according to the expression [equation image BDA0001641067880000042], updating the size filter a_s and the size factor α, wherein λ_l represents the learning rate, a_si represents the size filter of the current frame, and α_i represents the size factor of the current frame.
As a further technical solution, the determining a target search image block of a current frame according to a position and a size of a previous frame of the current frame of a target to be tracked includes:
according to the expression Search_patch = Sl_{i-1} × (1 + padding), determining the position and size Search_patch of the target search image block, wherein Sl_{i-1} represents the position and size of the target to be tracked in the frame preceding the current frame, i.e. in frame i-1, and padding represents an adjustment parameter;
according to the expression z = F(GetFhog(P_Sp)), extracting the fHOG features of the target search image block at that position and size and performing a fast Fourier transform on them, wherein P_Sp represents the target search image block, Search_patch represents the position and size of the target search image block, GetFhog(·) represents extracting the fHOG features of the target search image block, and F(·) represents the fast Fourier transform.
A second aspect of an embodiment of the present invention provides a moving object tracking apparatus, including:
the target searching image block determining module is used for determining a target searching image block of a current frame according to the position and the size of a previous frame of the current frame of the target to be tracked;
the first response value determining module is used for determining a first response value of the target search image block according to the state transition filter;
a tracking position determining module, configured to determine a first final response value according to the first response value, and then determine a tracking position of the target to be tracked according to the first final response value;
a target multi-size training sample extraction module, configured to extract a target multi-size training sample of the target to be tracked at the tracking position;
a second response value determining module, configured to determine a second response value of the target multi-size training sample according to a size filter;
and the tracking size determining module is used for determining a second final response value according to the second response value and then determining the tracking size of the target to be tracked according to the second final response value.
A third aspect of the embodiments of the present invention provides a moving object tracking terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the method according to the first aspect.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method as described in the first aspect above.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: with this scheme, the tracking position of the target to be tracked is determined by the state transition filter, and the tracking size is then determined at that position by the size filter, which overcomes pose changes, illumination changes and occlusion during tracking and improves the accuracy of target tracking when the target's appearance changes due to illumination, complex background, occlusion and the like.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a flowchart illustrating steps of a method for tracking a moving object according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a moving object tracking apparatus according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a moving object tracking apparatus according to another embodiment of the present invention;
fig. 4 is a schematic diagram of a moving object tracking terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
As shown in fig. 1, a flowchart of steps of a moving object tracking method provided in an embodiment of the present invention includes the following steps:
step S101, determining a target search image block of a current frame according to the position and the size of a previous frame of the current frame of a target to be tracked.
Specifically, the position and size of the target change with a certain continuity as it moves from frame to frame, so the target's position in the previous frame is close to its position in the current frame; that is, the target in the current frame appears in the neighborhood of its position in the previous frame. The state of the target in the current frame is therefore determined from the position and size in the previous frame. The size of the target search image block directly influences the tracking result: a smaller search image block suits a slowly moving target and reduces the computation time of the algorithm, but a fast-moving target can easily move out of a small search image block and cause tracking failure, in which case the search image block must be enlarged appropriately to keep the tracking accurate.
And step S102, determining a first response value of the target search image block according to a state transition filter.
Specifically, a state transition filter is trained, and the first response value of the target search image block is then determined with the trained filter, preferably using the expression z_response = F^{-1}(a·F(k^{xz})) to compute the first response value of the state transition filter over the target search image block, wherein x represents the fast Fourier transform value of the target template, z represents the fast Fourier transform value of the target search image block, k^{xz} represents the Gaussian correlation of x and z, a represents the trained state transition filter, and F^{-1}(·) represents the inverse Fourier transform.
Step S103, determining a first final response value according to the first response value, and then determining the tracking position of the target to be tracked according to the first final response value.
Specifically, after the plurality of first response values of the target search image block are obtained, the first final response value is determined from them, preferably according to the expression z_i, namely as the largest of the plurality of first response values over the target search image block; the position of this first final response value is the tracking position of the target to be tracked.
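A sketch of steps S102 and S103 following the inline expression z_response = F^{-1}(a·F(k^{xz})): compute the response map of the trained filter over the search patch and take its peak as the new position. The template and search patch are spatial-domain single-channel feature patches here, and the wrapping of displacements is an assumption about how the circular response map is read out.

```python
import numpy as np

def detect_position(a_hat, x_templ, z_patch, sigma_k):
    """Return (max response, (dx, dy) displacement from the patch centre)."""
    # Gaussian cross-correlation kernel k^{xz} over all cyclic shifts.
    xz = np.real(np.fft.ifft2(np.fft.fft2(x_templ) * np.conj(np.fft.fft2(z_patch))))
    dist2 = np.maximum((x_templ ** 2).sum() + (z_patch ** 2).sum() - 2.0 * xz, 0.0)
    kxz = np.exp(-dist2 / x_templ.size / (sigma_k ** 2))
    # z_response = F^{-1}(a . F(k^{xz})); its peak gives the tracking position.
    response = np.real(np.fft.ifft2(a_hat * np.fft.fft2(kxz)))
    dy, dx = np.unravel_index(int(np.argmax(response)), response.shape)
    if dy > response.shape[0] // 2:   # wrap displacements of the circular response
        dy -= response.shape[0]
    if dx > response.shape[1] // 2:
        dx -= response.shape[1]
    return float(response.max()), (dx, dy)
```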
And step S104, extracting a target multi-size training sample of the target to be tracked at the tracking position.
And step S105, determining a second response value of the target multi-size training sample according to a size filter.
Specifically, after the plurality of second response values of the target multi-size training samples are obtained, the second final response value is determined from them, preferably according to the expression s_response = F^{-1}(a_s·F(k^{x_s z_s})), wherein x_s represents the fast Fourier transform value of the target size template, z_s represents the fast Fourier transform value of the target search image block, k^{x_s z_s} represents the Gaussian correlation of x_s and z_s, a_s represents the size filter obtained by training, and F^{-1}(·) represents the inverse Fourier transform. The largest second response value over the target multi-size training samples is taken as the second final response value, and its position gives the tracking size of the target to be tracked.
And S106, determining a second final response value according to the second response value, and determining the tracking size of the target to be tracked according to the second final response value.
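Continuing the DSST-style assumption from the size-filter sketch above, steps S104 to S106 can be read as: extract one feature row per size level around the new position, evaluate the scale filter, and keep the size whose response is largest. num and den are the outputs of train_size_filter in that sketch; everything else is illustrative.

```python
import numpy as np

def detect_size(num, den, scale_feats_z, w, h, alpha):
    """scale_feats_z: (S, D) features of the multi-size samples at the tracked position."""
    Z = np.fft.fft(scale_feats_z, axis=0)
    response = np.real(np.fft.ifft((num * Z).sum(axis=1) / den))
    best = int(np.argmax(response))
    n = best - (scale_feats_z.shape[0] - 1) / 2.0     # signed size level
    return float(response[best]), (w * alpha ** n, h * alpha ** n)
```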
With this scheme, the tracking position of the target to be tracked is determined by the state transition filter, and the tracking size is then determined at that position by the size filter, which overcomes pose changes, illumination changes and occlusion during tracking and improves the accuracy of target tracking when the target's appearance changes due to illumination, complex background, occlusion and the like.
In addition, in a specific example, the method further comprises:
and determining the movement speed of the target to be tracked according to the position of the target to be tracked of the preset frame number in front of the current frame, and then changing the size of the target search image block according to the movement speed.
Specifically, the movement speed of the target to be tracked is determined according to its positions in a preset number of frames before the current frame, and the size of the target search image block is then changed accordingly. According to the expression [equation image BDA0001641067880000081], the movement speed v of the target to be tracked is determined, wherein L_{i-j} represents the position of the target to be tracked in the j-th frame before the current frame, i represents the current frame, M represents the preset number of previous frames, and j = [1, 2, …, M] indexes the j-th frame before the current frame. Preferably, the preset number of previous frames is 4; repeated test analysis shows that the movement speed determined from the positions of the target in the 4 frames before the current frame is accurate and can be used to choose a suitable size for the target search image block, the positions being substituted into the expression [equation image BDA0001641067880000082] to determine the movement speed v.
According to the expression [equation image BDA0001641067880000083], the adjustment parameter padding of the target search image block is determined, wherein v represents the movement speed determined above, and th_l and th_h are two preset thresholds that divide the movement into low-speed, medium-speed and high-speed motion.
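A sketch of this speed-adaptive search area under stated assumptions: the speed v is taken as the mean per-frame displacement over the last M tracked positions (M = 4 in the preferred example), and padding is chosen from three bands split by th_l and th_h. The concrete padding values 1, 2 and 3 are placeholders; the patent's own values sit behind the equation images.

```python
import numpy as np

def motion_speed(positions):
    """positions: [(x, y), ...] of the target over the last few frames, oldest first."""
    p = np.asarray(positions, dtype=np.float64)
    return float(np.linalg.norm(np.diff(p, axis=0), axis=1).mean())

def adaptive_padding(v, th_l, th_h):
    if v < th_l:
        return 1.0    # low speed: small search area, less computation
    if v < th_h:
        return 2.0    # medium speed
    return 3.0        # high speed: enlarge the search area to keep the target inside
```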
In addition, in a specific example, the method further comprises:
according to the expression [equation image BDA0001641067880000091], determining the initial rectangular frame size [equation image BDA0001641067880000092] of the current frame of the target to be tracked, wherein x_0, y_0 respectively represent the abscissa and ordinate of the initial rectangular frame, and w_0, h_0 respectively represent its width and height.
According to the expression [equation image BDA0001641067880000093], determining the position and size S_patch of the image block of the training sample, wherein padding represents a preset adjustment parameter used when extracting the position and size of the image block of the training sample, and [equation image BDA0001641067880000094] represents the initial rectangular frame size of the current frame of the target to be tracked; preferably, the padding value in the current frame is set to the fixed value 2, which was obtained through repeated tests and yields training samples that satisfy practical requirements well.
According to the expression x = F(GetFhog(p_Spatch)), extracting the fHOG features of the training sample and performing a fast Fourier transform on the extracted fHOG features, wherein p_Spatch represents the image block of the training sample, S_patch represents the position and size of that image block, GetFhog(·) represents extracting the fHOG features of the image block of the training sample, and F(·) represents the fast Fourier transform; the image block p_Spatch of the training sample can be obtained directly from the position and size S_patch.
According to the expression [equation image BDA0001641067880000095], obtaining a Gaussian-shaped regression target from the initial target, wherein rs = [1, …, w/2], cs = [1, …, h/2], w and h respectively denote the width and height of the training-sample image block, and the width parameter of the function is given by [equation image BDA0001641067880000096].
According to the expression [equation image BDA0001641067880000097], training a state transition filter a, wherein [equation image BDA0001641067880000098] denotes the autocorrelation of x, F^{-1}(·) represents the inverse fast Fourier transform, λ represents a tuning parameter, and σ_k represents a preset width parameter.
In addition, in a specific example, the method further comprises:
according to the expression x_s = α^n·w_0 × α^n·h_0, extracting multi-size training samples at the initial rectangular frame, wherein α represents a size factor, [equation image BDA0001641067880000101] represents the size level of the training samples, S represents the number of multi-size training samples, h_0 represents the height of the initial rectangular frame, and w_0 represents its width.
According to the expression [equation image BDA0001641067880000102], obtaining a Gaussian-shaped regression target size y_s from the initial target sizes, wherein σ_s represents a preset width parameter.
According to the expression [equation image BDA0001641067880000103], determining the autocorrelation of the initial target sizes of the multi-size training samples, wherein [equation image BDA0001641067880000104] denotes the autocorrelation of x_s, [equation image BDA0001641067880000105] denotes the complex conjugate of x_s, σ_sk represents a preset width parameter, and F^{-1}(·) denotes the inverse fast Fourier transform.
According to the expression [equation image BDA0001641067880000106], training the size filter a_s, wherein F(·) represents the fast Fourier transform, y_s represents the Gaussian-shaped regression target size, x_s represents the multi-size training samples, and λ_s represents a tuning parameter.
In addition, in a specific example, the method further comprises:
and determining a response value range according to the maximum response value of the state transition filter of the tracking result of the next frame of the initial frame in the target to be tracked and a preset threshold value.
Different learning rates are determined according to the response value range.
And updating the state transition filter and the size filter according to the learning rate respectively.
Specifically, two thresholds H_1 and H_2 are given and, combined with the maximum response value SC_0 of the state transition filter for the tracking result of the current frame of the target to be tracked, determine a response value range (SC_0 - H_1, SC_0 + H_2). If the first final response value of the current-frame state transition filter or the second final response value of the current-frame size filter lies within this range, the learning rate is λ_l = c; otherwise the learning rate is λ_l = 1 - c. Preferably, the learning rate λ_l is 0.25 when the first final response value of the current-frame state transition filter or the second final response value of the current-frame size filter lies within the given range, and 0.75 otherwise, as shown by the following formula:
λ_l = 0.25 if SC_0 - H_1 < sc < SC_0 + H_2, and λ_l = 0.75 otherwise,
where sc is the first final response value of the current-frame state transition filter or the second final response value of the current-frame size filter, λ_l represents the learning rate, and H_1 and H_2 represent the two thresholds.
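This rule is simple enough to write down directly; a minimal sketch, assuming SC_0 and the thresholds are fixed once from the reference frame:

```python
def learning_rate(sc, sc0, h1, h2, c=0.25):
    """sc: peak response of the current frame; (sc0 - h1, sc0 + h2): trusted range."""
    return c if (sc0 - h1) < sc < (sc0 + h2) else 1.0 - c
```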
Further, in a specific example, the updating the state transition filter and the size filter according to the learning rate, respectively, includes:
according to the expression [equation image BDA0001641067880000112], updating the state transition filter a and updating the fHOG feature of the target image block, wherein λ_l represents the learning rate, a_i represents the state transition filter of the current frame, x represents the fast Fourier transform of the fHOG feature of the target image block, and x_i represents the fast Fourier transform of the fHOG feature of the target image block tracked in the current frame.
According to the expression [equation image BDA0001641067880000113], updating the size filter a_s and the size factor α, wherein λ_l represents the learning rate, a_si represents the size filter of the current frame, and α_i represents the size factor of the current frame.
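The exact update expressions are in the equation images above; a common reading, assumed here rather than taken from the patent, is a linear interpolation of each stored model term with its current-frame estimate using the learning rate λ_l:

```python
def interpolate(model, current, lam_l):
    # Blend the stored model with the current-frame estimate at rate lam_l.
    return (1.0 - lam_l) * model + lam_l * current

# a   = interpolate(a, a_i, lam_l);    x     = interpolate(x, x_i, lam_l)
# a_s = interpolate(a_s, a_si, lam_l); alpha = interpolate(alpha, alpha_i, lam_l)
```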
In addition, in a specific example, the determining a target search image block of the current frame according to the position and size of the frame preceding the current frame of the target to be tracked includes:
according to the expression Search_patch = Sl_{i-1} × (1 + padding), determining the position and size Search_patch of the target search image block, wherein Sl_{i-1} represents the position and size of the target to be tracked in the frame preceding the current frame, i.e. in frame i-1, and padding represents an adjustment parameter.
According to the expression z = F(GetFhog(P_Sp)), extracting the fHOG features of the target search image block at that position and size and performing a fast Fourier transform on them, wherein P_Sp represents the target search image block, Search_patch represents the position and size of the target search image block, GetFhog(·) represents extracting the fHOG features of the target search image block, and F(·) represents the fast Fourier transform.
With this scheme, the state transition filter and the size filter are trained for tracking the target's position and size. In the state transition filter, the target's movement speed is calculated from the relation of the target positions in several consecutive frames, and the size of the target search image block is then determined, which improves the accuracy of tracking targets moving at different speeds. During filter updating, a response value range is determined from the maximum response value of the state transition filter of the target to be tracked in the frame following the initial frame and two given thresholds; the maximum response value of the state transition filter in the current frame is then compared with this range to determine the learning rate of the current frame, and the state transition filter and the size filter are updated accordingly, which overcomes pose changes, illumination changes and occlusion during tracking. An update of the size factor is added in the size-filter update process to accommodate target tracking at different rates of size change.
As shown in fig. 2, a moving object tracking apparatus provided in an embodiment of the present invention includes:
a target search image block determining module 201, configured to determine a target search image block of a current frame according to a position and a size of a previous frame of the current frame of a target to be tracked;
a first response value determining module 202, configured to determine a first response value of the target search image block according to a state transition filter;
a tracking position determining module 203, configured to determine a first final response value according to the first response value, and then determine a tracking position of the target to be tracked according to the first final response value;
a target multi-size training sample extracting module 204, configured to extract a target multi-size training sample of the target to be tracked at the tracking position;
a second response value determining module 205, configured to determine a second response value of the target multi-size training sample according to a size filter;
and a tracking size determining module 206, configured to determine a second final response value according to the second response value, and then determine a tracking size of the target to be tracked according to the second final response value.
With this scheme, the tracking position of the target to be tracked is determined by the state transition filter, and the tracking size is then determined at that position by the size filter, which overcomes pose changes, illumination changes and occlusion during tracking and improves the accuracy of target tracking when the target's appearance changes due to illumination, complex background, occlusion and the like.
As shown in fig. 3, a moving object tracking apparatus provided in an embodiment of the present invention includes:
a target search image block determining module 301, configured to determine a target search image block of a current frame according to a position and a size of a previous frame of the current frame of a target to be tracked;
a first response value determining module 302, configured to determine a first response value of the target search image block according to a state transition filter;
a tracking position determining module 303, configured to determine a first final response value according to the first response value, and then determine a tracking position of the target to be tracked according to the first final response value;
a target multi-size training sample extracting module 304, configured to extract a target multi-size training sample of the target to be tracked at the tracking position;
a second response value determining module 305, configured to determine a second response value of the target multi-size training sample according to a size filter;
and a tracking size determining module 306, configured to determine a second final response value according to the second response value, and then determine a tracking size of the target to be tracked according to the second final response value.
Further, in one particular example, the apparatus further comprises:
and a target search image block changing module 307, configured to determine a motion speed of the target to be tracked according to the position of the target to be tracked of the preset frame number before the current frame, and change the size of the target search image block according to the motion speed.
Further, in one particular example, the apparatus further comprises:
an initial rectangular box size determination module 308 for determining the size of the rectangular box according to an expression
Figure BDA0001641067880000131
Determining an initial rectangular frame size of a current frame of a target to be tracked
Figure BDA0001641067880000132
Wherein x0,y0Respectively representing the abscissa and ordinate, w, of the initial rectangular frame0,h0Respectively representing the width and height of the initial rectangular frame;
a training sample extraction module 309 for extracting a training sample according to an expression
Figure BDA0001641067880000133
Determining the position and size S of image blocks of a training samplepatchWherein padding represents parameters of adjustment preset when the position and size of the image block of the training sample are extracted,
Figure BDA0001641067880000141
representing the size of an initial rectangular frame of a current frame of a target to be tracked;
a feature transformation module 310 for transforming the feature according to the expression x ═ F (GetFhog (p))Spatch) Extracting the fHOG features of the training samples and aligning the extracted fHThe OG features are fast Fourier transformed, where pSpatchImage blocks representing training samples, SpatchRepresenting the position and size of an image block of the training sample, GetFhog (DEG) representing the fHOG characteristic of the image block for extracting the training sample, and F (DEG) representing fast Fourier transform;
regression target determination module 311 for determining a regression target based on an expression
Figure BDA0001641067880000142
Obtaining a regression target of Gaussian type from the initial target, wherein rs ═ 1, …, w/2],cs=[1,…,h/2]W denotes the image block p of the training sampleSpatchH denotes the image block p of the training sampleSpatchRepresents the width parameter of the function,
Figure BDA0001641067880000143
a state transition filter training module 312 for training the filter according to the expression
Figure BDA0001641067880000144
Training a state transition filter a, wherein
Figure BDA0001641067880000145
Denotes the autocorrelation of x, F-1(. -) represents an inverse fast Fourier transform, λ represents a tuning parameter, σkRepresenting a preset width parameter.
Further, in one particular example, the apparatus further comprises:
a multi-size training sample extraction module 313, configured to determine, according to the expression x_s = α^n·w_0 × α^n·h_0, the multi-size training samples extracted at the initial rectangular frame, wherein α represents a size factor, [equation image BDA0001641067880000146] represents the size level of the training samples, S represents the number of multi-size training samples, h_0 represents the height of the initial rectangular frame, and w_0 represents its width;
a regression target size determination module 314, configured to obtain, according to the expression [equation image BDA0001641067880000147], a Gaussian-shaped regression target size y_s from the initial target sizes of the multi-size training samples, wherein σ_s represents a preset width parameter;
a target size autocorrelation determination module 315, configured to determine, according to the expression [equation image BDA0001641067880000151], the autocorrelation of the initial target sizes of the multi-size training samples, wherein [equation image BDA0001641067880000152] denotes the autocorrelation of x_s, [equation image BDA0001641067880000153] denotes the complex conjugate of x_s, σ_sk represents a preset width parameter, and F^{-1}(·) denotes the inverse fast Fourier transform;
a size filter training module 316, configured to train, according to the expression [equation image BDA0001641067880000154], the size filter a_s, wherein F(·) represents the fast Fourier transform, y_s represents the Gaussian-shaped regression target size, x_s represents the multi-size training samples, and λ_s represents a tuning parameter.
Further, in one particular example, the apparatus further comprises:
a response value range determining module 317, configured to determine a response value range according to a maximum response value of a state transition filter of a tracking result of a current frame in the target to be tracked and a preset threshold;
a learning rate determination module 318 for determining different learning rates according to the response value range;
a filter updating module 319, configured to update the state transition filter and the size filter according to a learning rate, respectively.
In addition, in a specific example, the filter updating module 319 is further configured to:
according to the expression [equation image BDA0001641067880000155], update the state transition filter a and update the fHOG feature of the target image block, wherein λ_l represents the learning rate, a_i represents the state transition filter of the current frame, x represents the fast Fourier transform of the fHOG feature of the target image block, and x_i represents the fast Fourier transform of the fHOG feature of the target image block tracked in the current frame;
according to the expression [equation image BDA0001641067880000156], update the size filter a_s and the size factor α, wherein λ_l represents the learning rate, a_si represents the size filter of the current frame, and α_i represents the size factor of the current frame.
In addition, in a specific example, the target search image block determining module 301 is further configured to:
according to the expression Search_patch = Sl_{i-1} × (1 + padding), determine the position and size Search_patch of the target search image block, wherein Sl_{i-1} represents the position and size of the target to be tracked in the frame preceding the current frame, i.e. in frame i-1, and padding represents an adjustment parameter;
according to the expression z = F(GetFhog(P_Sp)), extract the fHOG features of the target search image block at that position and size and perform a fast Fourier transform on them, wherein P_Sp represents the target search image block, Search_patch represents the position and size of the target search image block, GetFhog(·) represents extracting the fHOG features of the target search image block, and F(·) represents the fast Fourier transform.
With this scheme, the state transition filter and the size filter are trained for tracking the target's position and size. In the state transition filter, the target's movement speed is calculated from the relation of the target positions in several consecutive frames, and the size of the target search image block is then determined, which improves the accuracy of tracking targets moving at different speeds. During filter updating, a response value range is determined from the maximum response value of the state transition filter of the target to be tracked in the frame following the initial frame and two given thresholds; the maximum response value of the state transition filter in the current frame is then compared with this range to determine the learning rate of the current frame, and the state transition filter and the size filter are updated accordingly, which overcomes pose changes, illumination changes and occlusion during tracking. An update of the size factor is added in the size-filter update process to accommodate target tracking at different rates of size change.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
As shown in fig. 4, a schematic diagram of a moving object tracking terminal device according to an embodiment of the present invention is provided, where the moving object tracking terminal device 4 according to the embodiment includes: a processor 40, a memory 41 and a computer program 42, such as a moving object tracking program, stored in said memory 41 and executable on said processor 40. The processor 40, when executing the computer program 42, implements the steps in the various embodiments of the moving object tracking method described above, such as the steps 101 to 106 shown in fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functions of each module/unit in the above-mentioned device embodiments, for example, the functions of the modules 201 to 206 shown in fig. 2.
Illustratively, the computer program 42 may be partitioned into one or more modules/units that are stored in the memory 41 and executed by the processor 40 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 42 in the moving object tracking terminal device 4. For example, the computer program 42 may be divided into a synchronization module, a summary module, an acquisition module, and a return module (a module in a virtual device), and each module has the following specific functions:
and determining a target search image block of the current frame according to the position and the size of the previous frame of the current frame of the target to be tracked.
And determining a first response value of the target search image block according to a state transition filter.
And determining a first final response value according to the first response value, and determining the tracking position of the target to be tracked according to the first final response value.
Extracting a target multi-size training sample of the target to be tracked at the tracking position.
Determining a second response value for the target multi-sized training sample according to a size filter.
And determining a second final response value according to the second response value, and determining the tracking size of the target to be tracked according to the second final response value.
And determining the movement speed of the target to be tracked according to the position of the target to be tracked of the preset frame number in front of the current frame, and then changing the size of the target search image block according to the movement speed.
According to the expression [equation image BDA0001641067880000171], determine the initial rectangular frame size [equation image BDA0001641067880000172] of the current frame of the target to be tracked, wherein x_0, y_0 respectively represent the abscissa and ordinate of the initial rectangular frame, and w_0, h_0 respectively represent its width and height.
According to the expression [equation image BDA0001641067880000173], determine the position and size S_patch of the image block of the training sample, wherein padding represents a preset adjustment parameter used when extracting the position and size of the image block of the training sample, and [equation image BDA0001641067880000181] represents the initial rectangular frame size of the current frame of the target to be tracked.
According to the expression x = F(GetFhog(p_Spatch)), extract the fHOG features of the training sample and perform a fast Fourier transform on the extracted fHOG features, wherein p_Spatch represents the image block of the training sample, S_patch represents the position and size of that image block, GetFhog(·) represents extracting the fHOG features of the image block of the training sample, and F(·) represents the fast Fourier transform.
According to the expression [equation image BDA0001641067880000182], obtain a Gaussian-shaped regression target from the initial target, wherein rs = [1, …, w/2], cs = [1, …, h/2], w and h respectively denote the width and height of the training-sample image block p_Spatch, and the width parameter of the function is given by [equation image BDA0001641067880000183].
According to the expression [equation image BDA0001641067880000184], train a state transition filter a, wherein [equation image BDA0001641067880000185] denotes the autocorrelation of x, F^{-1}(·) represents the inverse fast Fourier transform, λ represents a tuning parameter, and σ_k represents a preset width parameter.
According to the expression x_s = α^n·w_0 × α^n·h_0, determine the multi-size training samples extracted at the initial rectangular frame, wherein α represents a size factor, [equation image BDA0001641067880000186] represents the size level of the training samples, S represents the number of multi-size training samples, h_0 represents the height of the initial rectangular frame, and w_0 represents its width.
According to the expression [equation image BDA0001641067880000187], obtain a Gaussian-shaped regression target size y_s from the initial target sizes of the multi-size training samples, wherein σ_s represents a preset width parameter.
According to the expression [equation image BDA0001641067880000188], determine the autocorrelation of the initial target sizes of the multi-size training samples, wherein [equation image BDA00016410678800001811] denotes the autocorrelation of x_s, [equation image BDA00016410678800001810] denotes the complex conjugate of x_s, σ_sk represents a preset width parameter, and F^{-1}(·) denotes the inverse fast Fourier transform.
According to the expression [equation image BDA0001641067880000191], train the size filter a_s, wherein F(·) represents the fast Fourier transform, y_s represents the Gaussian-shaped regression target size, x_s represents the multi-size training samples, and λ_s represents a tuning parameter.
And determining a response value range according to the maximum response value of the state transition filter for the tracking result of the current frame of the target to be tracked and a preset threshold value.
Different learning rates are determined according to the response value range.
And updating the state transition filter and the size filter according to the learning rate respectively.
According to the expression [equation image BDA0001641067880000192], update the state transition filter a and update the fHOG feature of the target image block, wherein λ_l represents the learning rate, a_i represents the state transition filter of the current frame, x represents the fast Fourier transform of the fHOG feature of the target image block, and x_i represents the fast Fourier transform of the fHOG feature of the target image block tracked in the current frame.
According to the expression [equation image BDA0001641067880000193], update the size filter a_s and the size factor α, wherein λ_l represents the learning rate, a_si represents the size filter of the current frame, and α_i represents the size factor of the current frame.
According to the expression Search_patch = Sl_{i-1} × (1 + padding), determine the position and size Search_patch of the target search image block, wherein Sl_{i-1} represents the position and size of the target to be tracked in the frame preceding the current frame, i.e. in frame i-1, and padding represents an adjustment parameter.
According to the expression z = F(GetFhog(P_Sp)), extract the fHOG features of the target search image block at that position and size and perform a fast Fourier transform on them, wherein P_Sp represents the target search image block, Search_patch represents the position and size of the target search image block, GetFhog(·) represents extracting the fHOG features of the target search image block, and F(·) represents the fast Fourier transform.
The moving object tracking terminal device 4 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The moving object tracking terminal device may include, but is not limited to, a processor 40 and a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of the moving object tracking terminal device 4, and does not constitute a limitation of the moving object tracking terminal device 4, and may include more or less components than those shown, or combine some components, or different components, for example, the moving object tracking terminal device may further include an input-output device, a network access device, a bus, etc.
The Processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the moving object tracking terminal device 4, such as a hard disk or a memory of the moving object tracking terminal device 4. The memory 41 may also be an external storage device of the moving object tracking terminal device 4, such as a plug-in hard disk, a Smart Memory Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the moving object tracking terminal device 4. Further, the memory 41 may also include both an internal storage unit and an external storage device of the moving object tracking terminal device 4. The memory 41 is used for storing the computer program and other programs and data required by the moving object tracking terminal device. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the above method embodiments can be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (9)

1. A moving object tracking method, comprising:
determining a target search image block of a current frame according to the position and the size of a previous frame of the current frame of a target to be tracked;
determining a first response value of the target search image block according to a state transition filter;
determining a first final response value according to the first response value, and determining the tracking position of the target to be tracked according to the first final response value;
extracting a target multi-size training sample of the target to be tracked at the tracking position;
determining a second response value of the target multi-sized training sample according to a size filter;
determining a second final response value according to the second response value, and determining the tracking size of the target to be tracked according to the second final response value;
the moving object tracking method further includes:
according to the expression [formula image] determining an initial rectangular frame size [formula image] of the current frame of the target to be tracked, wherein x0, y0 respectively represent the abscissa and ordinate of the initial rectangular frame, and w0, h0 respectively represent the width and height of the initial rectangular frame;
according to the expression [formula image] determining the position and size Spatch of the image block of the training sample, wherein padding represents a preset adjustment parameter used when extracting the position and size of the image block of the training sample, and [formula image] represents the size of the initial rectangular frame of the current frame of the target to be tracked;
according to the expression x = F(GetFhog(pSpatch)), extracting the fHOG feature of the training sample and performing a fast Fourier transform on the extracted fHOG feature, wherein pSpatch represents the image block of the training sample, Spatch represents the position and size of the image block of the training sample, GetFhog(·) represents extracting the fHOG feature of the image block of the training sample, and F(·) represents the fast Fourier transform;
according to the expression [formula image] obtaining a Gaussian regression target from the initial target, wherein rs = [1, …, w/2], cs = [1, …, h/2], w denotes the width of the training sample image block pSpatch, h denotes the height of the training sample image block pSpatch, σ represents the width parameter of the function, [formula image], cell = 4, and σ' = 0.1;
according to the expression [formula image] training a state transition filter a, wherein [formula image] represents the autocorrelation of x, [formula image] denotes the complex conjugate of x, F⁻¹(·) represents the inverse fast Fourier transform, λ represents a tuning parameter, and σk represents a preset width parameter.
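As a reading aid only, the following is a minimal Python/NumPy sketch of the kind of correlation-filter training this claim describes, assuming a KCF-style Gaussian kernel and a single random feature map standing in for the fHOG features; the function names, the kernel form and every parameter value are illustrative assumptions, not the patented expressions.

```python
import numpy as np

def gaussian_regression_target(w, h, sigma):
    """Gaussian-shaped regression label centred on the target (illustrative)."""
    rs, cs = np.meshgrid(np.arange(w) - w // 2, np.arange(h) - h // 2)
    return np.exp(-(rs ** 2 + cs ** 2) / (2.0 * sigma ** 2))

def gaussian_kernel_autocorrelation(x, sigma_k):
    """Gaussian kernel autocorrelation k^xx computed via the FFT (KCF-style assumption)."""
    xf = np.fft.fft2(x)
    cross = np.real(np.fft.ifft2(xf * np.conj(xf)))          # circular autocorrelation of x
    dist = np.maximum(2.0 * np.sum(x ** 2) - 2.0 * cross, 0.0)
    return np.exp(-dist / (sigma_k ** 2 * x.size))

def train_state_filter(x, y, sigma_k=0.5, lam=1e-4):
    """Ridge regression solved in the Fourier domain: a_hat = F(y) / (F(k^xx) + lambda)."""
    k = gaussian_kernel_autocorrelation(x, sigma_k)
    return np.fft.fft2(y) / (np.fft.fft2(k) + lam)

# Dummy usage: a 64x64 single-channel map stands in for the fHOG feature x.
x = np.random.randn(64, 64)
y = gaussian_regression_target(64, 64, sigma=np.sqrt(64 * 64) / 4 * 0.1)
a_hat = train_state_filter(x, y)
```

In a full tracker, the same trained filter would then be evaluated on the search image block of the next frame to obtain the first response value referred to in the claim.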
2. The moving object tracking method according to claim 1, further comprising:
determining the movement speed of the target to be tracked according to the positions of the target to be tracked in a preset number of frames before the current frame, and then changing the size of the target search image block according to the movement speed.
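A small heuristic sketch of this idea follows, assuming the speed is estimated from the centre positions of the last few frames; `adapt_search_size` and the `gain` coefficient are hypothetical names and values, not taken from the patent.

```python
import numpy as np

def adapt_search_size(recent_centers, base_size, gain=0.5):
    """Grow the search window with the target's recent speed (illustrative heuristic).

    recent_centers: list of (x, y) centres from the last few frames.
    base_size:      (width, height) of the default search image block.
    gain:           assumed scaling coefficient, not from the patent.
    """
    centers = np.asarray(recent_centers, dtype=float)
    if len(centers) < 2:
        return base_size
    speed = np.mean(np.linalg.norm(np.diff(centers, axis=0), axis=1))  # pixels per frame
    scale = 1.0 + gain * speed / max(base_size)
    return (base_size[0] * scale, base_size[1] * scale)

print(adapt_search_size([(10, 10), (14, 13), (19, 17)], (80, 60)))
```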
3. The moving object tracking method according to claim 1, further comprising:
according to the expression xs = α^n·w0 × α^n·h0, determining the multi-size training samples extracted at the initial rectangular frame, wherein α represents a size factor, [formula image] represents the training sample size level, S represents the number of multi-size training samples, h0 represents the height of the initial rectangular frame, and w0 represents the width of the initial rectangular frame;
according to the expression [formula image] obtaining a Gaussian regression target size ys from the initial target sizes of the multi-size training samples, wherein σs represents a preset width parameter;
according to the expression [formula image] determining the autocorrelation of the initial target sizes of the multi-size training samples, wherein [formula image] denotes the autocorrelation of xs, [formula image] denotes the complex conjugate of xs, σsk represents a preset width parameter, and F⁻¹(·) represents the inverse fast Fourier transform;
according to the expression [formula image] training a size filter as, wherein F(·) represents the fast Fourier transform, ys represents the Gaussian regression target size, xs denotes a size training sample, and λs represents a tuning parameter.
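The sketch below illustrates one way such a multi-size (scale) filter could be set up, loosely following a DSST-style one-dimensional scale filter; the symmetric size levels, the feature dimensionality and the numerator/denominator storage are assumptions for illustration, not the claimed formulas.

```python
import numpy as np

def scale_sample_sizes(w0, h0, alpha=1.02, num_scales=33):
    """Widths and heights of the multi-size samples around the initial box."""
    n = np.arange(num_scales) - (num_scales - 1) / 2.0   # assumed symmetric size levels
    factors = alpha ** n
    return factors, w0 * factors, h0 * factors

def train_size_filter(xs, sigma_s=1.0, lam_s=1e-2):
    """1-D correlation filter over the scale dimension (DSST-style assumption)."""
    num_scales, feat_dim = xs.shape
    n = np.arange(num_scales) - (num_scales - 1) / 2.0
    ys = np.exp(-(n ** 2) / (2.0 * sigma_s ** 2))         # Gaussian regression target over sizes
    xsf = np.fft.fft(xs, axis=0)
    ysf = np.fft.fft(ys)
    num = ysf[:, None] * np.conj(xsf)                     # numerator, one column per feature
    den = np.sum(xsf * np.conj(xsf), axis=1) + lam_s      # feature energy plus regularizer
    return num, den

# Usage: 33 size levels, each described by a 16-dimensional dummy feature vector.
factors, widths, heights = scale_sample_sizes(40, 80)
a_s_num, a_s_den = train_size_filter(np.random.randn(33, 16))
```

At test time, the filter would be correlated with the features of the multi-size samples and the peak of the resulting response over sizes would give the tracking size, in the spirit of the second final response value in claim 1.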
4. The moving object tracking method according to claim 1, further comprising:
determining a response value range according to the maximum response value of the state transition filter for the tracking result of the target to be tracked in the current frame and a preset threshold value;
determining different learning rates according to the response value range;
and updating the state transition filter and the size filter according to the learning rate respectively.
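Below is a toy illustration of mapping the peak response to a learning rate, so that an occluded or poorly matched frame (low peak) updates the model slowly or not at all; the two thresholds and three rates are invented for the example and are not the preset values of the patent.

```python
def pick_learning_rate(max_response, thresholds=(0.2, 0.5), rates=(0.0, 0.01, 0.025)):
    """Map the peak filter response to a learning rate (assumed thresholds and rates)."""
    if max_response < thresholds[0]:
        return rates[0]          # unreliable frame: freeze the model
    if max_response < thresholds[1]:
        return rates[1]          # uncertain frame: update cautiously
    return rates[2]              # confident frame: update normally

print(pick_learning_rate(0.63))  # -> 0.025
```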
5. The moving object tracking method according to claim 4, wherein the updating the state transition filter and the size filter respectively according to learning rates includes:
according to the expression [formula image] updating the state transition filter a and updating the fHOG feature of the target image block, wherein λl represents the learning rate, ai represents the state transition filter of the current frame, x represents the fast Fourier transform of the fHOG feature of the target image block, and xi represents the fast Fourier transform of the fHOG feature of the target image block tracked in the current frame;
according to the expression [formula image] updating the size filter as and the size factor α, wherein λl represents the learning rate, asi represents the size filter of the current frame, and αi represents the size factor of the current frame.
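A common way to apply such a learning rate is a linear (exponential moving average) update of both filters, sketched below under that assumption; the array shapes and the rate value are placeholders.

```python
import numpy as np

def linear_update(old, new, lr):
    """Blend the previous model with the current frame's estimate: (1-lr)*old + lr*new."""
    return (1.0 - lr) * old + lr * new

lr = 0.025                                              # assumed rate chosen from the response range
a_hat = linear_update(np.ones((4, 4), complex),         # previous state transition filter (dummy)
                      np.zeros((4, 4), complex), lr)
a_s = linear_update(np.ones(33), np.zeros(33), lr)      # previous size filter (dummy)
```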
6. The method for tracking a moving object according to claim 1, wherein said determining the target search image block of the current frame according to the position and size of the previous frame of the current frame of the target to be tracked comprises:
according to the expression Searchpatch = Sli-1 × (1 + padding), determining the position and size Searchpatch of the target search image block, wherein Sli-1 represents the position and size of the target to be tracked in the frame preceding the current frame, that is, the position and size of the target to be tracked in frame i-1, and padding represents an adjustment parameter;
according to the expression z = F(GetFhog(PSp)), extracting the fHOG feature at the position and size of the target search image block and performing a fast Fourier transform on the fHOG feature, wherein PSp represents the target search image block, Searchpatch represents the position and size of the target search image block, GetFhog(·) represents extracting the fHOG feature of the target search image block, and F(·) represents the fast Fourier transform.
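For illustration, the sketch below crops a padded search region around the previous position and transforms a simple feature with the FFT; raw grayscale is used only as a stand-in for fHOG, and `crop_search_patch`, `padding=1.5` and the frame size are assumptions.

```python
import numpy as np

def crop_search_patch(frame, center, size, padding=1.5):
    """Crop the padded search region around the previous target position (illustrative)."""
    cx, cy = center
    w, h = int(size[0] * (1 + padding)), int(size[1] * (1 + padding))
    x0, y0 = max(int(cx - w / 2), 0), max(int(cy - h / 2), 0)
    return frame[y0:y0 + h, x0:x0 + w]

def search_features(patch):
    """Stand-in for GetFhog followed by the FFT; normalised grayscale replaces fHOG here."""
    z = patch.astype(float) / 255.0 - 0.5
    return np.fft.fft2(z)

# Usage on a dummy 480x640 grayscale frame.
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
z_hat = search_features(crop_search_patch(frame, center=(320, 240), size=(60, 120)))
```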
7. A moving object tracking apparatus, comprising:
the target searching image block determining module is used for determining a target searching image block of a current frame according to the position and the size of a previous frame of the current frame of the target to be tracked;
the first response value determining module is used for determining a first response value of the target search image block according to the state transition filter;
a tracking position determining module, configured to determine a first final response value according to the first response value, and then determine a tracking position of the target to be tracked according to the first final response value;
a target multi-size training sample extraction module, configured to extract a target multi-size training sample of the target to be tracked at the tracking position;
a second response value determining module, configured to determine a second response value of the target multi-size training sample according to a size filter;
a tracking size determining module, configured to determine a second final response value according to the second response value, and then determine a tracking size of the target to be tracked according to the second final response value;
the moving object tracking apparatus further includes:
an initial rectangular frame size determination module, configured to, according to the expression [formula image], determine an initial rectangular frame size [formula image] of the current frame of the target to be tracked, wherein x0, y0 respectively represent the abscissa and ordinate of the initial rectangular frame, and w0, h0 respectively represent the width and height of the initial rectangular frame;
a training sample extraction module, configured to, according to the expression [formula image], determine the position and size Spatch of the image block of the training sample, wherein padding represents a preset adjustment parameter used when extracting the position and size of the image block of the training sample, and [formula image] represents the size of the initial rectangular frame of the current frame of the target to be tracked;
a feature transformation module, configured to, according to the expression x = F(GetFhog(pSpatch)), extract the fHOG feature of the training sample and perform a fast Fourier transform on the extracted fHOG feature, wherein pSpatch represents the image block of the training sample, Spatch represents the position and size of the image block of the training sample, GetFhog(·) represents extracting the fHOG feature of the image block of the training sample, and F(·) represents the fast Fourier transform;
a regression target determination module, configured to, according to the expression [formula image], obtain a Gaussian regression target from the initial target, wherein rs = [1, …, w/2], cs = [1, …, h/2], w denotes the width of the training sample image block pSpatch, h denotes the height of the training sample image block pSpatch, σ represents the width parameter of the function, [formula image], cell = 4, and σ' = 0.1;
a state transition filter training module, configured to, according to the expression [formula image], train a state transition filter a, wherein [formula image] represents the autocorrelation of x, [formula image] denotes the complex conjugate of x, F⁻¹(·) represents the inverse fast Fourier transform, λ represents a tuning parameter, and σk represents a preset width parameter.
8. A moving object tracking terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN201810381273.3A 2018-04-25 2018-04-25 Moving target tracking method and terminal equipment Active CN108846851B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810381273.3A CN108846851B (en) 2018-04-25 2018-04-25 Moving target tracking method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810381273.3A CN108846851B (en) 2018-04-25 2018-04-25 Moving target tracking method and terminal equipment

Publications (2)

Publication Number Publication Date
CN108846851A CN108846851A (en) 2018-11-20
CN108846851B true CN108846851B (en) 2020-07-28

Family

ID=64212297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810381273.3A Active CN108846851B (en) 2018-04-25 2018-04-25 Moving target tracking method and terminal equipment

Country Status (1)

Country Link
CN (1) CN108846851B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109685832A (en) * 2018-12-26 2019-04-26 山东创科自动化科技有限公司 A kind of motion target tracking method, device and computer equipment
CN109993776B (en) * 2019-04-04 2020-12-11 杭州电子科技大学 Related filtering target tracking method and system based on multi-level template
CN111815670A (en) * 2019-04-10 2020-10-23 曜科智能科技(上海)有限公司 Multi-view target tracking method, device and system, electronic terminal and storage medium
CN111080675B (en) * 2019-12-20 2023-06-27 电子科技大学 Target tracking method based on space-time constraint correlation filtering
CN111723593B (en) * 2020-06-19 2024-05-10 中国科学院微电子研究所 Bar code positioning method and positioning device
CN111754548B (en) * 2020-06-29 2023-10-03 西安科技大学 Multi-scale correlation filtering target tracking method and device based on response discrimination
CN113822911B (en) * 2021-10-08 2022-09-16 中国人民解放军国防科技大学 Tracking method and device of columnar inclined target, computer equipment and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103366370A (en) * 2013-07-03 2013-10-23 深圳市智美达科技有限公司 Target tracking method and device in video monitoring
CN104574445A (en) * 2015-01-23 2015-04-29 北京航空航天大学 Target tracking method and device
CN106251364A (en) * 2016-07-19 2016-12-21 北京博瑞爱飞科技发展有限公司 Method for tracking target and device
CN106874854A (en) * 2017-01-19 2017-06-20 西安电子科技大学 Unmanned plane wireless vehicle tracking based on embedded platform
CN107016689A (en) * 2017-02-04 2017-08-04 中国人民解放军理工大学 A kind of correlation filtering of dimension self-adaption liquidates method for tracking target
CN106875426A (en) * 2017-02-21 2017-06-20 中国科学院自动化研究所 Visual tracking method and device based on correlated particle filtering
CN107316316A (en) * 2017-05-19 2017-11-03 南京理工大学 The method for tracking target that filtering technique is closed with nuclear phase is adaptively merged based on multiple features

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Qianru et al. Robust target tracking with adaptive fusion of multiple correlation filters. Journal of Image and Graphics, 2018, vol. 23, no. 2. *

Also Published As

Publication number Publication date
CN108846851A (en) 2018-11-20

Similar Documents

Publication Publication Date Title
CN108846851B (en) Moving target tracking method and terminal equipment
CN111815754B (en) Three-dimensional information determining method, three-dimensional information determining device and terminal equipment
WO2020119527A1 (en) Human action recognition method and apparatus, and terminal device and storage medium
CN111860398A (en) Remote sensing image target detection method and system and terminal equipment
CN112336342B (en) Hand key point detection method and device and terminal equipment
CN111860276B (en) Human body key point detection method, device, network equipment and storage medium
CN112634316B (en) Target tracking method, device, equipment and storage medium
US20190332858A1 (en) Method and device for identifying wrist, method for identifying gesture, electronic equipment and computer-readable storage medium
CN108960046A (en) A kind of training data method of sampling and its device, computer server
CN111754548A (en) Multi-scale correlation filtering target tracking method and device based on response discrimination
CN110378932B (en) Correlation filtering visual tracking method based on spatial regularization correction
CN112418089A (en) Gesture recognition method and device and terminal
CN109740109A (en) A kind of PolSAR image broad object decomposition method based on unitary transformation
CN111281355B (en) Method and equipment for determining pulse acquisition position
CN112926436A (en) Behavior recognition method and apparatus, electronic device, and storage medium
CN112861934A (en) Image classification method and device of embedded terminal and embedded terminal
CN110633630A (en) Behavior identification method and device and terminal equipment
CN108093153B (en) Target tracking method and device, electronic equipment and storage medium
CN115661198A (en) Target tracking method, device and medium based on single-stage target tracking model
CN116486151A (en) Image classification model training method, image classification method, device and storage medium
CN114511922A (en) Physical training posture recognition method, device, equipment and storage medium
WO2020237674A1 (en) Target tracking method and apparatus, and unmanned aerial vehicle
CN113688785A (en) Multi-supervision-based face recognition method and device, computer equipment and storage medium
CN112418098A (en) Training method of video structured model and related equipment
CN112084896B (en) Combined recursion weighted spatial filtering method based on Lp/q-mixed norm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant