CN112233143A - Target tracking method, device and computer readable storage medium - Google Patents

Target tracking method, device and computer readable storage medium

Info

Publication number
CN112233143A
CN112233143A (application CN202011468540.4A)
Authority
CN
China
Prior art keywords
target
tracked
frame image
final position
current frame
Prior art date
Legal status
Granted
Application number
CN202011468540.4A
Other languages
Chinese (zh)
Other versions
CN112233143B (en)
Inventor
张兴明
郑少飞
潘华东
殷俊
唐邦杰
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202011468540.4A
Publication of CN112233143A
Application granted
Publication of CN112233143B
Legal status: Active (current)
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/262 Analysis of motion using transform domain methods, e.g. Fourier domain methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a target tracking method, a target tracking device and a computer-readable storage medium. The method comprises the following steps: acquiring a current frame image; obtaining the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image; processing the predicted position by using a correlation filtering template to obtain the final position of the target to be tracked in the current frame image; updating parameters of the correlation filtering template based on the final position of the target to be tracked in the current frame image, the final position of the target to be tracked in the previous frame image, and the similarity between the two final positions; and taking the next frame image as the current frame image and performing the above steps again. In this way, the stability of the parameter update of the correlation filtering template can be enhanced.

Description

Target tracking method, device and computer readable storage medium
Technical Field
The present application relates to the field of target tracking technologies, and in particular, to a target tracking method and apparatus, and a computer-readable storage medium.
Background
Target tracking has long been a research hotspot in computer vision and is widely applied in tasks such as video structuring and human-computer interaction. The target tracking task is: given the position of a target O in an initial frame, determine the position of the target O in subsequent frames by means of a model. A typical correlation filtering-based target tracking procedure is as follows: a filtering template is computed from the position of the target to be tracked in the initial frame image; after the position of the target to be tracked in the next frame image is predicted, the filtering template is updated according to that position and is continuously used for target tracking in subsequent frames.
Disclosure of Invention
The present application mainly provides a target tracking method, a target tracking device and a computer-readable storage medium, which can solve the problem in the prior art that the update of correlation filtering template parameters is unstable.
In order to solve the above technical problem, a first aspect of the present application provides a target tracking method. The method comprises the following steps: acquiring a current frame image; obtaining the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image; processing the predicted position by using a correlation filtering template to obtain the final position of the target to be tracked in the current frame image; updating parameters of the correlation filtering template based on the final position of the target to be tracked in the current frame image, the final position of the target to be tracked in the previous frame image, and the similarity between the two final positions; and taking the next frame image as the current frame image, and re-executing the step of obtaining the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image and the subsequent steps.
In order to solve the above technical problem, a second aspect of the present application provides a target tracking apparatus, including a processor and a memory coupled to each other, where the memory stores a computer program, and the processor is configured to execute the computer program to implement the target tracking method provided by the first aspect.
In order to solve the above technical problem, a third aspect of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the target tracking method provided by the first aspect.
The beneficial effects of the present application are as follows: different from the prior art, the present application obtains the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image, processes the predicted position by using the correlation filtering template to obtain the final position of the target to be tracked in the current frame image, and then updates the parameters of the correlation filtering template based on the final position of the target to be tracked in the current frame image, the final position in the previous frame image, and the similarity between the two final positions. The position of the target to be tracked is thus predicted frame by frame, achieving target tracking. During the update of the correlation filtering template, continuously learning the similarity between adjacent frames strengthens the template's perception of the position change of the target to be tracked across adjacent frame images, and thereby enhances the stability of the template update.
Drawings
FIG. 1 is a schematic block flow diagram of an embodiment of a target tracking method of the present application;
FIG. 2 is a schematic diagram of a detection frame in an image to be tracked according to the present application;
FIG. 3 is a schematic block flow diagram of one embodiment of a process for building an optimized parameter update model according to the present application;
FIG. 4 is a schematic block flow diagram illustrating one embodiment of updating correlation filtering template parameters according to the present application;
FIG. 5 is a schematic block diagram of a circuit structure of an embodiment of the target tracking apparatus of the present application;
FIG. 6 is a schematic block diagram of a circuit structure of an embodiment of a computer-readable storage medium of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first" and "second" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features shown. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic block flow diagram of an embodiment of the target tracking method of the present application. It should be noted that the method is not limited to the flow sequence shown in fig. 1 if substantially the same result can be achieved. The target tracking method of this embodiment comprises the following steps:
s11: and acquiring a current frame image.
The current frame image may be a frame image obtained from an image sequence, where the image sequence may be a complete video stream sequence captured by the camera device, or may be a part of a video stream sequence cut from the complete video stream sequence.
Optionally, the image sequence includes a first frame image and several frames of images to be tracked located after the first frame image. Specifically, each frame image in the image sequence has a corresponding frame number, so the images can be divided, based on frame number, into the first frame image (with the smallest frame number) and the several images to be tracked that follow it. The position of the target in the first frame image is known (either manually annotated or calculated by a correlation algorithm), while the position of the target in every other frame image is unknown.
S12: obtaining the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image.
The target to be tracked, referred to above simply as the target, may be a vehicle, a pedestrian, or various signboards, warning boards, and the like. Since the target to be tracked is a target in the first frame image, the target tracking method provided by the present application essentially tracks, in each image to be tracked, the position of the target that appears in the first frame image. The number of targets in the first frame image may be one or more.
In the present application, the position of the target to be tracked is also referred to as a detection frame. Referring to fig. 2, frame 1 in fig. 2 is the detection frame of the target to be tracked. In the tracking process, each target has a corresponding detection frame.
The previous frame image is the previous frame image of the current frame image in the image sequence. The previous frame image of the current frame image may or may not be the first frame image of the image sequence. In the target tracking process, the positions of the targets in the images to be tracked of the image sequence are sequentially tracked according to the frame number. When the current frame image is the second frame image, the previous frame image is the first frame image; when the current frame image is an image to be tracked of other frames, the previous frame image is not the first frame image.
Since the position of the object in the first frame image is known, when the previous frame image is the first frame image in the image sequence, the final position of the object in the previous frame image refers to the known position of the object in the first frame image.
When the previous frame image is not the first frame image in the image sequence, the final position of the target in the previous frame image is obtained by performing correlation filtering calculation on a plurality of predicted positions of the target to be tracked in the previous frame image; the predicted positions in the previous frame image are acquired in the same way as those in the current frame image.
S13: processing the predicted position by using a correlation filtering template to obtain the final position of the target to be tracked in the current frame image.
In an embodiment, in step S12, a plurality of predicted positions of the target to be tracked in the current frame image are obtained from the final position of the target to be tracked in the previous frame image. To determine one of these predicted positions as the final position of the target to be tracked in the current frame image, each predicted position is input into the correlation filtering model for calculation.
Specifically, the correlation filtering template is used to fit each predicted position separately, and the predicted position with the best fitting result is taken as the final position of the target to be tracked in the current frame image, as illustrated in the sketch below.
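For illustration only (not part of the patent text), the following is a minimal sketch of this fitting step under a linear-kernel assumption; the names (response, best_position, z, x_t, candidates) are hypothetical. With a linear kernel, the regression output for a candidate patch is C(x_cand)C(x_t)ᵀz, which again reduces to element-wise products in the Fourier domain.

```python
import numpy as np

def response(z, x_t, x_cand):
    """Correlation response of a candidate patch under the dual-form template.

    z:      dual-variable template parameters (spatial domain)
    x_t:    base-sample features the template was trained on
    x_cand: features extracted at one predicted position
    """
    # Cross-correlation spectrum of candidate and training features
    # (the lag-sign convention does not affect the peak score).
    k_hat = np.fft.fft(x_cand) * np.conj(np.fft.fft(x_t))
    return np.real(np.fft.ifft(k_hat * np.fft.fft(z)))

def best_position(z, x_t, candidates):
    """Return the index of the predicted position with the highest peak response."""
    scores = [response(z, x_t, x).max() for x in candidates]
    return int(np.argmax(scores))
```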
S14: updating the parameters of the correlation filtering template based on the final position of the target to be tracked in the current frame image, the final position of the target to be tracked in the previous frame image, and the similarity between the two final positions.
Because the current frame image and the previous frame image are adjacent frames in the image sequence, the final position of the target to be tracked in the current frame image deviates only slightly from its final position in the previous frame image, and the appearance of the target at the two final positions is highly similar. The appearance similarity of the target to be tracked between every two adjacent frame images can therefore be exploited in the parameter update of the correlation filtering template, so as to enhance the stability of the update.
After this step is executed, the method may jump back to step S11 to obtain the next frame image from the image sequence as the new current frame image, and re-execute the steps starting from obtaining the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image, so as to complete the prediction of the target position in every frame image.
Through the above embodiment, the target position in the previous frame image can be used to predict the position of the target to be tracked in the current frame image, and the predicted position is processed with the correlation filtering template to obtain a more accurate final position; the position of the target to be tracked can thus be predicted frame by frame, achieving target tracking. In addition, each time the tracking of the target to be tracked in the current frame image is completed, online learning of the correlation filtering template is performed based on the image features at the final positions of the target in the previous and current frame images, so as to update the template parameters; strengthening the template's learning of the similarity between adjacent frame images makes the tracking of the next frame by the kernel correlation filtering model more effective. The overall flow is sketched below.
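For illustration only, a high-level sketch of steps S11 to S14 as a frame-by-frame loop. The helpers sample_candidates and extract_features, and the template object with respond/update methods, are hypothetical placeholders for the predicted-position generation, feature extraction, correlation response and parameter update described above.

```python
def track(frames, init_box, template, extract_features, sample_candidates):
    """Yield the final position of the target in each frame of an image sequence."""
    box_prev = init_box          # final position in the first frame image (known)
    feat_prev = None
    for frame in frames:                                  # S11: acquire current frame
        candidates = sample_candidates(box_prev)          # S12: predicted positions
        # S13: fit each predicted position with the correlation filtering
        # template; the best-fitting one becomes the final position.
        scored = [(template.respond(extract_features(frame, b)), b) for b in candidates]
        _, box_cur = max(scored, key=lambda t: t[0])
        feat_cur = extract_features(frame, box_cur)
        if feat_prev is not None:
            template.update(feat_cur, feat_prev)          # S14: online template update
        yield box_cur
        box_prev, feat_prev = box_cur, feat_cur           # current frame becomes previous
```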
The parameter update of the correlation filtering template in the present application relies on the following optimized parameter update model:
$$\min_{z_1,z_2}\ \sum_{k=1}^{2}\left\|C(x)_kC(x)_k^{T}z_k-y\right\|_2^2+\lambda_1\sum_{k=1}^{2}z_k^{T}C(x)_kC(x)_k^{T}z_k+\lambda_2\left\|z_2-z_1\right\|_1 \quad (1)$$

wherein $\lambda_1$ and $\lambda_2$ are constant parameters; y is the Gaussian ground-truth label; $z_2$ is the parameter of the correlation filtering template corresponding to the current frame image and $z_1$ the parameter corresponding to the previous frame image; $C(x)_2$ is the cyclic shift transformation result of the image features at the final position of the target to be tracked in the current frame, and $C(x)_1$ that of the previous frame; $A^{T}$ denotes the transpose of a matrix A; and $\|A-B\|_1$ denotes the L1 norm of the difference between A and B.

The $z_2$ obtained by solving this optimized parameter update model is taken as the parameter of the correlation filtering template, completing the parameter update.
Referring to fig. 3, fig. 3 is a schematic block flow diagram of an embodiment of a process for establishing the optimized parameter update model according to the present application. It should be noted that the process is not limited to the flow sequence shown in fig. 3 if substantially the same result can be achieved. This embodiment comprises the following steps:
s141: and constructing a parameter updating model by using the final position of the target to be tracked in the current frame and the final position of the target to be tracked in the previous frame.
Specifically, given a training set $\{(x_i,y_i)\}$, a ridge regression model can be established as formula (2):

$$\min_{w}\ \sum_{i}\left(w^{T}x_i-y_i\right)^2+\lambda\left\|w\right\|_2^2 \quad (2)$$
wherein the first term drives the regression output toward the ground-truth value $y_i$, and the second term is a regularization term that prevents the model from overfitting. Equation (2) is a convex optimization problem with the closed-form solution

$$w=\left(X^{T}X+\lambda I\right)^{-1}X^{T}y$$

wherein X is the matrix whose rows are the training samples $x_i$ and y is the vector of regression targets.
The ridge regression modeling process is as follows:
consider that X is a matrix generated by cyclic shifting of a base sample X, XTIs the transpose of X of the matrix and I is the identity matrix. When the modeling range is extended to the complex domain, the closed solution can be converted to equation (3):
Figure 608177DEST_PATH_IMAGE006
(3)
wherein XHRepresents the conjugate transpose of the matrix X,
Figure 362507DEST_PATH_IMAGE007
,X*represents the conjugate matrix of X. The circulant matrix can be converted to diagonal form by Fourier transform, as shown in equation (4)
Figure 365098DEST_PATH_IMAGE008
(4)
Wherein
Figure 154062DEST_PATH_IMAGE009
Means converting x into Fourier domain, and
Figure 239699DEST_PATH_IMAGE010
f is a constant matrix independent of the training samples and n represents the length of the base samples. diag denotes the diagonalization of the matrix.
To facilitate applying a kernel function to the cyclic shift matrix of the base sample, equation (2) is generally transformed into its dual form, as shown in equation (5):

$$\min_{z}\ \left\|C(x)C(x)^{T}z-y\right\|_2^2+\lambda\,z^{T}C(x)C(x)^{T}z \quad (5)$$

where z is the dual variable, which is also the parameter of the correlation filtering template of the present application; z and w can be related by equation (6):

$$w=C(x)^{T}z \quad (6)$$

C(x) is the circulant matrix generated from the base sample x, i.e. X = C(x). Combining equations (3), (4) and (6), the dual variable can be represented in the Fourier domain as equation (7):

$$\hat{z}=\frac{\hat{y}}{\hat{x}^{*}\odot\hat{x}+\lambda} \quad (7)$$

where $\odot$ denotes element-wise multiplication and the division is element-wise.
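For illustration only, a minimal sketch of equation (7): the dual variable is obtained without forming any circulant matrix, since the matrix inverse collapses to an element-wise division in the Fourier domain. The names (solve_dual_fourier, lam) are illustrative; a consistency check against the explicit formulation with G = C(x)C(x)ᵀ from equation (5) is included.

```python
import numpy as np

def solve_dual_fourier(x, y, lam):
    """Equation (7): z_hat = y_hat / (conj(x_hat) * x_hat + lam), element-wise."""
    x_hat = np.fft.fft(x)                            # Fourier transform of base sample
    y_hat = np.fft.fft(y)                            # Fourier transform of Gaussian label
    z_hat = y_hat / (np.conj(x_hat) * x_hat + lam)
    return np.real(np.fft.ifft(z_hat))               # dual variable in the spatial domain

# Consistency check: z = (C(x)C(x)^T + lam*I)^(-1) y computed explicitly.
n, lam = 8, 0.1
x = np.random.randn(n)
y = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2)      # Gaussian regression target
C = np.stack([np.roll(x, i) for i in range(n)])      # circulant matrix C(x)
z_direct = np.linalg.solve(C @ C.T + lam * np.eye(n), y)
print(np.allclose(z_direct, solve_dual_fourier(x, y, lam)))   # True
```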
In this embodiment, the image features at the final position of the target to be tracked in the current frame image and at the final position in the previous frame image are respectively taken as base samples x. A ridge regression model of the current frame image is established from the final position of the target to be tracked in the current frame image, and a ridge regression model of the previous frame image from the final position in the previous frame image; the two ridge regression models are then subjected to dual transformation to obtain the dual variable of the current frame image and the dual variable of the previous frame image.
A model is then constructed as shown in formula (8):

$$\min_{z_1,z_2}\ \sum_{k=1}^{2}\left\|C(x)_kC(x)_k^{T}z_k-y\right\|_2^2+\lambda_1\sum_{k=1}^{2}z_k^{T}C(x)_kC(x)_k^{T}z_k \quad (8)$$

The model shown in equation (8) is the parameter update model. The definitions of its parameters are the same as those given under formula (1) and are not repeated here.
S142: performing sparse constraint on the parameter update model by using the similarity measure between the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image, so as to obtain an optimized parameter update model.
Specifically, the L1 norm of the difference between the parameter of the correlation filtering template corresponding to the current frame image and the parameter corresponding to the previous frame image is taken as the similarity measure, and the sparse constraint is imposed on the parameter update model. That is, $\|z_2-z_1\|_1$ is used as the similarity measure to impose a sparsity constraint on formula (8), yielding the optimized parameter update model shown in formula (1).
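As a minimal illustration (not part of the patent text), the similarity measure above is simply the L1 norm of the difference between the two frames' template parameters; the function name below is hypothetical.

```python
import numpy as np

def l1_similarity_measure(z1, z2):
    """||z2 - z1||_1: the sparsity-inducing similarity measure between the
    previous-frame and current-frame correlation filtering template parameters."""
    return np.abs(z2 - z1).sum()
```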
S143: updating the parameters of the correlation filtering template based on the optimized parameter update model.
The optimized parameter update model is solved for $z_2$, and taking $z_2$ as the parameter of the correlation filtering template yields the updated template parameters.
For the solution of the optimized parameter update model, two auxiliary variables P and q can be introduced to convert formula (1) into the constrained optimization problem of formula (9):

$$\min_{Z,Q,P}\ \sum_{k=1}^{2}\left\|G_kq_k-y\right\|_2^2+\lambda_1\sum_{k=1}^{2}q_k^{T}G_kq_k+\lambda_2\left\|P\right\|_1\qquad\text{s.t.}\ \ q_k=z_k\ (k=1,2),\ \ P=RZ \quad (9)$$

wherein $G_k=C(x)_kC(x)_k^{T}$, $R=[-I;I]$ and $Z=[z_1;z_2]$. Formula (9) can be optimized with the ADMM (Alternating Direction Method of Multipliers) method. Incorporating the equality constraints into the objective by introducing Lagrange multipliers results in formula (10):
$$\mathcal{L}=\sum_{k=1}^{2}\left\|G_kq_k-y\right\|_2^2+\lambda_1\sum_{k=1}^{2}q_k^{T}G_kq_k+\lambda_2\left\|P\right\|_1+\sum_{k=1}^{2}\left\langle Y_{1,k},q_k-z_k\right\rangle+\left\langle Y_2,P-RZ\right\rangle+\frac{\mu}{2}\Big(\sum_{k=1}^{2}\left\|q_k-z_k\right\|_2^2+\left\|P-RZ\right\|_2^2\Big) \quad (10)$$

wherein $\langle A,B\rangle=\mathrm{Tr}(A^{T}B)$ denotes the matrix inner product, $Y_{1,k}$ and $Y_2$ denote the Lagrange multipliers, and $\mu$ is the penalty parameter. The variables in equation (10) can be optimized alternately.
Alternating optimization of q, P and Z is performed using equations (11), (12) and (13):

$$Q^{(t+1)}=\arg\min_{Q}\ \mathcal{L}\big(Q,P^{(t)},Z^{(t)},Y_{1}^{(t)},Y_{2}^{(t)}\big) \quad (11)$$

$$P^{(t+1)}=\arg\min_{P}\ \mathcal{L}\big(Q^{(t+1)},P,Z^{(t)},Y_{1}^{(t)},Y_{2}^{(t)}\big) \quad (12)$$

$$Z^{(t+1)}=\arg\min_{Z}\ \mathcal{L}\big(Q^{(t+1)},P^{(t+1)},Z,Y_{1}^{(t)},Y_{2}^{(t)}\big) \quad (13)$$

wherein $Q=[q_1;q_2]$.
The optimization is carried out cyclically until the objective converges, and the solved $q_2$ is taken as the new parameter of the correlation filtering template.
Wherein q in the formula (11)kFurther represented by formula (14), formula (12) further represented by formula (15), and formula (13) further represented by formula (16):
Figure 268518DEST_PATH_IMAGE022
(14)
Figure 237611DEST_PATH_IMAGE023
(15)
Figure 48441DEST_PATH_IMAGE024
(16)
wherein the content of the first and second substances,
Figure 786590DEST_PATH_IMAGE025
. Solving equations (14), (15) and (16) circularly, and obtaining qkAs parameters of the relevant filtering template.
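For illustration only, a minimal ADMM sketch of the alternating updates (11)-(16), assuming the augmented-Lagrangian form of equation (10) with penalty parameter mu and 1-D base samples; all names (update_templates, mu, n_iter, soft_threshold) are illustrative and the patent's exact step constants may differ. The q-step is an element-wise Fourier-domain division, the P-step is soft-thresholding (the proximal operator of the L1 term), and the z-step is handled here with one Gauss-Seidel sweep over z1 and z2.

```python
import numpy as np

def soft_threshold(a, tau):
    """Shrinkage: closed-form minimizer of tau*||p||_1 + 0.5*||p - a||^2, element-wise."""
    return np.sign(a) * np.maximum(np.abs(a) - tau, 0.0)

def update_templates(x1, x2, y, lam1=1e-2, lam2=1e-3, mu=1.0, n_iter=30):
    """Jointly solve model (1) for z1, z2 and return z2, the updated template.

    x1, x2: image features at the target's final positions in the previous
            and current frame images (1-D base samples for simplicity).
    y:      Gaussian regression target.
    """
    n = len(y)
    y_hat = np.fft.fft(y)
    s = [np.abs(np.fft.fft(xk)) ** 2 for xk in (x1, x2)]  # spectra of G_k = C(x_k)C(x_k)^T
    q = [np.zeros(n), np.zeros(n)]
    z = [np.zeros(n), np.zeros(n)]
    p = np.zeros(n)                       # auxiliary variable P = RZ = z2 - z1
    y1 = [np.zeros(n), np.zeros(n)]       # Lagrange multipliers Y_{1,k}
    y2 = np.zeros(n)                      # Lagrange multiplier Y_2
    for _ in range(n_iter):
        # q-step, cf. (14): G_k is diagonal in the Fourier domain, so the
        # quadratic subproblem reduces to an element-wise division.
        for k in range(2):
            num = 2 * s[k] * y_hat + mu * np.fft.fft(z[k]) - np.fft.fft(y1[k])
            q[k] = np.real(np.fft.ifft(num / (2 * s[k] ** 2 + 2 * lam1 * s[k] + mu)))
        # P-step, cf. (15): proximal step of the L1 similarity term.
        p = soft_threshold(z[1] - z[0] - y2 / mu, lam2 / mu)
        # z-step, cf. (16): unconstrained quadratic, one Gauss-Seidel sweep.
        z[0] = 0.5 * (q[0] + y1[0] / mu + z[1] - p - y2 / mu)
        z[1] = 0.5 * (q[1] + y1[1] / mu + z[0] + p + y2 / mu)
        # Dual (multiplier) updates.
        for k in range(2):
            y1[k] = y1[k] + mu * (q[k] - z[k])
        y2 = y2 + mu * (p - (z[1] - z[0]))
    return z[1]
```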
Referring to fig. 4, in the present embodiment, when the parameters of the correlation filtering template are solved by using equations (14), (15) and (16), the following steps are performed:
s21: and respectively carrying out Fourier transform on the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image to obtain a Fourier transform result.
Here, "the final position of the target to be tracked in the current frame image" and "the final position of the target to be tracked in the previous frame image" refer to performing feature extraction on the detection frame at the final position of the target in each of the two images; the obtained image features may be HOG features, scale features of the detection frame, and the like.
The extracted image feature x is then Fourier transformed to obtain the Fourier transform result. Specifically, the transform $\hat{x}=\sqrt{n}\,Fx$ is applied to the image feature x, so that the model can be evaluated in the Fourier domain as in equation (7).
S22: applying the Fourier transform result to the optimized parameter update model to obtain the parameters of the correlation filtering template.
The Fourier transform result is applied to formulas (14), (15) and (16), and the solved $q_2$ is taken as the new parameter of the correlation filtering template.
Alternatively, in another embodiment, the extracted image features may be subjected to the cyclic shift transformation, the cyclic shift transformation result applied to formula (1), and the obtained $z_2$ taken as the new parameter of the correlation filtering template.
Referring to fig. 5, fig. 5 is a schematic block diagram of a circuit structure of an embodiment of the target tracking device of the present application. The target tracking device 11 comprises a processor 111 and a memory 112 coupled to each other; the memory 112 stores a computer program, and the processor 111 is configured to execute the computer program to implement the steps of the embodiments of the target tracking method of the present application described above.
For the description of each step executed by the processor 111, refer to the description of the steps in the embodiments of the target tracking method of the present application, which will not be repeated here.
In the embodiments of the present application, the disclosed target tracking method and target tracking apparatus may be implemented in other ways. For example, the above-described embodiments of the target tracking apparatus are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a storage medium.
Referring to fig. 6, fig. 6 is a schematic block diagram of a circuit structure of an embodiment of a computer readable storage medium of the present application, a computer program 1001 is stored in a computer storage medium 1000, and when the computer program 1001 is executed, the steps of the embodiments of the target tracking method of the present application are implemented.
The computer storage medium 1000 may be any medium capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are intended to be included within the scope of the present application.

Claims (10)

1. A method of target tracking, the method comprising:
acquiring a current frame image;
obtaining the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image;
processing the predicted position by using a correlation filtering template to obtain the final position of the target to be tracked in the current frame image;
updating parameters of the correlation filtering template based on the final position of the target to be tracked in the current frame image, the final position of the target to be tracked in the previous frame image and the similarity between the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image;
and taking the next frame image as the current frame image, and re-executing the step of obtaining the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image and the subsequent steps.
2. The method of claim 1,
the method further comprises the following steps:
constructing a parameter update model by using the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image;
the updating the parameters of the relevant filtering template based on the final position of the target to be tracked in the current frame image, the final position of the target to be tracked in the previous frame image, and the similarity between the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image includes:
performing sparse constraint on the parameter updating model by using the similarity measurement between the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image to obtain an optimized parameter updating model;
updating parameters of the relevant filtering templates based on the optimized parameter update model.
3. The method of claim 2,
the performing sparse constraint on the parameter update model by using the similarity measure between the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image to obtain an optimized parameter update model comprises:
establishing a ridge regression model of the current frame image by using the final position of the target to be tracked in the current frame image, and establishing a ridge regression model of the previous frame image by using the final position of the target to be tracked in the previous frame image;
performing dual transformation on the ridge regression model of the current frame image and the ridge regression model of the previous frame image respectively to obtain a dual variable of the current frame image and a dual variable of the previous frame image;
and performing sparse constraint on the parameter update model by taking the L1 norm of the difference between the parameter of the correlation filtering template corresponding to the current frame image and the parameter of the correlation filtering template corresponding to the previous frame image as the similarity measure, so as to obtain the optimized parameter update model.
4. The method according to claim 3, wherein the updating the parameters of the correlation filtering template based on the final position of the target to be tracked in the current frame image, the final position of the target to be tracked in the previous frame image, and the image feature similarity between the two final positions comprises:
solving the optimized parameter update model, whose expression is:
$$\min_{z_1,z_2}\ \sum_{k=1}^{2}\left\|C(x)_kC(x)_k^{T}z_k-y\right\|_2^2+\lambda_1\sum_{k=1}^{2}z_k^{T}C(x)_kC(x)_k^{T}z_k+\lambda_2\left\|z_2-z_1\right\|_1$$
wherein $\lambda_1$ and $\lambda_2$ are constant parameters, y is the Gaussian ground-truth label, $z_2$ is the parameter of the correlation filtering template corresponding to the current frame image, $z_1$ is the parameter of the correlation filtering template corresponding to the previous frame image, $C(x)_2$ is the cyclic shift transformation result of the image features at the final position of the target to be tracked in the current frame, $C(x)_1$ is the cyclic shift transformation result of the image features at the final position of the target to be tracked in the previous frame, $A^{T}$ denotes the transpose of a matrix A, and $\|A-B\|_1$ denotes the L1 norm of the difference between A and B;
and obtaining $z_2$ by solving, with $z_2$ serving as the updated parameter of the correlation filtering template.
5. The method of claim 4,
the method further comprising: solving the optimized parameter update model using the following optimization models:
$$\min_{Z,Q,P}\ \sum_{k=1}^{2}\left\|G_kq_k-y\right\|_2^2+\lambda_1\sum_{k=1}^{2}q_k^{T}G_kq_k+\lambda_2\left\|P\right\|_1\qquad\text{s.t.}\ \ q_k=z_k\ (k=1,2),\ \ P=RZ$$
$$\mathcal{L}=\sum_{k=1}^{2}\left\|G_kq_k-y\right\|_2^2+\lambda_1\sum_{k=1}^{2}q_k^{T}G_kq_k+\lambda_2\left\|P\right\|_1+\sum_{k=1}^{2}\left\langle Y_{1,k},q_k-z_k\right\rangle+\left\langle Y_2,P-RZ\right\rangle+\frac{\mu}{2}\Big(\sum_{k=1}^{2}\left\|q_k-z_k\right\|_2^2+\left\|P-RZ\right\|_2^2\Big)$$
$$Q^{(t+1)}=\arg\min_{Q}\mathcal{L},\qquad P^{(t+1)}=\arg\min_{P}\mathcal{L},\qquad Z^{(t+1)}=\arg\min_{Z}\mathcal{L}$$
wherein $\langle A,B\rangle=\mathrm{Tr}(A^{T}B)$ denotes the matrix inner product, $Y_{1,k}$ and $Y_2$ denote the Lagrange multipliers, $q_1$, $q_2$ and P are auxiliary variables, $G_k=C(x)_kC(x)_k^{T}$, $R=[-I;I]$, $Z=[z_1;z_2]$, $Q=[q_1;q_2]$, $P=RZ$, $q_1=z_1$, $q_2=z_2$;
and carrying out the optimization solution cyclically until convergence, with the solved $q_2$ taken as the parameter of the correlation filtering template.
6. The method of claim 5,
before the updating of the parameters of the correlation filtering template based on the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image, the method comprises:
performing Fourier transform respectively on the final position of the target to be tracked in the current frame image and the final position of the target to be tracked in the previous frame image to obtain a Fourier transform result;
and applying the Fourier transform result to the optimized parameter update model to obtain the parameters of the correlation filtering template.
7. The method of claim 1,
the obtaining of the predicted position of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image includes:
acquiring a plurality of predicted positions of the target to be tracked in the current frame image according to the final position of the target to be tracked in the previous frame image;
the processing the predicted position by using the correlation filtering template to obtain the final position of the target to be tracked in the current frame image comprises:
inputting the plurality of predicted positions respectively into the correlation filtering model for calculation to obtain the final position of the target to be tracked in the current frame image.
8. The method according to claim 7, wherein the processing the predicted position by using the correlation filtering template to obtain the final position of the target to be tracked in the current frame image comprises:
fitting each predicted position respectively by using the correlation filtering template, and taking the predicted position with the best fitting result as the final position of the target to be tracked in the current frame image.
9. A target tracking apparatus, characterized in that the apparatus comprises a processor and a memory coupled to each other; the memory stores a computer program, and the processor is configured to execute the computer program to implement the steps of the method according to any one of claims 1-8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when being executed by a processor, carries out the steps of the method according to any one of claims 1-8.
CN202011468540.4A 2020-12-14 2020-12-14 Target tracking method, device and computer readable storage medium Active CN112233143B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011468540.4A CN112233143B (en) 2020-12-14 2020-12-14 Target tracking method, device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011468540.4A CN112233143B (en) 2020-12-14 2020-12-14 Target tracking method, device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112233143A true CN112233143A (en) 2021-01-15
CN112233143B CN112233143B (en) 2021-05-11

Family

ID=74124048

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011468540.4A Active CN112233143B (en) 2020-12-14 2020-12-14 Target tracking method, device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112233143B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113364150A (en) * 2021-06-10 2021-09-07 内蒙古工业大学 Automatic tracking unmanned aerial vehicle laser charging device and tracking method
CN113393493A (en) * 2021-05-28 2021-09-14 京东数科海益信息科技有限公司 Target object tracking method and device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107403175A (en) * 2017-09-21 2017-11-28 昆明理工大学 Visual tracking method and Visual Tracking System under a kind of movement background
CN107481264A (en) * 2017-08-11 2017-12-15 江南大学 A kind of video target tracking method of adaptive scale
CN108898621A (en) * 2018-06-25 2018-11-27 厦门大学 A kind of Case-based Reasoning perception target suggests the correlation filtering tracking of window
CN110097579A (en) * 2019-06-14 2019-08-06 中国科学院合肥物质科学研究院 Multiple dimensioned wireless vehicle tracking and device based on pavement texture contextual information
CN110349190A (en) * 2019-06-10 2019-10-18 广州视源电子科技股份有限公司 Method for tracking target, device, equipment and the readable storage medium storing program for executing of adaptive learning
CN111080675A (en) * 2019-12-20 2020-04-28 电子科技大学 Target tracking method based on space-time constraint correlation filtering
CN111815668A (en) * 2020-06-23 2020-10-23 浙江大华技术股份有限公司 Target tracking method, electronic device and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107481264A (en) * 2017-08-11 2017-12-15 江南大学 A kind of video target tracking method of adaptive scale
CN107403175A (en) * 2017-09-21 2017-11-28 昆明理工大学 Visual tracking method and Visual Tracking System under a kind of movement background
CN108898621A (en) * 2018-06-25 2018-11-27 厦门大学 A kind of Case-based Reasoning perception target suggests the correlation filtering tracking of window
CN110349190A (en) * 2019-06-10 2019-10-18 广州视源电子科技股份有限公司 Method for tracking target, device, equipment and the readable storage medium storing program for executing of adaptive learning
CN110097579A (en) * 2019-06-14 2019-08-06 中国科学院合肥物质科学研究院 Multiple dimensioned wireless vehicle tracking and device based on pavement texture contextual information
CN111080675A (en) * 2019-12-20 2020-04-28 电子科技大学 Target tracking method based on space-time constraint correlation filtering
CN111815668A (en) * 2020-06-23 2020-10-23 浙江大华技术股份有限公司 Target tracking method, electronic device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHIQIAN ZHAO et al.: "A hybrid tracking framework based on kernel correlation filtering and particle filtering", NEUROCOMPUTING *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113393493A (en) * 2021-05-28 2021-09-14 京东数科海益信息科技有限公司 Target object tracking method and device
CN113393493B (en) * 2021-05-28 2024-04-05 京东科技信息技术有限公司 Target object tracking method and device
CN113364150A (en) * 2021-06-10 2021-09-07 内蒙古工业大学 Automatic tracking unmanned aerial vehicle laser charging device and tracking method
CN113364150B (en) * 2021-06-10 2022-08-26 内蒙古工业大学 Automatic tracking unmanned aerial vehicle laser charging device and tracking method

Also Published As

Publication number Publication date
CN112233143B (en) 2021-05-11

Similar Documents

Publication Publication Date Title
Erichson et al. Randomized low-rank dynamic mode decomposition for motion detection
CN107529650B (en) Closed loop detection method and device and computer equipment
Tuncer et al. Random matrix based extended target tracking with orientation: A new model and inference
CN107369166B (en) Target tracking method and system based on multi-resolution neural network
CN112233143B (en) Target tracking method, device and computer readable storage medium
Bissacco et al. Classification and recognition of dynamical models: The role of phase, independent components, kernels and optimal transport
Liu et al. Iterative relaxed collaborative representation with adaptive weights learning for noise robust face hallucination
Ngo et al. Haziness degree evaluator: A knowledge-driven approach for haze density estimation
CN111091101A (en) High-precision pedestrian detection method, system and device based on one-step method
Leng et al. Context-aware attention network for image recognition
CN111027610B (en) Image feature fusion method, apparatus, and medium
Yang et al. Learning nonlinear mixtures: Identifiability and algorithm
CN116977674A (en) Image matching method, related device, storage medium and program product
Wang et al. A depth estimating method from a single image using foe crf
CN114119690A (en) Point cloud registration method based on neural network reconstruction Gaussian mixture model
Ali et al. Incorporating structural prior for depth regularization in shape from focus
CN107590820B (en) Video object tracking method based on correlation filtering and intelligent device thereof
Yu et al. Bottom–up attention: pulsed PCA transform and pulsed cosine transform
Zhang et al. CAM R-CNN: End-to-End Object Detection with Class Activation Maps
Wang et al. Sparse representation of local spatial-temporal features with dimensionality reduction for motion recognition
CN109492530B (en) Robust visual object tracking method based on depth multi-scale space-time characteristics
Du et al. Monocular human motion tracking by using DE-MC particle filter
Shabat et al. Accelerating particle filter using randomized multiscale and fast multipole type methods
WO2022252519A1 (en) Image processing method and apparatus, terminal, medium, and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20210115

Assignee: ZHEJIANG DAHUA TECHNOLOGY Co.,Ltd.

Assignor: ZHEJIANG DAHUA TECHNOLOGY Co.,Ltd.

Contract record no.: X2021330000117

Denomination of invention: Target tracking method, device and computer-readable storage medium

Granted publication date: 20210511

License type: Common License

Record date: 20210823

EE01 Entry into force of recordation of patent licensing contract