CN112215869A - Group target tracking method and system based on graph similarity constraint - Google Patents

Group target tracking method and system based on graph similarity constraint

Info

Publication number
CN112215869A
Authority
CN
China
Prior art keywords
target
tracking
graph
candidate
similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011084558.4A
Other languages
Chinese (zh)
Other versions
CN112215869B (en)
Inventor
胡静
张旭阳
沈宜帆
陈智勇
李彬哲
章正
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology
Priority to CN202011084558.4A
Publication of CN112215869A
Application granted
Publication of CN112215869B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a group target tracking method and system based on graph similarity constraint, belonging to the field of image processing. Because the motion amplitude of each target is limited between consecutive frames, the invention assumes that the graph structure of a group target should remain similar between the previous and current frames. Based on this assumption, the invention designs a similarity index that quantifies, during group target tracking, the similarity between the candidate graph structures of the current frame and the graph structure of the previous frame, and uses it to constrain the group target tracking result. This resolves the wrong tracking caused by multi-peak response maps arising from interference between similar targets, and improves the tracking precision of the group target tracking algorithm. Compared with other group target tracking algorithms on the same group target tracking test set, the method achieves higher tracking precision on similar targets and better real-time performance, and is therefore better suited to group target tracking.

Description

Group target tracking method and system based on graph similarity constraint
Technical Field
The invention belongs to the field of image processing, and particularly relates to a group target tracking method and system based on graph similarity constraint.
Background
Video target tracking is widely used in production and daily life, and target tracking is an important component of both military and civilian applications. Group target tracking is of particular importance in fields such as ecological protection, flight safety and the automation of animal husbandry. For example, in airport bird control, birds flying near an airport pose a hazard to flights, can cause large economic losses, and seriously threaten passenger safety, so the birds need to be tracked as a basis for driving them away. In addition, unmanned aerial vehicles, as a new class of intelligent aircraft, are agile, have low take-off and flight requirements, are not restricted to specific sites, climb quickly, stay aloft for long periods, are easy to obtain, and can be controlled over long distances; tracking drone swarms is therefore an important means of deploying drones at scale while preventing them from interfering with critical facilities such as airports. Meanwhile, in animal husbandry and animal research, many animals such as cattle, sheep and birds move in herds or flocks. Whether the goal is to study animal migration patterns or to keep livestock from getting lost during grazing, their migration and movement information must be acquired effectively, so researchers often need to track these target groups and obtain the motion trajectory of each individual.
When tracking a target group, multi-target tracking and group target tracking techniques are generally adopted. The mainstream multi-target tracking approach is tracking-by-detection: target detection is performed on every frame, and the detection results are then linked into tracks by methods such as trajectory association. However, detection-based multi-target tracking requires a dataset containing the target class before tracking can begin, which is difficult to obtain in some cases, so some researchers have proposed detection-free multi-target tracking methods. Because no detector is involved, detection-free multi-target tracking is not limited to a fixed set of target classes, and such methods can be viewed as extensions of single target tracking. Some of them require manual initialization of the target positions, while others use inter-frame difference results to initialize the targets to be tracked. These detection-free tracking methods need no class-specific dataset and can track a wide variety of targets, but they concentrate on building target motion models and extracting moving objects rather than on handling similar targets. Group target tracking is similar to detection-free multi-target tracking, but the targets to be tracked differ: in group target tracking they are usually multiple targets with similar appearance that are spatially close to each other.
We define multiple targets that remain similar in appearance and spatially close over time as a group target. Research on group target tracking has mostly focused on tracking groups of animals. Clark et al. [1] [Clark D, Godsill S. Group target tracking with the Gaussian mixture probability filter [C]. 2007 3rd International Conference on Intelligent Sensors, Sensor Networks and Information. 2007: 149-]. Zhu et al. [2] [Zhu S, Liu W, Weng C, et al. Multiple group targets tracking using the generalized labeled multi-Bernoulli filter [C]. 2016 35th Chinese Control Conference (CCC). 2016: 4871-] proposed an algorithm divided into two stages, dynamic model establishment and group target tracking: in the first stage, a group dynamics model is established using an adjacency matrix; in the second stage, the estimated state set of the targets is obtained with a generalized labeled multi-Bernoulli filter based on random finite sets, under the assumption that all targets are independent. Most group target tracking algorithms follow these three steps:
1) Input the image sequence and the position information of the targets to be tracked in the first frame;
2) Using the target positions from the previous frame, extract different features of the multiple targets and build a response map of the current frame through a function; the response peak position of the response map is the tracked target position;
3) Take the position of the maximum response point of the response map as the tracked target position and store it, continue tracking on the next frame, and repeat steps 2) and 3) until the last frame of the image sequence.
During group target tracking, because the targets are highly similar and close to each other, the response map obtained in step 2) may contain several peaks of comparable height; this is called the multi-peak problem of the response map. When existing group target algorithms encounter it, most simply take the position of the largest peak as the tracked target position and discard the other, nearly equal peaks. This hurts tracking precision and can cause the target to be lost, because the target may well be located at one of the other peaks. Handling the multi-peak problem of the response map well can therefore significantly improve the tracking accuracy of a group target tracking algorithm.
Disclosure of Invention
Aiming at the multi-peak response map problem of existing group target tracking techniques, the invention provides a group target tracking method and system based on graph similarity constraint, which aims to use the graph similarity constraint to resolve the multi-peak response map problem of group target tracking and improve the accuracy of group target tracking.
To achieve the above object, according to an aspect of the present invention, there is provided a group target tracking method based on graph similarity constraint, including:
S1, acquiring candidate targets corresponding to all tracking targets in a current frame;
S2, constructing a candidate graph structure:
respectively selecting one candidate target from each tracking target and combining them, to obtain K_1 × K_2 × … × K_I candidate graph structures G^c_m = {p^1_{k_1}, p^2_{k_2}, …, p^I_{k_I}}, where p^i_{k_i} is the k_i-th candidate target in the response map corresponding to the i-th target, I is the total number of tracking targets, K_i is the number of candidate targets corresponding to the i-th target, and c is the candidate graph structure index;
S3, calculating a graph similarity evaluation index of each candidate graph structure; the graph similarity evaluation index represents the similarity between the candidate graph structure of the current frame and the preferred graph structure of the previous frame in the group target tracking process;
S4, selecting the candidate graph structure with the maximum graph similarity evaluation index as the preferred graph structure of the current frame, and outputting the tracking result corresponding to each target i and its candidate target k_i in the current frame;
S5, judging whether all frames of the image sequence have been tracked; if so, ending and outputting the tracking result; otherwise, loading the next frame image and returning to step S1.
Further, step S1 includes:
S1.1, calculating the response map f^i_m(z, s) corresponding to target i of the current frame, where m is the current frame index, i is the target index, z is the target template image obtained after processing the previous frame, and s is the search image;
S1.2, obtaining the candidate targets corresponding to target i: selecting all local extreme points of the response map f^i_m(z, s), and keeping as candidate targets p^i_{k_i} those local extreme points whose response value is greater than the maximum response value of the current response map, max(f^i_m(z, s)), multiplied by a threshold coefficient Thr_coe, where 0 < Thr_coe < 1;
S1.3, judging whether all targets of the current frame have been processed by steps S1.1 to S1.2; if so, proceeding to step S2; otherwise setting i = i + 1 and returning to step S1.1 to process the next target.
Further, the threshold coefficient Thr_coe is taken as 0.7.
Further, step S3 specifically includes:
S3.1, calculating the adjacency matrices AM(G_{m-1}) and AM(G_m) of the preferred graph structure G_{m-1} of the previous frame and the candidate graph structure G_m of the current frame: when i ≠ j, the element A_(i,j) of the adjacency matrix is the Euclidean distance between the i-th unit p^i_{k_i} and the j-th unit p^j_{k_j} of the candidate graph structure; when i = j, the element A_(i,i) equals the response value max(f(z, s)) of the local extreme point in the response map f^i_m(z, s) corresponding to the candidate graph structure G_m;
S3.2, calculating the feature vector V^i_m of each unit p^i_{k_i} of each candidate graph structure G^c_m: the element A_(i,i) of the i-th column of the adjacency matrix AM(G_m) is excluded to obtain V^i_m;
S3.3, calculating the cosine similarity S_m(i) and the Euclidean distance D_m(i) between the feature vectors V^i_{m-1} and V^i_m of the same unit in the previous and current frames:
S_m(i) = (V^i_{m-1} · V^i_m) / (||V^i_{m-1}|| · ||V^i_m||)
D_m(i) = ||V^i_{m-1} - V^i_m||;
S3.4, normalizing the cosine similarity S_m(i) and the Euclidean distance D_m(i) to obtain S'_m(i) and D'_m(i);
S3.5, weighting the normalized cosine similarity S'_m(i) and Euclidean distance D'_m(i) with a window function coefficient win_coe:
S''_m(i) = (1 - win_coe) + win_coe · S'_m(i)
D''_m(i) = 1 - win_coe · D'_m(i);
S3.6, calculating the graph similarity evaluation index of the preferred graph structure G_{m-1} of frame m-1 and each candidate graph structure G^c_m of frame m, wherein the index uses, for each target i, the response value max(f(z, s)) of the local extreme point of the i-th target in the response map corresponding to the candidate graph structure G_m;
S3.7, selecting the candidate graph structure with the maximum graph similarity evaluation index as the preferred graph structure of the current frame.
Further, the tracking result corresponding to target i and candidate target k_i is output according to k_i and i in each unit p^i_{k_i} of the preferred graph structure G_m.
Further, the window function coefficient win_coe takes the value 0.176.
According to another aspect of the present invention, there is provided a group target tracking system based on graph similarity constraint, comprising: a computer-readable storage medium and a processor;
the computer-readable storage medium is used for storing executable instructions;
the processor is configured to read executable instructions stored in the computer-readable storage medium and execute the graph similarity constraint-based group target tracking method.
In general, the above technical solutions contemplated by the present invention can achieve the following advantageous effects compared to the prior art.
(1) Because the motion amplitude of each target is limited between consecutive frames, the invention assumes that the graph structure of a group target should remain similar between the previous and current frames. Based on this assumption, the invention designs a similarity index that quantifies, during group target tracking, the similarity between the candidate graph structures of the current frame and the graph structure of the previous frame, and uses it to constrain the group target tracking result, which resolves the wrong tracking caused by multi-peak response maps arising from interference between similar targets and improves the tracking precision of the group target tracking algorithm.
(2) Compared with other group target tracking algorithms on the same group target tracking test set, the method achieves higher tracking precision on similar targets and better real-time performance, and is therefore better suited to group target tracking.
Drawings
FIG. 1 is a flow chart of a group target tracking method based on graph similarity constraints according to the present invention;
FIG. 2 is a flowchart of the specific implementation of calculating the tracking result with the graph similarity constraint provided by the present invention;
FIG. 3 is the precision plot of the performance statistics of the group target tracking algorithm based on graph similarity constraint provided by the present invention;
FIG. 4 is the success rate plot of the performance statistics of the group target tracking algorithm based on graph similarity constraint provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Two problems need to be solved in group target tracking. First, the targets must be separated from the background and accurate target boxes must be obtained. Second, the tracking method must distinguish different tracked targets and assign the correct track number to every tracking result. In group target tracking, however, many nearby targets interfere with one another, which complicates the second problem, so the tracking method must cope with this interference and find the correct tracks. When group targets come close to each other, their responses overlap, which strongly affects the determination of target positions and track numbers; wrong tracking may therefore occur, that is, the track of one target jumps onto the track of another target during tracking. To solve the second problem, the invention proposes an evaluation index of the similarity of group target graph structures and, on that basis, a group target tracking method based on graph similarity constraint.
Graph theory is an important branch of applied mathematics and, in the prior art, an important tool for studying binary relations; for example, in communication algorithms a communication network can be regarded as a graph structure, and minimum-flow algorithms are based on graph theory. In group target tracking, the coordinates of the centers of the multiple tracking target boxes in the image can likewise be regarded as a graph structure, which the invention calls the group target graph structure. Since the motion amplitude of each target is limited between consecutive frames, the invention assumes that the group target graph structure should remain similar between the previous and current frames. The graph similarity constraint is a constraint designed on the group target tracking result based on this similarity of the group target graph structure across frames, and it improves the tracking precision of the group target tracking algorithm.
As shown in FIG. 1, the graph similarity constraint-based group target tracking method of the present invention mainly comprises a feature map extraction part and a graph similarity constraint part:
(1) The feature map extraction part includes:
(1-1) Calculate the response map f^i_m(z, s) corresponding to target i of the current frame, where m is the current frame index, i is the target index, z is the target template image obtained after processing the previous frame, and s is the search image; for example, if m = 2, z is the initial target template image given in the first frame and s is the search image of the second frame;
(1-2) Obtain the candidate targets corresponding to target i: select all local extreme points (x_i, y_i) of the response map f^i_m(z, s), where (x_i, y_i) is a coordinate position in the response map; keep as candidate targets p^i_{k_i} those local extreme points whose response value is greater than the maximum response value of the current response map, max(f^i_m(z, s)), multiplied by the threshold coefficient Thr_coe (0 < Thr_coe < 1, empirical value 0.7), where k_i is the index of the k-th candidate point of the i-th target (a code sketch of steps (1-1) and (1-2) is given after step (3) below);
(1-3) Judge whether all targets of the current frame have been processed by steps (1-1) and (1-2); if so, go to step (2-1); otherwise set i = i + 1 and return to step (1-1) to process the next target;
(2) Calculate the tracking result using the graph similarity constraint; the specific process is shown in FIG. 2 (a code sketch of steps (2-1) to (2-3) is given after step (3) below):
(2-1) Construct the candidate graph structures: select one candidate target from each target and combine them, obtaining K_1 × K_2 × … × K_I candidate graph structures of all targets, G^c_m = {p^1_{k_1}, p^2_{k_2}, …, p^I_{k_I}}; each point p^i_{k_i} is a unit of the candidate graph structure, I is the total number of input targets, K_i is the number of candidate points of the i-th target, each target i contributes exactly one unit p^i_{k_i}, and c is the graph structure index;
(2-2) Calculate the graph similarity evaluation index of each candidate graph structure G^c_m. The graph similarity evaluation index is a new concept proposed by the invention: it represents and quantifies the similarity between a candidate graph structure of the current frame and the graph structure of the previous frame during group target tracking, and the best candidate graph structure of the current frame can be found through this index;
(a) Compute the adjacency matrices AM(G_{m-1}) and AM(G_m) of the graph structure G_{m-1} and the candidate graph structure G_m. The candidate graph structure G_m is converted into the adjacency matrix AM(G_m), in which the element A_(i,j) is the Euclidean distance between the i-th unit p^i_{k_i} and the j-th unit p^j_{k_j} of the candidate graph structure when i ≠ j, and the element A_(i,i) equals the response value max(f(z, s)) of the local extreme point in the response map f^i_m(z, s) corresponding to the candidate graph structure G_m when i = j; the graph structure G_{m-1} is the preferred candidate graph structure of the previous frame, see step (2-4);
(b) Compute the feature vector V^i_m of each unit p^i_{k_i} of each candidate graph structure G^c_m: remove the element A_(i,i) from the i-th column of the adjacency matrix AM(G_m) to obtain V^i_m;
(c) Compute the cosine similarity S_m(i) and the Euclidean distance D_m(i) between the feature vectors V^i_{m-1} and V^i_m of the same unit in the previous and current frames:
S_m(i) = (V^i_{m-1} · V^i_m) / (||V^i_{m-1}|| · ||V^i_m||)
D_m(i) = ||V^i_{m-1} - V^i_m||;
(d) Normalize the two values S_m(i) and D_m(i) to obtain S'_m(i) and D'_m(i);
(e) Weight the normalized cosine similarity S'_m(i) and Euclidean distance D'_m(i) with the window function coefficient win_coe:
S''_m(i) = (1 - win_coe) + win_coe · S'_m(i)
D''_m(i) = 1 - win_coe · D'_m(i)
where the empirical value of the window function coefficient win_coe is 0.176;
(f) Compute the graph similarity evaluation index between the preferred graph structure G_{m-1} of frame m-1 and each candidate graph structure G^c_m of frame m;
(2-3) Select the preferred graph structure: the candidate graph structure with the maximum graph similarity evaluation index is the preferred graph structure;
(2-4) Calculate the tracking result: according to k_i and i in each unit p^i_{k_i} of the preferred graph structure G_m, output the tracking result corresponding to target i and candidate target k_i;
(3) Judge whether all frames of the image sequence have been tracked; if so, end and output the tracking result; otherwise, load the next frame image and return to step (1).
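The following sketch illustrates steps (1-1) and (1-2) above. It is not the patent's implementation: the channel-wise cross-correlation stands in for the trained fully-convolutional Siamese network that actually produces the response map, and the function names, the use of NumPy/SciPy, and the 3×3 neighborhood used for peak detection are assumptions made for illustration only.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def response_map(template_feat, search_feat):
    """Channel-wise cross-correlation of template and search feature maps.
    template_feat: (C, h, w), search_feat: (C, H, W) with H >= h, W >= w.
    Stands in for the trained Siamese network's response computation."""
    C, h, w = template_feat.shape
    _, H, W = search_feat.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(template_feat * search_feat[:, y:y + h, x:x + w])
    return out

def candidate_peaks(response, thr_coe=0.7, neighborhood=3):
    """Step (1-2): return (row, col, value) for every local maximum whose
    response exceeds thr_coe times the global maximum of the response map."""
    local_max = response == maximum_filter(response, size=neighborhood)
    strong = response > thr_coe * response.max()
    rows, cols = np.nonzero(local_max & strong)
    return [(r, c, float(response[r, c])) for r, c in zip(rows, cols)]
```

In practice each target i keeps its own list of candidate peaks p^i_{k_i}, which feeds the graph construction of step (2-1).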
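The sketch below mirrors steps (2-1) through (2-3): enumerating one candidate per target, building adjacency matrices whose off-diagonal entries are Euclidean distances and whose diagonal holds the peak responses, deriving the per-unit feature vectors, comparing them with the previous frame's preferred graph through cosine similarity and Euclidean distance, applying the window coefficient win_coe = 0.176, and keeping the highest-scoring combination. Because the patent gives the normalization formulas and the final evaluation-index formula only as equation images, the normalizations and the way the weighted terms are combined into one score here are assumptions, as are all function names.

```python
import itertools
import numpy as np

WIN_COE = 0.176  # window function coefficient (empirical value from the patent)

def adjacency(points, responses):
    """Off-diagonal entries: Euclidean distances between unit centres;
    diagonal entries: response values of the selected peaks."""
    pts = np.asarray(points, dtype=float)
    am = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(am, responses)
    return am

def feature_vectors(am):
    """Feature vector of unit i: the i-th column of the adjacency matrix
    with its diagonal element removed."""
    return [np.delete(am[:, i], i) for i in range(am.shape[0])]

def graph_score(prev_am, cand_points, cand_resp, win_coe=WIN_COE):
    """Score one candidate graph structure against the previous preferred one.
    The normalisations and the final combination below are assumptions."""
    cand_am = adjacency(cand_points, cand_resp)
    score = 0.0
    for resp, v0, v1 in zip(cand_resp, feature_vectors(prev_am), feature_vectors(cand_am)):
        s = float(v0 @ v1) / (np.linalg.norm(v0) * np.linalg.norm(v1) + 1e-12)
        d = float(np.linalg.norm(v0 - v1))
        s_n = 0.5 * (s + 1.0)                    # assumed normalisation to [0, 1]
        d_n = d / (d + 1.0)                      # assumed normalisation to [0, 1]
        s_w = (1.0 - win_coe) + win_coe * s_n    # window weighting, step (e)
        d_w = 1.0 - win_coe * d_n                # window weighting, step (e)
        score += resp * s_w * d_w                # assumed combination into one index
    return score

def best_graph(prev_points, prev_resp, candidates):
    """candidates[i]: list of ((x, y), response) pairs for target i.
    Returns the selection of one candidate per target with the highest score."""
    prev_am = adjacency(prev_points, prev_resp)
    best, best_score = None, -np.inf
    for combo in itertools.product(*candidates):
        pts, resp = [c[0] for c in combo], [c[1] for c in combo]
        s = graph_score(prev_am, pts, resp)
        if s > best_score:
            best, best_score = combo, s
    return best, best_score
```

For example, with prev_points = [(10, 12), (40, 41)], prev_resp = [0.9, 0.8] and two candidate peaks per target, best_graph enumerates the 2 × 2 combinations and returns the one whose graph structure best matches the previous frame.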
In order to verify the effectiveness of the method of the present invention, the embodiment of the present invention designs experiments on the group target tracking data set with the settings summarized in Table 1.
TABLE 1 (experimental settings; the table appears as an image in the original publication)
The experiments were run according to the parameters of Table 1. The templates in Table 1 are the templates of the fully-convolutional Siamese network; Test1 denotes the original fully-convolutional Siamese network algorithm, which uses the target given in the first frame as the template, and +Graph denotes additionally applying the method proposed by the present invention. The procedure specifically comprises the following steps:
(1) Training of the fully-convolutional Siamese network. The fully-convolutional Siamese network is a deep-learning single target tracking algorithm proposed by Luca Bertinetto in 2016. The hardware environment used in the experiments of the embodiment of the invention comprises a CPU (Intel(R) Core(TM) i7-6850K, 6 cores, 12 threads, 3.60 GHz base frequency), two Nvidia GTX 1080Ti GPUs, and 64 GB of memory; the software environment comprises the Ubuntu 16.04 operating system and the TensorFlow deep learning framework. The ILSVRC-VID data set, a commonly used public data set in single target tracking, serves as the training data, and all 4417 videos in it are used to train and tune the network parameters. The loss function is the logistic loss; the iterative training method is stochastic gradient descent; the parameters are initialized with the Xavier method; training runs for 50 epochs with 50000 image pairs per epoch, a batch size of 32, an initial learning rate of 0.01 that decays exponentially to a minimum of 0.00001, and a maximum interval of 100 frames between the images of an input pair (an illustrative learning-rate schedule is sketched after step (10) below). Unless stated otherwise, the template update coefficient β is empirically set to 0.5.
(2) Inputting all target initial positions and initial frames;
(3) initializing a network and a tracking template;
(4) Calculate the response map f^i_m(z, s) for each target i. After the fully-convolutional Siamese network has been trained and initialized, a feature extraction function is obtained, and the response map of a tracking target is obtained through this function;
(5) Obtain the candidate targets, i.e. select all local extreme points of the response map f^i_m(z, s), where x and y are coordinate positions in the response map; judge whether the response value of each local extreme point is greater than the maximum response value of the current response map, max(f^i_m(z, s)), multiplied by the threshold coefficient Thr_coe (0 < Thr_coe < 1, empirical value 0.7), and retain all candidate targets p^i_{k_i} that satisfy this condition, where k_i is the index of the k-th candidate target of the i-th target. Judge whether the response maps f^i_m(z, s) of all targets have been processed; if so, go to step (6); otherwise set i = i + 1 and return to step (4) to process the next target.
(6) Construct the candidate graph structures: combine the candidate target points of all targets by selecting one candidate point per target, obtaining multiple candidate graph structures G^c_m = {p^1_{k_1}, p^2_{k_2}, …, p^I_{k_I}}, where I is the total number of input targets, K_i is the number of candidate targets of the i-th target, p^i_{k_i} is the unit contributed by target i, and c is the graph structure index;
(7) Calculate the graph similarity evaluation index of each candidate graph structure G^c_m:
(a) Compute the adjacency matrices AM(G_{m-1}) and AM(G_m) of the graph structure G_{m-1} and the candidate graph structure G_m. The candidate graph structure G_m is converted into the adjacency matrix AM(G_m), in which the element A_(i,j) is the Euclidean distance between the i-th unit p^i_{k_i} and the j-th unit p^j_{k_j} of the candidate graph structure when i ≠ j, and the element A_(i,i) equals the response value max(f(z, s)) of the local extreme point in the response map f^i_m(z, s) corresponding to the candidate graph structure G_m when i = j; the graph structure G_{m-1} is the preferred candidate graph structure of the previous frame;
(b) Compute the feature vector V^i_m of each unit p^i_{k_i} of each candidate graph structure G^c_m: remove the element A_(i,i) from the i-th column of the adjacency matrix AM(G_m) to obtain V^i_m;
(c) Compute the cosine similarity S_m(i) and the Euclidean distance D_m(i) between the feature vectors V^i_{m-1} and V^i_m of the same point in the previous and current frames:
S_m(i) = (V^i_{m-1} · V^i_m) / (||V^i_{m-1}|| · ||V^i_m||)
D_m(i) = ||V^i_{m-1} - V^i_m||;
(d) Normalize the two values to obtain S'_m(i) and D'_m(i);
(e) Weight the normalized cosine similarity S'_m(i) and Euclidean distance D'_m(i) with the window function coefficient win_coe:
S''_m(i) = (1 - win_coe) + win_coe · S'_m(i)
D''_m(i) = 1 - win_coe · D'_m(i)
where the empirical value of the window function coefficient win_coe is 0.176;
(f) Compute the graph similarity evaluation index between the preferred graph structure G_{m-1} of the previous frame and each candidate graph structure G^c_m of the current frame;
(8) Select the preferred graph structure: the candidate graph structure with the maximum graph similarity evaluation index is the preferred candidate graph structure;
(9) Calculate the tracking result: according to k_i and i in each unit p^i_{k_i} of the preferred graph structure G_m, output the tracking result corresponding to target i and candidate target k_i;
(10) Judge whether the whole image sequence has been tracked; if so, end the tracking and output the result; otherwise set m = m + 1 and return to step (4).
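As a side note to step (1) of the experiment, the exponentially decaying learning rate described there (initial 0.01, floor 0.00001, 50 epochs) can be reproduced with a schedule like the one below; the per-epoch decay factor is an assumption derived from those endpoints, not a value stated in the patent.

```python
# Illustrative schedule for the stated settings: initial LR 0.01, exponential
# decay over 50 epochs, minimum LR 1e-5. The per-epoch factor is an assumption.
initial_lr, final_lr, epochs = 0.01, 1e-5, 50
decay = (final_lr / initial_lr) ** (1.0 / (epochs - 1))    # about 0.8685 per epoch
schedule = [max(final_lr, initial_lr * decay ** e) for e in range(epochs)]
print(round(schedule[0], 5), round(schedule[-1], 7))       # 0.01 1e-05
```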
Statistics of the tracking results: FIG. 3 and FIG. 4 are the precision plot and the success rate plot of the performance statistics of the graph-similarity-constrained group target tracking algorithm, respectively. The abscissa of the precision plot is the center location error threshold in pixels and the ordinate is the distance precision at that threshold; the abscissa of the success rate plot is the intersection-over-union threshold and the ordinate is the overlap precision at that threshold (a sketch of how these curves are computed follows Table 2). Table 2 reports the statistical results of the comparative experiments. Comparing Test1 and Test1+Graph, combining the graph similarity constraint improves the precision by 0.0558 and the success rate by 0.0418 relative to using only the first-frame features as the template. Under both indices, the area under the precision curve and the area under the success rate curve, Test1+Graph outperforms Test1, so the graph similarity constraint method proposed by the invention is considered effective.
TABLE 2
Serial number | Experiment name | Area of precision curve | Area of success rate curve
1 | Test1 | 0.4859 | 0.4108
2 | Test1+Graph | 0.5417 | 0.4526
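For reference, the precision and success curves summarized in FIG. 3, FIG. 4 and Table 2 can be computed from per-frame center location errors and intersection-over-union values as sketched below. Treating the curve "area" as the mean of the curve over its thresholds (the usual OTB-style normalization) is an assumption, as are the function names and the 50-pixel precision range.

```python
import numpy as np

def precision_curve(center_errors, max_thr=50):
    """Distance precision: fraction of frames whose center location error
    is within each pixel threshold (the curve of FIG. 3)."""
    errs = np.asarray(center_errors, dtype=float)
    thresholds = np.arange(0, max_thr + 1)
    curve = np.array([(errs <= t).mean() for t in thresholds])
    return thresholds, curve

def success_curve(ious):
    """Overlap precision: fraction of frames whose IoU exceeds each threshold
    (the curve of FIG. 4); its mean approximates the curve-area score."""
    ious = np.asarray(ious, dtype=float)
    thresholds = np.linspace(0.0, 1.0, 101)
    curve = np.array([(ious >= t).mean() for t in thresholds])
    return thresholds, curve, float(curve.mean())
```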
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (7)

1. A group target tracking method based on graph similarity constraint is characterized by comprising the following steps:
S1, acquiring candidate targets corresponding to all tracking targets in a current frame;
S2, constructing a candidate graph structure:
respectively selecting one candidate target from each tracking target and combining them, to obtain K_1 × K_2 × … × K_I candidate graph structures G^c_m = {p^1_{k_1}, p^2_{k_2}, …, p^I_{k_I}}, where p^i_{k_i} is the k_i-th candidate target in the response map corresponding to the i-th target, I is the total number of tracking targets, K_i is the number of candidate targets corresponding to the i-th target, and c is the candidate graph structure index;
S3, calculating a graph similarity evaluation index of each candidate graph structure, wherein the graph similarity evaluation index represents the similarity between the candidate graph structure of the current frame and the preferred graph structure of the previous frame in the group target tracking process;
S4, selecting the candidate graph structure with the maximum graph similarity evaluation index as the preferred graph structure of the current frame, and outputting the tracking result corresponding to each target i and its candidate target k_i in the current frame;
S5, judging whether all frames of the image sequence have been tracked; if so, ending and outputting the tracking result; otherwise, loading the next frame image and returning to step S1.
2. The method for tracking the group target based on the graph similarity constraint according to claim 1, wherein the step S1 includes:
S1.1, calculating the response map f^i_m(z, s) corresponding to target i of the current frame, where m is the current frame index, i is the target index, z is the target template image obtained after processing the previous frame, and s is the search image;
S1.2, obtaining the candidate targets corresponding to target i: selecting all local extreme points of the response map f^i_m(z, s), and keeping as candidate targets p^i_{k_i} those local extreme points whose response value is greater than the maximum response value of the current response map, max(f^i_m(z, s)), multiplied by a threshold coefficient Thr_coe, where 0 < Thr_coe < 1;
S1.3, judging whether all targets of the current frame have been processed by steps S1.1 to S1.2; if so, proceeding to step S2; otherwise setting i = i + 1 and returning to step S1.1 to process the next target.
3. The method for tracking the group target based on the graph similarity constraint according to claim 2, wherein the threshold coefficient Thr_coe is taken as 0.7.
4. The method for tracking the group target based on the graph similarity constraint according to any one of claims 1 to 3, wherein the step S3 specifically includes:
S3.1, calculating the adjacency matrices AM(G_{m-1}) and AM(G_m) of the preferred graph structure G_{m-1} of the previous frame and the candidate graph structure G_m of the current frame: when i ≠ j, the element A_(i,j) of the adjacency matrix is the Euclidean distance between the i-th unit p^i_{k_i} and the j-th unit p^j_{k_j} of the candidate graph structure; when i = j, the element A_(i,i) equals the response value max(f(z, s)) of the local extreme point in the response map f^i_m(z, s) corresponding to the candidate graph structure G_m;
S3.2, calculating the feature vector V^i_m of each unit p^i_{k_i} of each candidate graph structure G^c_m: the element A_(i,i) of the i-th column of the adjacency matrix AM(G_m) is excluded to obtain V^i_m;
S3.3, calculating the cosine similarity S_m(i) and the Euclidean distance D_m(i) between the feature vectors V^i_{m-1} and V^i_m of the same unit in the previous and current frames:
S_m(i) = (V^i_{m-1} · V^i_m) / (||V^i_{m-1}|| · ||V^i_m||)
D_m(i) = ||V^i_{m-1} - V^i_m||;
S3.4, normalizing the cosine similarity S_m(i) and the Euclidean distance D_m(i) to obtain S'_m(i) and D'_m(i);
S3.5, weighting the normalized cosine similarity S'_m(i) and Euclidean distance D'_m(i) with a window function coefficient win_coe:
S''_m(i) = (1 - win_coe) + win_coe · S'_m(i)
D''_m(i) = 1 - win_coe · D'_m(i);
S3.6, calculating the graph similarity evaluation index of the preferred graph structure G_{m-1} of frame m-1 and each candidate graph structure G^c_m of frame m, wherein the index uses, for each target i, the response value max(f(z, s)) of the local extreme point of the i-th target in the response map corresponding to the candidate graph structure G_m;
S3.7, selecting the candidate graph structure with the maximum graph similarity evaluation index as the preferred graph structure of the current frame.
5. The method for tracking the group target based on the graph similarity constraint according to claim 1 or 4, wherein the step S4 specifically comprises: outputting, according to k_i and i in each unit p^i_{k_i} of the preferred graph structure G_m, the tracking result corresponding to target i and candidate target k_i.
6. The method for tracking the group target based on the graph similarity constraint according to claim 4, wherein the window function coefficient win_coe takes the value 0.176.
7. A group target tracking system based on graph similarity constraints, comprising: a computer-readable storage medium and a processor;
the computer-readable storage medium is used for storing executable instructions;
the processor is configured to read executable instructions stored in the computer-readable storage medium and execute the graph similarity constraint-based group target tracking method according to any one of claims 1 to 6.
CN202011084558.4A 2020-10-12 2020-10-12 Group target tracking method and system based on graph similarity constraint Active CN112215869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011084558.4A CN112215869B (en) 2020-10-12 2020-10-12 Group target tracking method and system based on graph similarity constraint

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011084558.4A CN112215869B (en) 2020-10-12 2020-10-12 Group target tracking method and system based on graph similarity constraint

Publications (2)

Publication Number Publication Date
CN112215869A true CN112215869A (en) 2021-01-12
CN112215869B CN112215869B (en) 2022-08-02

Family

ID=74053266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011084558.4A Active CN112215869B (en) 2020-10-12 2020-10-12 Group target tracking method and system based on graph similarity constraint

Country Status (1)

Country Link
CN (1) CN112215869B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102663773A (en) * 2012-03-26 2012-09-12 上海交通大学 Dual-core type adaptive fusion tracking method of video object
CN102930539A (en) * 2012-10-25 2013-02-13 江苏物联网研究发展中心 Target tracking method based on dynamic graph matching
AU2013205548A1 (en) * 2013-04-30 2014-11-13 Canon Kabushiki Kaisha Method, system and apparatus for tracking objects of a scene
CN107886048A (en) * 2017-10-13 2018-04-06 西安天和防务技术股份有限公司 Method for tracking target and system, storage medium and electric terminal
CN109859238A (en) * 2019-03-14 2019-06-07 郑州大学 One kind being based on the optimal associated online multi-object tracking method of multiple features
CN110163890A (en) * 2019-04-24 2019-08-23 北京航空航天大学 A kind of multi-object tracking method towards space base monitoring
CN110675429A (en) * 2019-09-24 2020-01-10 湖南人文科技学院 Long-range and short-range complementary target tracking method based on twin network and related filter
CN111161315A (en) * 2019-12-18 2020-05-15 北京大学 Multi-target tracking method and system based on graph neural network
CN111080675A (en) * 2019-12-20 2020-04-28 电子科技大学 Target tracking method based on space-time constraint correlation filtering

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Chengyuan Liu et al.: "Correlation filter with motion detection for robust tracking of shape-deformed targets", IEEE Access *
Hsiao-Ping Tsai et al.: "Mining group movement patterns for tracking moving objects efficiently", IEEE Transactions on Knowledge and Data Engineering *
Jiarui Xu et al.: "Spatial-temporal relation networks for multi-object tracking", The Computer Vision Foundation *
张君昌 et al.: "Video tracking fusing superpixels and dynamic graph matching", Journal of Northwestern Polytechnical University *
王鹏: "Accurate recognition algorithm for the number of targets in multi-target tracking", Optics & Optoelectronic Technology *

Also Published As

Publication number Publication date
CN112215869B (en) 2022-08-02

Similar Documents

Publication Publication Date Title
Kang et al. Operation-aware soft channel pruning using differentiable masks
Yu et al. Autoslim: Towards one-shot architecture search for channel numbers
Ozguven et al. Automatic detection and classification of leaf spot disease in sugar beet using deep learning algorithms
Cheng et al. Pest identification via deep residual learning in complex background
Mou et al. Vehicle instance segmentation from aerial image and video using a multitask learning residual fully convolutional network
CN109859209B (en) Remote sensing image segmentation method and device, storage medium and server
US20210383149A1 (en) Method for identifying individuals of oplegnathus punctatus based on convolutional neural network
CN115359074B (en) Image segmentation and training method and device based on hyper-voxel clustering and prototype optimization
Xu et al. UP-CNN: Un-pooling augmented convolutional neural network
CN112949383A (en) Waveform agility radar radiation source identification method based on Hydeep-Att network
An et al. Object recognition algorithm based on optimized nonlinear activation function-global convolutional neural network
CN110738166B (en) Fishing administration monitoring system infrared target identification method based on PCNN and PCANet and storage medium
CN113723572B (en) Ship target identification method, computer system, program product and storage medium
Chan et al. VGGreNet: A light-weight VGGNet with reused convolutional set
Wang et al. Lightweight deep neural networks for ship target detection in SAR imagery
CN111985375B (en) Visual target tracking self-adaptive template fusion method
CN112215869B (en) Group target tracking method and system based on graph similarity constraint
Mohammed et al. Convolutional neural network for color images classification
CN113158806B (en) OTD (optical time Domain _ Logistic) -based SAR (synthetic Aperture Radar) data ocean target detection method
Sari et al. The Effect of Batch Size and Epoch on Performance of ShuffleNet-CNN Architecture for Vegetation Density Classification
Bona et al. Semantic segmentation and segmentation refinement using machine learning case study: Water turbidity segmentation
Mirnateghi et al. Deep bayesian image set classification: a defence approach against adversarial attacks
Pratik et al. A multi facet deep neural network model for various plant disease detection
CN113283390B (en) SAR image small sample target identification method based on gating multi-scale matching network
Zhang et al. Learning Optimal Data Augmentation Policies via Bayesian Optimization for Image Classification Tasks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant