CN111681266A - Ship tracking method, system, equipment and storage medium - Google Patents

Ship tracking method, system, equipment and storage medium

Info

Publication number
CN111681266A
Authority
CN
China
Prior art keywords
ship
image
tracked
frame
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010512688.7A
Other languages
Chinese (zh)
Inventor
杨星海
王凤娇
袁健峰
张玉璘
王景景
徐凌伟
施威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao University of Science and Technology
Original Assignee
Qingdao University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao University of Science and Technology
Priority to CN202010512688.7A
Publication of CN111681266A
Legal status: Withdrawn

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248 - Analysis of motion using feature-based methods involving reference images or patches
    • G06T7/277 - Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10016 - Video; Image sequence
    • G06T2207/20024 - Filtering details

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure provides a ship tracking method, system, device and storage medium, comprising: acquiring a video to be analyzed, and performing ship identification on each frame of image in the video to be analyzed; continuously tracking the ship based on a kernel correlation filter; estimating the scale of the tracked ship, and determining the ship scale of each frame of image; and judging whether the tracked ship is occluded, and re-detecting the tracked target when it is occluded.

Description

Ship tracking method, system, equipment and storage medium
Technical Field
The present disclosure relates to the field of target tracking technologies, and in particular, to a ship tracking method, system, device, and storage medium.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
Computer vision is gradually permeating people's daily lives, and target tracking, an important research direction of computer vision, is developing rapidly. Target tracking not only has important research value but also broad application prospects: it plays an important role in robot sensing systems, intelligent traffic monitoring, radar early warning, medical diagnosis and, in particular, marine traffic. With the development of the marine transportation industry, intelligent navigation has become a major application of computer vision; by tracking ships, the complex situation of a sea area can be monitored and analyzed, greatly reducing the probability of collision accidents.
In the course of implementing the present disclosure, the inventors found that the following technical problems exist in the prior art:
Traditional target tracking methods such as the frame difference method, the background difference method and the optical flow method are very sensitive to object motion and easily disturbed by noise. The optical flow method in particular suffers at sea, where wind and waves visibly shake the ship, so the optical flow is easily corrupted. The Gaussian mixture model is computationally complex and unsuited to ship tracking with rapidly changing motion. In recent years, the popular Mean Shift tracking method based on target color modeling and correlation filtering algorithms based on the MOSSE model have optimized target feature extraction, but they cannot keep tracking accurately when the ship target is occluded. The TLD tracking algorithm, which combines a detection module and a learning module, can cope with occlusion, but its many constraint conditions make its real-time performance poor. In addition, these target tracking methods cannot dynamically adjust the scale of the tracked target, which reduces tracking accuracy.
Aiming at the technical problems faced by ship tracking (the video shot from a ship shakes, the tracked target is lost when it is occluded, a fixed tracking frame makes the extracted training set unreasonable, and the target scale cannot be estimated during tracking), a ship tracking method that solves these problems is needed to improve the accuracy of ship target tracking.
Disclosure of Invention
In order to address the deficiencies of the prior art, the present disclosure provides methods, systems, devices and storage media for vessel tracking;
in a first aspect, the present disclosure provides a vessel tracking method;
a vessel tracking method, comprising:
acquiring a video to be analyzed, and carrying out ship identification on each frame of image in the video to be analyzed;
continuously tracking the ship based on a kernel correlation filter;
estimating the scale of the tracked ship, and determining the ship scale of each frame of image;
and judging whether the tracked ship is occluded, and re-detecting the tracked target when it is occluded.
In a second aspect, the present disclosure provides a vessel tracking system;
a vessel tracking system comprising:
an identification module configured to: acquiring a video to be analyzed, and carrying out ship identification on each frame of image in the video to be analyzed;
a tracking module configured to: continuously track the ship based on a kernel correlation filter;
a scale estimation module configured to: estimating the scale of the tracked ship, and determining the ship scale of each frame of image;
a re-detection module configured to: judge whether the tracked ship is occluded, and re-detect the tracked target when it is occluded.
In a third aspect, the present disclosure also provides an electronic device, including: one or more processors, one or more memories, and one or more computer programs; wherein a processor is connected to the memory, the one or more computer programs being stored in the memory, and the processor executes the one or more computer programs stored in the memory when the electronic device is running, so as to cause the electronic device to perform the method of the first aspect.
In a fourth aspect, the present disclosure also provides a computer readable storage medium for storing computer instructions which, when executed by a processor, perform the method of the first aspect.
In a fifth aspect, the present disclosure also provides a computer program (product) comprising a computer program for implementing the method of any one of the preceding first aspects when run on one or more processors.
Compared with the prior art, the beneficial effects of the present disclosure are:
(1) the method eliminates the shake of the marine shooting video caused by wind waves through an electronic image stabilization technology to obtain a stable video sequence, and realizes the subsequent accurate ship target tracking;
(2) the method comprises the steps of extracting FHog characteristics of a ship target area as a training set, introducing a Gaussian kernel function to train a kernel correlation filter, and calculating the kernel correlation function to obtain an image sample with the largest response as the position of a ship target so as to realize the tracking of a ship;
(3) according to the method, the target tracking frame is subjected to scale estimation by introducing a DSST scale filter, so that unreasonable extraction of sample background and target information caused by the fact that the scale changes but the size of the tracking frame does not change in the ship tracking process is avoided;
(4) reliability detection is carried out using the peak-to-sidelobe ratio (PSR) to judge whether the tracked ship is occluded, avoiding loss of the ship target;
(5) when the tracked ship is occluded, Kalman filtering is used to quickly estimate the position of the tracked target so that tracking continues, improving the robustness and accuracy of ship tracking.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and are not to limit the disclosure.
Fig. 1 is a flowchart of a ship tracking method according to an embodiment of the present disclosure;
FIG. 2 is a block diagram illustrating a method for motion compensation of a fixed frame reference according to an embodiment of the present disclosure;
FIG. 3 is a flowchart illustrating the calculation of FHog feature according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a circulant matrix according to an embodiment of the disclosure.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
It is noted that the terminology used herein is for describing particular embodiments only and is not intended to limit example embodiments according to the present disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should further be understood that the terms "comprises" and "comprising", and any variation thereof, cover a non-exclusive inclusion: a process, method, system, article or apparatus that comprises a list of steps or elements is not necessarily limited to those expressly listed, but may include other steps or elements not expressly listed or inherent to it.
The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
Example one
The embodiment provides a ship tracking method;
as shown in fig. 1, a vessel tracking method includes:
s101: acquiring a video to be analyzed, and carrying out ship identification on each frame of image in the video to be analyzed;
s102: continuously tracking the ship based on a kernel correlation filter;
s103: estimating the scale of the tracked ship, and determining the ship scale of each frame of image;
s104: and judging whether the tracked ship is occluded, and re-detecting the tracked target when it is occluded.
As one or more embodiments, after the step of obtaining the video to be analyzed, before the step of performing ship identification on each frame of image in the video to be analyzed, the method further includes:
s1011: eliminating jitter of a video to be analyzed;
s1012: extracting the image characteristics of each frame of image in the video after the jitter is eliminated;
s1013: and identifying the ship in the image based on the image characteristics.
It should be understood that the video is first stabilized to remove the jitter caused by wind and waves, producing a stabilized video sequence. Features of the area around the ship tracking target are then extracted as training samples to train the kernel correlation filter, and the position of the target ship is computed as the location of the maximum response of the kernel correlation filter. During target tracking, the scale of the tracking frame is adjusted dynamically and occlusion of the tracked ship is detected; handling occlusion in time enables accurate tracking of the ship.
As one or more embodiments, in S1011, the video to be analyzed is de-jittered; the method comprises the following specific steps:
performing motion estimation on the video to be analyzed with a block-matching optical flow estimation algorithm: the optical flow of each image block is computed, and the block optical flows are aggregated to establish an optical-flow-field motion model from which the video jitter information is obtained;
and performing spatial transformation of the jitter information in a fixed-reference-frame manner to obtain a compensated image sequence.
Exemplarily, in S1011, the video to be analyzed is de-jittered; the method comprises the following specific steps:
a motion model is established using a block-matching optical flow estimation algorithm for motion estimation, and motion compensation is performed with the fixed-frame compensation method to obtain a stabilized, jitter-free video sequence. As shown in fig. 2, specifically:
s10111: representing the motion of each pixel point by the motion of the image block, and searching the best matching block according to a matching rule formula so as to obtain the motion vector between image frames.
First, the current frame I is divided evenly into blocks, and the position of any image block M in the adjacent frame J is found according to the matching rule, which computes the optimal translation of M from the current frame I to the adjacent frame J:

(dx^*, dy^*) = \arg\min_{(dx, dy)} \sum_{(x, y) \in M} \psi\big(I(x, y), J(x + dx, y + dy)\big) \qquad (1)

where ψ is a distance metric function comparing the pixel at (x, y) of the current frame I with the pixel at (x + dx, y + dy) of the reference frame J;
after the matching criterion is determined, determining a search path, searching all image blocks in the current frame in a reference frame, and finding out the optimal translation amount;
s10112: the motion estimation in S10111 yields the jitter information, and motion compensation is performed using the fixed-reference-frame method to obtain a compensated image sequence; the jitter information is the difference between the optimal translation amount and the actual translation amount;
any frame of the video is selected as a reference frame by adopting a fixed reference frame method, and all subsequent frames are subjected to motion compensation relative to the frame, so that the compensated image and the previous image are kept on a stable motion track.
Because the shipborne camera is fixed, the tracking target does not need to be rotated and tracked, no scanning motion exists, the operation is simple, no accumulated error exists, and the method is suitable for the application scene of fixed-point shooting of the shipborne camera.
Exemplarily, in S10111, the determining of the search path specifically includes:
a hexagon search algorithm is adopted with two hexagon templates of different sizes. The large hexagon template is first used for coarse positioning in the reference frame, computing the matching cost of each search point with the search-window origin (0, 0) as the template center;
if the minimum cost lies at the center point, a small hexagon template is constructed around that center, the cost of each of its points is computed, and the minimum gives the best matching point;
if the minimum cost is not at the center point, the large-hexagon search is repeated with the minimum-cost point as the new center until the minimum falls at the center, after which the small hexagon refines the best matching point.
Considering that the motion of images between adjacent frames of a video sequence is correlated, this correlation is used to predict the initial search point: the motion vector of the previous frame relative to the reference frame is taken as the initial search point for all target blocks of the current frame, enabling fast search.
The computed optimal translation of each block corresponds to that block's optical flow, and the optical flows of the blocks are aggregated into the optical-flow-field motion model.
The optical-flow-field motion model combines the optical flow values, i.e., the optimal translations, of all the blocks into which a frame is divided; once the optimal translation of every block of the frame is obtained, they are combined into the model, which can then be used in motion estimation for other frames of the video. The model takes inter-frame pixel displacements as input and outputs the optimal translation of the image.
The side length of the large hexagon is 2, and the side length of the small hexagon is 1.
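The search described above can be sketched as follows. This is a minimal illustration, assuming a sum-of-absolute-differences (SAD) distance metric ψ and the hexagon templates with side lengths 2 and 1; the function and variable names are illustrative, not from the original:

```python
import numpy as np

# Hexagon templates: large side length 2, small side length 1 (as in the text).
LARGE_HEX = [(0, 0), (2, 0), (-2, 0), (1, 2), (-1, 2), (1, -2), (-1, -2)]
SMALL_HEX = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

def sad(block, frame_j, x0, y0, dx, dy):
    """psi: sum of absolute differences between the block of I and frame J shifted by (dx, dy)."""
    h, w = block.shape
    return np.abs(block - frame_j[y0 + dy:y0 + dy + h, x0 + dx:x0 + dx + w]).sum()

def hexagon_search(frame_i, frame_j, x0, y0, size, max_iter=32):
    """Estimate the optimal translation of the block at (x0, y0) from frame I to frame J."""
    block = frame_i[y0:y0 + size, x0:x0 + size]
    cx, cy = 0, 0  # search starts at the window center (0, 0)
    for _ in range(max_iter):  # coarse positioning with the large hexagon
        costs = {(cx + dx, cy + dy): sad(block, frame_j, x0, y0, cx + dx, cy + dy)
                 for dx, dy in LARGE_HEX}
        best = min(costs, key=costs.get)
        if best == (cx, cy):   # minimum at the center: switch to the fine search
            break
        cx, cy = best          # otherwise re-center the large hexagon and repeat
    costs = {(cx + dx, cy + dy): sad(block, frame_j, x0, y0, cx + dx, cy + dy)
             for dx, dy in SMALL_HEX}
    return min(costs, key=costs.get)

# Synthetic check: a Gaussian blob translated by (dx, dy) = (3, 2) between frames.
yy, xx = np.mgrid[0:64, 0:64]
frame_i = np.exp(-((xx - 32.0) ** 2 + (yy - 32.0) ** 2) / 50.0)
frame_j = np.roll(frame_i, shift=(2, 3), axis=(0, 1))  # rows down 2, columns right 3
mv = hexagon_search(frame_i, frame_j, x0=12, y0=12, size=40)
print(mv)  # (3, 2)
```

On this synthetic pair the search path is (0, 0) → (1, 2) → (3, 2): two large-hexagon steps reach the true translation, and the small hexagon confirms it.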
As one or more embodiments, in S1012, image features of each frame of image in the de-jittered video are extracted; the method comprises the following steps: extracting the FHog features of each frame of image in the de-jittered video.
Illustratively, in S1012, the image characteristics of each frame of image in the video after the removal of the jitter are extracted; the method comprises the following steps:
as shown in fig. 3, FHog features of the target region, which describe the target characteristics effectively, are extracted for training; they are faster to compute than conventional Hog features. The specific method comprises the following steps:
the block structure of the original Hog feature is dropped and only cells are kept; during normalization, the region formed by the current cell and its 4 surrounding cells is normalized directly; and during gradient calculation, a combination of signed (0-360 degrees) and unsigned (0-180 degrees) gradient feature information is adopted.
Each cell then has a 36-dimensional feature, which is viewed as a 4 x 9 matrix. Summing each row and each column of this matrix yields 4 + 9 = 13 numbers as a vector, and this 13-dimensional feature vector retains essentially all the information of the original 36-dimensional feature vector, so the original descriptive power is preserved.
In addition, to describe the signed gradient directions, an 18-dimensional signed feature vector is appended, for 31 dimensions in total: the final FHog feature.
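The 36-to-31 dimensional bookkeeping described above can be sketched numerically. This is only an illustration of the compression step (a full FHog implementation also computes gradients, orientation binning and normalization), and the helper name is hypothetical:

```python
import numpy as np

def fhog_cell_descriptor(unsigned_36, signed_18):
    """Compress one cell's feature as described above: the 4 x 9 matrix of
    normalized unsigned-gradient energies (4 normalizations x 9 orientation
    bins) is reduced to its 4 row sums plus 9 column sums (13 dims), and the
    18 signed-orientation bins are appended, giving 31 dims in total."""
    m = np.asarray(unsigned_36).reshape(4, 9)
    reduced = np.concatenate([m.sum(axis=1), m.sum(axis=0)])  # 4 + 9 = 13
    return np.concatenate([reduced, np.asarray(signed_18)])   # 13 + 18 = 31

rng = np.random.default_rng(0)
feat = fhog_cell_descriptor(rng.random(36), rng.random(18))
print(feat.shape)  # (31,)
```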
As one or more embodiments, in S1013, a ship in the image is identified based on the image features; the method comprises the following steps: and inputting the image characteristics into a trained kernel correlation filter for classification, and identifying the ship in the image.
Further, the trained kernel correlation filter is a kernel correlation filter introducing a gaussian kernel function.
Further, the training step of the trained kernel correlation filter includes:
constructing a kernel correlation filter; constructing a training set, whose training samples are generated by cyclic shifts;
and inputting the training set into the kernel correlation filter, training the kernel correlation filter, and stopping training when the loss function of the kernel correlation filter reaches the minimum value to obtain the trained kernel correlation filter.
Illustratively, the training step of the trained kernel correlation filter includes:
s10131: during tracking, a P-N training method is adopted: the tracked target ship is the positive sample and the background provides the negative samples. Because fewer positive than negative samples are acquired, the positive samples are generated by a cyclic shift method; the specific method comprises the following steps:
an n x 1 vector from the region of interest of the target ship image is taken as the base sample, written x = [x_1, x_2, …, x_n]^T. Extended samples are modeled as one-dimensional translations of this vector under a cyclic shift operator, the permutation matrix

P = \begin{bmatrix} 0 & 0 & \cdots & 0 & 1 \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{bmatrix} \qquad (2)

The product Px = [x_n, x_1, x_2, …, x_{n-1}]^T moves x by one element, creating a small translation. Owing to the cyclic property, the same signal x recurs every n shifts, so the set {P^u x | u = 0, 1, …, n-1} yields all the training sample data of the first frame:

X = C(x) = [x, Px, P^2 x, …, P^{n-1} x]^T \qquad (3)
as shown in fig. 4, by its properties the circulant matrix can be diagonalized by the Discrete Fourier Transform (DFT) matrix:

X = F \,\mathrm{diag}(\hat{x})\, F^{H} \qquad (4)

where F is the discrete Fourier matrix, F^H its conjugate transpose, and the hat symbol denotes the Discrete Fourier Transform (DFT) of a vector.
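The diagonalization property of formula (4) can be checked numerically. The sketch below assumes numpy's DFT sign convention, under which the matrix of cyclic shifts of x is exactly F diag(x̂) F^{-1}:

```python
import numpy as np

n = 8
rng = np.random.default_rng(0)
x = rng.standard_normal(n)

# Circulant data matrix: row u is the base sample shifted u times (P^u x).
X = np.stack([np.roll(x, u) for u in range(n)])

# DFT matrix F (numpy sign convention) and the DFT of the base sample.
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * j * k / n)
x_hat = np.fft.fft(x)

# X = F diag(x_hat) F^{-1}, with F^{-1} = F^H / n for this unnormalized F.
X_rebuilt = (F @ np.diag(x_hat) @ F.conj().T / n).real
print(np.allclose(X, X_rebuilt))  # True
```

This is why all the regressions below can be solved element-wise in the Fourier domain instead of inverting an n x n matrix.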
S10132: the training of the target kernel correlation filter is to obtain the optimal solution of the ridge regression under the condition of the minimum loss function, and the linear ridge regression target function is
Figure BDA0002528961530000101
Where λ is the regularization coefficient that controls the overfitting, w is the decision variable weight coefficient, xiI data of X, yiFor the i-th data of the desired output y, f is a linear combination of the base samples, i.e.
f(xi)=wTxi; (6)
A closed-form solution of the ridge regression in the fourier domain is obtained:
w=(XHX+λI)-1XHy; (7)
where I is the identity matrix and y is the desired output of the filter.
The property of formula (4) can be used to obtain:
Figure BDA0002528961530000102
s10133: transferring the solution to the Fourier domain greatly reduces the computation, but this only applies to the linear classification case. A kernel function is therefore introduced so that the method also applies to nonlinear regression. With a nonlinear mapping function φ(x), the new kernel correlation filter is

f(x_i) = w^{T} \varphi(x_i) \qquad (9)

where w lies in the space spanned by [φ(x_1), φ(x_2), φ(x_3), …, φ(x_n)] and can be expressed as

w = \sum_{i} \alpha_i \,\varphi(x_i) \qquad (10)

Substituting equations (9) and (10) into equation (5) yields the solution of the ridge regression in kernel space:

\alpha = (K + \lambda I)^{-1} y \qquad (11)

where α is the vector of the coefficients α_i and K is the kernel correlation matrix between samples:

K = \varphi(X)\,\varphi(X)^{T} \qquad (12)

Since K is a circulant matrix, the solution of the ridge regression of equation (11) in the frequency domain is

\hat{\alpha} = \frac{\hat{y}}{\hat{k}^{xx} + \lambda} \qquad (13)

where k^{xx} is the first row of the kernel correlation matrix K.
Exemplarily, S102: continuously tracking the ship based on the kernel correlation filter; the method comprises the following specific steps:
after the kernel correlation filter is trained, an original test sample is extracted from each candidate region of the next frame of image, and the cyclic shifts of the original test sample are input to the kernel correlation filter to calculate the responses of the different candidate regions. Combining equations (9) to (13), the response of the test sample is obtained as

\hat{f}(z) = \hat{k}^{xz} \odot \hat{\alpha} \qquad (14)

where \hat{k}^{xz} is the DFT of the kernel correlation between z and x, z is the test sample obtained in the new frame, and x is the target model learned in the previous frame; the spatial response is recovered as f(z) = \mathcal{F}^{-1}\big(\hat{k}^{xz} \odot \hat{\alpha}\big).
The position with the largest response is taken as the position of the target ship in the next frame.
Meanwhile, the weight parameters of the kernel correlation filter are updated: the new position area is used to retrain and update the filter, yielding a new kernel correlation filter for predicting the next frame.
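The training solution of equation (13) and the detection response above can be sketched on a one-dimensional toy signal. This is a minimal single-channel illustration with a Gaussian kernel; the parameter values and function names are illustrative, and the argmax of the response recovers the shift of the test sample:

```python
import numpy as np

def gaussian_kernel_correlation(x1, x2, sigma):
    """Gaussian kernel correlation k^{x1 x2} of two 1-D signals, computed
    via the Fourier domain (the 1-D, single-channel specialization)."""
    c = np.fft.ifft(np.conj(np.fft.fft(x1)) * np.fft.fft(x2)).real
    d = (x1 ** 2).sum() + (x2 ** 2).sum() - 2.0 * c
    return np.exp(-np.maximum(d, 0.0) / (sigma ** 2 * len(x1)))

def train(x, y, sigma=0.5, lam=1e-4):
    """alpha_hat = y_hat / (k_hat^{xx} + lambda), eq. (13)."""
    k = gaussian_kernel_correlation(x, x, sigma)
    return np.fft.fft(y) / (np.fft.fft(k) + lam)

def detect(alpha_hat, x, z, sigma=0.5):
    """Response recovered as the inverse DFT of k_hat^{xz} ⊙ alpha_hat."""
    k = gaussian_kernel_correlation(x, z, sigma)
    return np.fft.ifft(alpha_hat * np.fft.fft(k)).real

n = 64
rng = np.random.default_rng(1)
x = np.convolve(rng.standard_normal(n), np.ones(5) / 5, mode="same")  # smooth base sample
y = np.roll(np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 4.0), n // 2)  # Gaussian label, peak at 0

alpha_hat = train(x, y)
z = np.roll(x, 5)            # "new frame": the target has moved by 5 samples
resp = detect(alpha_hat, x, z)
print(int(np.argmax(resp)))  # 5 -> the displacement of the target
```

Because z is an exact cyclic shift of x, the response is a shifted copy of the training response, so its peak lands exactly at the displacement.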
As one or more embodiments, in S103, the scale of the tracked ship is estimated and the ship scale of each frame of image is determined; the scale of the tracked ship is estimated using the scale filter of the Discriminative Scale Space Tracker (DSST).
Illustratively, estimating the scale of the tracked ship and determining the ship scale of each frame of image comprises the following steps:
the size of the target tracking window in the traditional KCF algorithm is fixed, but during ship tracking the target scale may change as the tracked target or the camera moves, so the tracking window deviates from reality. Therefore, after the target position is determined, the scale filter of the Discriminative Scale Space Tracker (DSST) is used to estimate the scale of the tracked target. The DSST scale filter is a discriminative correlation filter whose training samples are extracted by building a target scale pyramid; the specific method comprises the following steps:
s1031: let the training samples be x_1, …, x_t, obtained by cyclically sampling the image of the target frame of each frame;
the corresponding scale filter response of each sample is g_1, …, g_t, where the desired output g_j is a Gaussian function with its peak at x_j. The scale correlation filter s is solved by ridge regression:

\varepsilon = \sum_{j=1}^{t} \lVert s \star x_j - g_j \rVert^2 + \lambda \lVert s \rVert^2 \qquad (15)

Converting equation (15) to the frequency domain yields

\varepsilon = \sum_{j=1}^{t} \lVert S_t \odot X_j - G_j \rVert^2 + \lambda \lVert S_t \rVert^2 \qquad (16)

where S_t, X_j and G_j are all of size M x N and a bar denotes the complex conjugate. Solving equation (16) yields

S_t = \frac{\sum_{j} \bar{G}_j \odot X_j}{\sum_{j} \bar{X}_j \odot X_j + \lambda} \qquad (17)

Denoting the numerator and denominator of the filter S_t by A_j and B_j respectively, A_j and B_j are updated as

A_j = \theta\, \bar{G}_j \odot X_j + (1 - \theta)\, A_{j-1}, \qquad B_j = \theta\, \bar{X}_j \odot X_j + (1 - \theta)\, B_{j-1} \qquad (18)

where θ is the learning rate, A_j and A_{j-1} are the numerators of the current and previous frame, and B_j and B_{j-1} the corresponding denominators. A scale test sample z is extracted with a method similar to that used to train the position filter, and the maximum scale correlation filter response for the input sample Z of the new frame is obtained from

y = \mathcal{F}^{-1}\left\{ \frac{\bar{A}_j \odot Z}{B_j + \lambda} \right\} \qquad (19)
s1032: after the scale filter is trained, scale selection is performed, specifically: in the current frame, the KCF kernel correlation filter first determines the candidate position of the target; then the scale correlation filter evaluates S candidate targets of different scales centered on the current position, and the scale with the maximum response, computed by formula (19), is taken as the current scale and used for the update.
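A one-dimensional, single-feature sketch of the DSST scale filter of equations (15) to (19). The sample values, learning rate and regularizer are illustrative; the peak offset of the response recovers the change in scale level:

```python
import numpy as np

S, lam, theta = 33, 1e-4, 0.025
rng = np.random.default_rng(2)
x = rng.standard_normal(S)                       # scale feature sample (one feature channel)
g = np.exp(-0.5 * (np.arange(S) - S // 2) ** 2)  # desired output, peak at scale index 16

X, G = np.fft.fft(x), np.fft.fft(g)
A = np.conj(G) * X        # filter numerator
B = np.conj(X) * X        # filter denominator

# One update step with a new frame identical to the first (learning rate theta):
A = theta * np.conj(G) * X + (1 - theta) * A
B = theta * np.conj(X) * X + (1 - theta) * B

z = np.roll(x, 3)         # new frame: the true scale moved 3 levels up
resp = np.fft.ifft(np.conj(A) * np.fft.fft(z) / (B + lam)).real
print(int(np.argmax(resp)) - S // 2)   # 3 -> scale offset relative to the center level
```

Because z is an exact shift of the training sample along the scale axis, the response is a shifted copy of g, and the peak offset gives the scale change directly.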
In S1031, the construction of the scale filter training samples is specifically as follows:
to construct the training samples, image blocks of different sizes centered on the target are selected for feature extraction. Let the target size in the current frame be P x R, with P and R the width and height respectively, and let s be the size of the scale filter. For each scale level

n \in \left\{ \left\lfloor -\tfrac{s-1}{2} \right\rfloor, \ldots, \left\lfloor \tfrac{s-1}{2} \right\rfloor \right\} \qquad (20)

an image block I_n of size a^n P x a^n R centered on the target is extracted, where a is the scale factor between feature layers.
In S1032, the selection of the kernel correlation filter training set is specifically as follows:
in the traditional kernel correlation filtering algorithm, the size of the target tracking frame is fixed, and the training set is an area 1.5 times the size of the target extracted around it. When extracting the features of the area around the target, the updated scale of the DSST scale filter is passed to the kernel correlation filtering algorithm of the next frame, the target is tracked, and the training-set size of the current frame is extracted dynamically as

1.5\, a^{n} P \times 1.5\, a^{n} R \qquad (21)

where n is the scale level, P and R are the width and height of the target respectively, and a is the scale factor between feature layers.
As one or more embodiments, in S104, it is judged whether the tracked ship is occluded, and when it is occluded the tracked target is re-detected; the peak-to-sidelobe ratio is used to judge whether the tracked ship is occluded, and a Kalman filter is used to re-detect the tracked target.
Exemplarily, the peak-to-sidelobe ratio is used to judge whether the tracked ship is occluded; the method comprises the following specific steps:
during tracking the target may become occluded. To prevent target loss, a reliability check is performed while tracking: the peak-to-sidelobe ratio (PSR) is used to match the detected target against the tracked target, and a threshold is set for it. When the PSR is smaller than the threshold T, the tracked target is occluded and the features at the target position of the current frame are re-extracted; otherwise tracking continues.
The PSR is defined as

\mathrm{PSR} = \frac{p_{\max} - \mu}{\sigma} \qquad (22)

where p_max is the maximum value in the response matrix, μ is the average of all response values, and σ is the standard deviation of the response values.
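The PSR defined above is straightforward to compute. The sketch below contrasts a confident detection with an occluded (flat) response; the synthetic responses and the threshold T are assumed values:

```python
import numpy as np

def psr(response):
    """Peak-to-sidelobe ratio as defined above: (p_max - mu) / sigma over
    all response values (some implementations exclude a small window
    around the peak when estimating mu and sigma)."""
    return (response.max() - response.mean()) / response.std()

rng = np.random.default_rng(3)
noise = 0.05 * rng.standard_normal((32, 32))

sharp = noise.copy()
sharp[16, 16] += 1.0   # confident detection: one strong, narrow peak
flat = noise           # occluded target: the response is noise only

T = 5.0  # assumed threshold; PSR < T flags occlusion
print(psr(sharp) > T, psr(flat) < T)
```

A single strong peak over the noise floor gives a PSR well above the threshold, while a peakless response stays below it, which is what triggers the Kalman re-detection below.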
Illustratively, a Kalman filter is used to re-detect the tracked target; the method comprises the following specific steps:
when the PSR indicates that the current tracked target is occluded, a Kalman filter is used to predict where the tracked target may appear in the next frame, shrinking the search area so that the target position can be predicted rapidly. The KCF then detects within a window around that position, and the detection result is used to calibrate the Kalman filter, finally yielding the position of the target ship in this frame.
The working process of the Kalman filter is specifically as follows:
The Kalman filter is divided into two parts: one is the state equation, and the other is the measurement equation:
$$x_k = A_k x_{k-1}; \quad (23)$$
$$z_k = H_k x_k + v_k; \quad (24)$$
where $x_{k-1}$ represents the state vector at time k-1 and A is the state transition matrix; $z_k$, $H_k$ and $v_k$ represent the measurement vector, the measurement matrix and the measurement noise at time k, respectively. The state transition matrix A, the measurement matrix H and the initial error estimate $P_0$ are, respectively:
$$A = \begin{bmatrix} 1 & 0 & t & 0 \\ 0 & 1 & 0 & t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad
H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}, \qquad
P_0 = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
where $t = t_k - t_{k-1}$.
The prediction equations of the Kalman filtering system are as follows:
$$\hat{x}_k^{-} = A_k \hat{x}_{k-1}$$
$$P_k^{-} = A_k P_{k-1} A_k^{\mathrm{T}} + Q_k$$
$$K_k = P_k^{-} H_k^{\mathrm{T}} \left( H_k P_k^{-} H_k^{\mathrm{T}} + R_k \right)^{-1}$$
where $\hat{x}_k^{-}$ is the state prediction vector, $P_k^{-}$ the prediction estimation error, $Q_k$ the variance matrix of the process-noise sequence, $K_k$ the Kalman gain coefficient matrix, and $R_k$ the covariance matrix of the measurement-noise vector.
The calibration equations of the Kalman filter system are:
$$\hat{x}_k = \hat{x}_k^{-} + K_k \left( z_k - H_k \hat{x}_k^{-} \right); \quad (31)$$
$$P_k = \left( I - K_k H_k \right) P_k^{-}; \quad (32)$$
where $z_k$ represents the coordinates of the center point of the target ship detected by the KCF at time k.
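Putting the state, prediction and calibration equations together, the re-detection step can be sketched as a constant-velocity Kalman filter over the state $[x, y, v_x, v_y]$ that measures the $(x, y)$ center point. The class name and the noise magnitudes `q` and `r` are illustrative assumptions; the matrices follow the standard constant-velocity form with $t = t_k - t_{k-1}$:

```python
import numpy as np

class ConstantVelocityKalman:
    """Kalman filter over state [x, y, vx, vy], measuring (x, y)."""

    def __init__(self, x0, y0, q=1e-2, r=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])    # state vector
        self.P = np.eye(4)                       # initial error estimate P0
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float) # measurement matrix
        self.q, self.r = q, r                    # noise levels (assumed)

    def predict(self, t=1.0):
        """Prediction step: state and error propagation."""
        A = np.eye(4)
        A[0, 2] = A[1, 3] = t                    # t = t_k - t_{k-1}
        self.x = A @ self.x                      # x_k^- = A x_{k-1}
        self.P = A @ self.P @ A.T + self.q * np.eye(4)
        return self.x[:2]                        # predicted center

    def update(self, z):
        """Calibration step with the KCF detection z_k."""
        R = self.r * np.eye(2)
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

# usage: a ship moving right at 2 px/frame with perfect measurements;
# the KCF detection plays the role of z_k in the calibration step
kf = ConstantVelocityKalman(0.0, 0.0)
for k in range(1, 30):
    kf.predict(t=1.0)
    centre = kf.update((2.0 * k, 0.0))
```

During occlusion, only `predict()` would be called, and the KCF search window would be placed around the predicted center; once a reliable detection reappears, it is fed back through `update()` as the calibration.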
Example two
The present embodiment provides a vessel tracking system;
a vessel tracking system comprising:
an identification module configured to: acquiring a video to be analyzed, and carrying out ship identification on each frame of image in the video to be analyzed;
a tracking module configured to: continuously tracking the ship based on the kernel correlation filter;
a scale estimation module configured to: estimating the scale of the tracked ship, and determining the ship scale of each frame of image;
a re-detection module configured to: judging whether the tracked ship is occluded or not, and re-detecting the tracked target when the tracked ship is occluded.
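The interaction of the four modules over a video can be sketched as a per-frame loop. The class, the method names (`detect`, `init`, `track`, `estimate`, `redetect`) and the `psr`/`threshold` attributes are hypothetical glue for illustration, not interfaces defined by this disclosure:

```python
class ShipTrackingPipeline:
    """Per-frame wiring of S101-S104: identify, track, scale, re-detect."""

    def __init__(self, identifier, tracker, scaler, redetector):
        self.identifier = identifier    # S101: ship identification
        self.tracker = tracker          # S102: KCF tracking
        self.scaler = scaler            # S103: DSST scale estimation
        self.redetector = redetector    # S104: Kalman re-detection

    def process(self, frame):
        if not self.tracker.initialized:
            box = self.identifier.detect(frame)          # first sighting
            self.tracker.init(frame, box)
            return box
        box = self.tracker.track(frame)                  # KCF response peak
        box = self.scaler.estimate(frame, box)           # refine the scale
        if self.tracker.psr < self.tracker.threshold:    # occlusion check
            box = self.redetector.redetect(frame, box)   # Kalman prediction
            self.tracker.init(frame, box)                # re-initialise KCF
        return box
```

Each frame thus passes through identification only once, then through tracking and scale estimation, with re-detection engaged only when the PSR falls below the threshold.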
It should be noted here that the identification module, tracking module, scale estimation module and re-detection module above correspond to steps S101 to S104 of the first embodiment; the modules share the examples and application scenarios of the corresponding steps, but are not limited to the disclosure of the first embodiment. It should also be noted that the modules, as part of a system, may be implemented in a computer system, for example as a set of computer-executable instructions.
In the foregoing embodiments, the descriptions of the embodiments have different emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The proposed system can be implemented in other ways. The system embodiments described above are merely illustrative: the division into modules is only a logical functional division, and an actual implementation may divide things differently; for instance, multiple modules may be combined or integrated into another system, or some features may be omitted or not executed.
EXAMPLE III
The present embodiment also provides an electronic device, comprising: one or more processors, one or more memories, and one or more computer programs. The processor is connected to the memory, and the one or more computer programs are stored in the memory; when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so as to make the electronic device execute the method of the first embodiment.
It should be understood that in this embodiment the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), an off-the-shelf field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory may include both read-only memory and random access memory, and may provide instructions and data to the processor, and a portion of the memory may also include non-volatile random access memory. For example, the memory may also store device type information.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software.
The method of the first embodiment may be implemented directly by a hardware processor, or by a combination of hardware and software modules in the processor. The software modules may be located in RAM, flash memory, ROM, PROM or EPROM, registers, or other storage media well known in the art. The storage medium is located in a memory; the processor reads the information in the memory and completes the steps of the method in combination with its hardware. To avoid repetition, this is not described in detail here.
Those of ordinary skill in the art will appreciate that the various illustrative elements, i.e., algorithm steps, described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Example four
The present embodiment also provides a computer-readable storage medium storing computer instructions that, when executed by a processor, perform the method of the first embodiment.
The above description is only a preferred embodiment of the present disclosure and is not intended to limit the present disclosure, and various modifications and changes may be made to the present disclosure by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.

Claims (10)

1. A ship tracking method is characterized by comprising the following steps:
acquiring a video to be analyzed, and carrying out ship identification on each frame of image in the video to be analyzed;
continuously tracking the ship based on the kernel correlation filter;
estimating the scale of the tracked ship, and determining the ship scale of each frame of image;
and judging whether the tracked ship is occluded or not, and re-detecting the tracked target when the tracked ship is occluded.
2. The method of claim 1, wherein after the step of obtaining the video to be analyzed, before the step of performing vessel identification on each frame of image in the video to be analyzed, the method further comprises:
eliminating jitter of a video to be analyzed;
extracting the image characteristics of each frame of image in the video after the jitter is eliminated;
and identifying the ship in the image based on the image characteristics.
3. The method of claim 2, wherein the video to be analyzed is de-jittered; the method comprises the following specific steps:
establishing an optical flow field motion model based on an optical flow estimation algorithm of block matching;
based on the optical flow field motion model, performing motion estimation on a video to be analyzed to obtain jitter information;
and carrying out spatial transformation on the jitter information by adopting a fixed reference frame mode to obtain a compensated image sequence.
4. The method of claim 2, wherein the image features of each frame of image in the video are extracted after the jitter is eliminated, comprising: extracting the FHOG features of each frame of image in the de-jittered video.
5. The method of claim 2, wherein identifying the vessel in the image is based on image characteristics; the method comprises the following steps: and inputting the image characteristics into a trained kernel correlation filter for classification, and identifying the ship in the image.
6. The method of claim 1, wherein the scale of the tracked ship is estimated and the ship scale of each frame of image is determined by the scale filter of a discriminative scale space tracker (DSST).
7. The method of claim 1, wherein it is determined whether the tracked vessel is occluded and, when it is, the tracked target is re-detected; a peak-to-sidelobe ratio is used to judge whether the tracked ship is occluded; and a Kalman filter is used to re-detect the tracked target.
8. A vessel tracking system, comprising:
an identification module configured to: acquiring a video to be analyzed, and carrying out ship identification on each frame of image in the video to be analyzed;
a tracking module configured to: continuously tracking the ship based on the kernel correlation filter;
a scale estimation module configured to: estimating the scale of the tracked ship, and determining the ship scale of each frame of image;
a re-detection module configured to: judging whether the tracked ship is occluded or not, and re-detecting the tracked target when the tracked ship is occluded.
9. An electronic device, comprising: one or more processors, one or more memories, and one or more computer programs; wherein a processor is connected to the memory, the one or more computer programs being stored in the memory, the processor executing the one or more computer programs stored in the memory when the electronic device is running, to cause the electronic device to perform the method of any of claims 1-7.
10. A computer-readable storage medium storing computer instructions which, when executed by a processor, perform the method of any one of claims 1 to 7.
CN202010512688.7A 2020-06-08 2020-06-08 Ship tracking method, system, equipment and storage medium Withdrawn CN111681266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010512688.7A CN111681266A (en) 2020-06-08 2020-06-08 Ship tracking method, system, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111681266A true CN111681266A (en) 2020-09-18

Family

ID=72435080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010512688.7A Withdrawn CN111681266A (en) 2020-06-08 2020-06-08 Ship tracking method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111681266A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114140501A (en) * 2022-01-30 2022-03-04 南昌工程学院 Target tracking method and device and readable storage medium
CN116228817A (en) * 2023-03-10 2023-06-06 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering
CN116228817B (en) * 2023-03-10 2023-10-03 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering

Similar Documents

Publication Publication Date Title
CN109035299B (en) Target tracking method and device, computer equipment and storage medium
US11823429B2 (en) Method, system and device for difference automatic calibration in cross modal target detection
CN108765458B (en) Sea surface target scale self-adaptive tracking method of high-sea-condition unmanned ship based on correlation filtering
CN110223323B (en) Target tracking method based on depth feature adaptive correlation filtering
CN111860670A (en) Domain adaptive model training method, image detection method, device, equipment and medium
CN108961308B (en) Residual error depth characteristic target tracking method for drift detection
CN110059700B (en) Image moire recognition method and device, computer equipment and storage medium
CN111582349B (en) Improved target tracking algorithm based on YOLOv3 and kernel correlation filtering
CN112446378A (en) Target detection method and device, storage medium and terminal
CN108537822B (en) Moving target tracking method based on weighted confidence estimation
CN115063454B (en) Multi-target tracking matching method, device, terminal and storage medium
CN112036381B (en) Visual tracking method, video monitoring method and terminal equipment
CN111681266A (en) Ship tracking method, system, equipment and storage medium
CN113033356B (en) Scale-adaptive long-term correlation target tracking method
CN113887699A (en) Knowledge distillation method, electronic device and storage medium
CN110827327A (en) Long-term target tracking method based on fusion
KR101821770B1 (en) Techniques for feature extraction
CN117115436A (en) Ship attitude detection method and device, electronic equipment and storage medium
CN116030300A (en) Progressive domain self-adaptive recognition method for zero-sample SAR target recognition
CN110781710B (en) Target object clustering method and device
Zhu et al. Visual tracking with dynamic model update and results fusion
CN111612816A (en) Method, device and equipment for tracking moving target and computer storage medium
CN106909934B (en) Target tracking method and device based on self-adaptive search
CN110660079A (en) Single target tracking method based on space-time context
Motwake et al. Enhancing land cover classification in remote sensing imagery using an optimal deep learning model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200918