CN111105444B - Continuous tracking method suitable for grabbing underwater robot target - Google Patents
Continuous tracking method suitable for grabbing underwater robot target
- Publication number
- CN111105444B (application CN201911408004.2A)
- Authority
- CN
- China
- Prior art keywords
- tracking
- target
- sample
- cyclic
- kernel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a continuous tracking method suitable for underwater robot target grabbing, and belongs to the technical field of visual target tracking. The method comprises the steps of establishing a kernel correlation motion model; extracting HOG features; establishing an appearance model; and updating and iterating the model. Based on kernel correlation filtering theory, the invention realizes dense sampling through cyclic shifts of the base sample so as to extract a richer training sample set; it extracts HOG features and establishes an appearance model of the tracked target, introducing a kernel function to solve the nonlinear regression problem and improve computational efficiency; and it judges from the feedback result whether tracking must be reinitialized, providing a self-discrimination mechanism based on system confidence to realize continuous tracking of the target. The invention not only ensures stable tracking of the underwater target, but also autonomously detects occlusion and mistracking, re-identifies and re-tracks the target, and thereby completes continuous tracking and reliable grabbing of the underwater target.
Description
Technical Field
The invention relates to a continuous tracking method suitable for grabbing an underwater robot target, and belongs to the technical field of visual target tracking.
Background
Conventionally, marine organisms such as sea cucumber, scallop and sea urchin are fished by hand. Manual diving is limited by diving time, diving depth, working conditions and fishing cost, and divers are prone to occupational diseases; adopting an underwater robot can therefore reduce fishing cost and improve operation safety. The field of computer vision has developed rapidly over the last two decades. Tracking methods have evolved from the earliest mean-shift and subspace algorithms, through sparse-representation theory, to the currently mainstream correlation-filtering algorithms and deep-learning algorithms. To date, target tracking algorithms have made tremendous progress, but challenges such as motion blur and occlusion remain.
Visual target tracking is one of the important research directions in the field of computer vision and a key link in realizing autonomous grabbing operations by an underwater robot; it integrates advanced techniques and core ideas from pattern recognition, image processing, computing and related fields. Target tracking by an underwater robot builds on target recognition: starting from the recognition result in the first frame, it gives the accurate position coordinates of the determined target in each subsequent frame of a continuous video sequence. Only with continuous and stable tracking of the underwater target can the robot achieve accurate and reliable grabbing. In practice, however, underwater target tracking often fails, and grabbing with it, because the target resembles the surrounding environment or is occluded by the gripper or other parts of the robot.
The patent document "An underwater tracking method based on deep learning and monocular vision (application number: 201910474803.3)" relates to a deep-learning-based underwater robot target tracking method, but deep learning tracks slowly, while grabbing control for an underwater robot requires fast, high-frequency tracking feedback, especially when the target is occluded by the gripper during grasping; that patent therefore cannot support underwater robot target grabbing. The patent document "Detection and tracking method and system for small underwater moving targets (application number: 201910595413.1)" relates to tracking small underwater moving targets, but it mainly detects targets with underwater sonar, which differs greatly from visual detection and tracking. The patent document "An image recognition and tracking system (application number: 201710424070.3)" relates to an image recognition and target tracking system for an underwater robot that mainly fuses multiple features of the target image and analyzes their correlation with a test image, with low demands on system resources and low time complexity.
Disclosure of Invention
The invention aims to solve the problem that tracking and grabbing of an underwater robot target currently fail because the target resembles the surrounding environment or is occluded by the gripper or other parts of the robot.
The object of the invention is realized by a continuous tracking method suitable for underwater robot target grabbing, which specifically comprises the following steps:
step 1, establishing a kernel correlation motion model;
Establishing the kernel correlation motion model requires the correlation-filtering-based KCF method to adopt dense sampling so as to extract a richer training sample set; dense sampling is realized through cyclic shifts of the base sample, generating a sample circulant matrix during sampling:
the dense sampling cyclically shifts the region of interest, using the properties of the circulant matrix, to provide training samples for the filter; by the kernel function theorem, the sample circulant matrix remains circulant after being mapped into the kernel space;
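The cyclic-shift sampling described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the patent; the function name `circulant_samples` is hypothetical:

```python
import numpy as np

def circulant_samples(base):
    """Dense sampling: stack every cyclic shift of a 1-D base sample,
    forming the sample circulant matrix X = C(x)."""
    n = len(base)
    return np.stack([np.roll(base, i) for i in range(n)])

# Each row of X is the base sample shifted by one more position.
X = circulant_samples(np.array([1.0, 2.0, 3.0, 4.0]))
```

A key property used later is that every circulant matrix is diagonalized by the DFT, which is what makes the frequency-domain training step cheap.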
step 2, HOG feature extraction;
step 2.1, dividing the sample region into cell units and extracting a 32-dimensional feature, namely 3×n_bins+5, from each cell, where n_bins=9 is the number of bins into which the gradient orientations are divided; three features are extracted per orientation bin (two contrast-sensitive and one contrast-insensitive), another 4 features describe apparent texture, and the last one is zero, a truncation feature;
step 2.2, concatenating the 31 retained features of each cell unit; assuming the cell-division result is m×n, the f-HOG extraction result is m×n×31, and the 31 dimensions are called channels;
step 2.3, generating samples through cyclic shifts of the cell units, the corresponding positions of all channels shifting together, so that the i-th channel of every sample is obtained by cyclically shifting the i-th channel of the original image;
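As a sketch of step 2.3 (illustrative only; `shift_sample` is a hypothetical name), shifting a multi-channel fHOG map means rolling all 31 channels together along the spatial axes:

```python
import numpy as np

def shift_sample(features, dy, dx):
    """One virtual sample: cyclically shift every HOG channel by the same
    (dy, dx); features is an (m, n, 31) array of fHOG cells."""
    return np.roll(features, shift=(dy, dx), axis=(0, 1))

feats = np.arange(2 * 3 * 31, dtype=float).reshape(2, 3, 31)
sample = shift_sample(feats, 1, 0)   # shift one cell row down (cyclically)
```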
step 3, establishing an appearance model;
step 3.1, training the kernel-correlation-filtering tracker with a regularized least-squares classifier; the classifier f(z) is trained on the generated sample set to minimize the regularized risk:

min_w Σ_i (f(x_i) − y_i)² + λ‖w‖²
where y_i denotes the expected regression output for training sample x_i, the overall expectation of y follows a Gaussian distribution, and λ is a regularization parameter that prevents overfitting; the objective function is f(z) = w^T z, and the minimization above has a closed-form solution, namely the correlation-filter template:
w = (X^H X + λI)^(-1) X^H y
where I is the identity matrix;
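The closed-form template can be checked numerically; a minimal sketch (not from the patent) of w = (X^H X + λI)^(-1) X^H y:

```python
import numpy as np

def ridge_template(X, y, lam=1e-2):
    """Closed-form regularized least squares: w = (X^H X + lam*I)^-1 X^H y.
    np.linalg.solve is used instead of an explicit matrix inverse."""
    n = X.shape[1]
    return np.linalg.solve(X.conj().T @ X + lam * np.eye(n), X.conj().T @ y)

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 4))   # 8 samples, 4 features
y = rng.standard_normal(8)
w = ridge_template(X, y)
```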
step 3.2, introducing the kernel trick for the nonlinear problem: assume φ(·) is a mapping from the low-dimensional input space to a high-dimensional space, so that the kernel function κ(x, x′) = φ(x)^T φ(x′) represents the cross-correlation in the high-dimensional space; the filter template coefficient w above is then a linear combination of the samples generated by cyclic shifts, i.e. w = Σ_i α_i φ(x_i);
The problem of solving w is thus converted into solving the coefficient vector α = [α_0, …, α_(n−1)]^T of the kernel regularized least-squares classifier:
α = (K + λI)^(-1) y
where K is the kernel matrix with elements K_ij = κ(P^i x, P^j x); if the kernel satisfies κ(x, x′) = κ(Px, Px′), then the kernel matrix is circulant;
converting the above equation into a frequency-domain calculation based on the properties of the circulant matrix:

α̂ = ŷ / (k̂^{xx} + λ)

where ^ denotes the DFT, the division is element-wise, and k^{xx} is the first row of the kernel matrix K = C(k^{xx}); that is, the training process of the filter only needs to compute the autocorrelation of the base sample in the high-dimensional space;
during testing, K^z denotes the kernel matrix between the training samples and the test samples; the training samples are generated by cyclic shifts of the training base sample, and the test samples by cyclic shifts of the candidate base sample; finally, the response of every test sample is obtained as

f(z) = F^(-1)( k̂^{xz} ⊙ α̂ )

and the position of the maximum of f(z) is taken as the tracking target;
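Steps 3.1–3.2 can be sketched end to end for a 1-D signal with a Gaussian kernel (the kernel used in the original KCF method). This is an illustrative sketch under that assumption; all function names and parameter values are hypothetical, not from the patent:

```python
import numpy as np

def gaussian_correlation(x, z, sigma=0.5):
    """Kernel vector k^{xz}: Gaussian kernel between z and every cyclic
    shift of x, evaluated via circular cross-correlation in the DFT domain."""
    n = len(x)
    cross = np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(z)).real
    d2 = np.maximum(x @ x + z @ z - 2.0 * cross, 0.0)
    return np.exp(-d2 / (sigma ** 2 * n))

def train(x, y, lam=1e-4):
    """Frequency-domain training: alpha_hat = y_hat / (k_hat^{xx} + lam)."""
    kxx = gaussian_correlation(x, x)   # autocorrelation of the base sample
    return np.fft.fft(y) / (np.fft.fft(kxx) + lam)

def detect(alpha_hat, x, z):
    """Response over all cyclic shifts of candidate patch z:
    f(z) = IDFT(k_hat^{xz} * alpha_hat)."""
    kxz = gaussian_correlation(x, z)
    return np.fft.ifft(np.fft.fft(kxz) * alpha_hat).real

rng = np.random.default_rng(1)
x = rng.standard_normal(64)            # base sample
y = np.zeros(64); y[0] = 1.0           # desired response peaked at shift 0
alpha_hat = train(x, y)
resp = detect(alpha_hat, x, x)         # test on the training patch itself
```

Testing on the training patch, the response peaks at shift 0, i.e. the tracker locates the target where it was trained.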
step 4, updating and iterating the model;
Aiming at the continuous-tracking requirement, a self-discrimination mechanism based on system confidence is proposed to judge whether mistracking is caused by target similarity or occlusion; the self-discrimination mechanism is as follows:
where δ is a very small number that prevents AS → ∞; f_max, f_ave and f_xy denote, respectively, the maximum, the average, and the element in row x and column y of the current frame's response; for a sharp, low-noise response map the AS index becomes large, and the tracked target is considered accurate at that moment; otherwise the AS index drops markedly; when AS ≤ ζ_1·AS_ave, the current frame is considered occluded and the tracker is reinitialized through the detector; the mechanism judges whether the target is occluded and stops updating while it is, thereby improving tracking robustness.
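The exact AS formula appears only as a figure in the source text. As a stand-in with the behavior described here (large for sharp, low-noise response maps, sharply smaller under occlusion), the average peak-to-correlation energy (APCE) criterion can be used; the sketch below is illustrative, and the threshold value ζ₁ = 0.4 is an assumption, not from the patent:

```python
import numpy as np

def apce(response, delta=1e-8):
    """Average peak-to-correlation energy of a response map; delta is the
    small constant preventing the score from diverging."""
    f_max = response.max()
    f_min = response.min()
    return (f_max - f_min) ** 2 / (np.mean((response - f_min) ** 2) + delta)

def occluded(response, score_history, zeta1=0.4):
    """Flag occlusion when the current score drops below zeta1 times the
    running average of past scores (the AS <= zeta1 * AS_ave test)."""
    return apce(response) <= zeta1 * np.mean(score_history)

sharp = np.zeros((10, 10)); sharp[5, 5] = 1.0            # clean single peak
noisy = np.random.default_rng(0).uniform(size=(10, 10))  # occlusion-like noise
```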
Compared with the prior art, the invention has the following beneficial effects. The invention designs a continuous tracking method suitable for underwater robot target grabbing which, based on kernel correlation filtering theory, realizes dense sampling through cyclic shifts of the base sample so as to extract a richer training sample set, and which, by extracting HOG features, establishes an appearance model of the tracked target and updates and iterates the model to realize continuous tracking. The invention not only ensures stable tracking of the underwater target, but also autonomously detects occlusion and mistracking, re-identifies and re-tracks the target, and thereby completes continuous tracking and reliable grabbing of the underwater target.
Drawings
FIG. 1 is a tracking flow diagram;
FIG. 2 is a diagram of a false trace;
fig. 3 is a trace grab diagram.
Detailed Description
The invention is described in further detail below with reference to the drawings and the detailed description.
A continuous tracking method suitable for underwater robot target grabbing mainly comprises the following steps: establishing a kernel correlation motion model, extracting HOG features, establishing an appearance model, and updating and iterating the model. Establishing the kernel correlation motion model requires the correlation-filtering-based KCF method to adopt dense sampling so as to extract a richer training sample set. Dense sampling is achieved through cyclic shifts of the base sample, creating a sample circulant matrix during sampling.
Dense sampling is to cyclically shift the region of interest by the characteristics of a cyclic matrix to provide training samples for the filter. According to the theorem of the kernel function, the sample cyclic matrix still maintains the cyclic characteristic after being mapped to the kernel space.
HOG feature extraction divides the image into dense, uniformly sized grid cells. Within an image, the appearance and shape of a local object can be described by the distribution of gradient orientations. To reduce the influence of illumination, the orientation histograms of these cells are contrast-normalized over a larger image region, and finally the histograms of all cells are concatenated to form the image feature. The sample region is divided into cells, and a 32-dimensional feature, namely 3×n_bins+5 with n_bins=9 gradient-orientation bins, is extracted from each cell: three features are extracted per orientation bin (two contrast-sensitive, one contrast-insensitive), another 4 features describe apparent texture, and the last is zero (a truncation feature). Assuming the cell division is m×n, the f-HOG extraction result is m×n×31, and the 31 dimensions are called channels. After feature extraction the two-dimensional image matrix becomes an array of cells; samples are generated by cyclic shifts of the cells, the corresponding positions of all channels shifting together, so the i-th channel of every sample is obtained by cyclically shifting the i-th channel of the original image.
Establishing the appearance model: the kernel-correlation-filtering tracker is trained with a regularized least-squares classifier; training the classifier f(z) on the generated sample set minimizes the regularized risk

min_w Σ_i (f(x_i) − y_i)² + λ‖w‖².
Here y_i denotes the expected regression output for the training sample, the overall expectation of y follows a Gaussian distribution, and λ is the regularization parameter preventing overfitting. The objective function is f(z) = w^T z.
The minimization above has a closed-form solution, namely the correlation-filter template:
w = (X^H X + λI)^(-1) X^H y (3)
where I is the identity matrix.
For the nonlinear problem, we introduce a kernel function method. Assume φ(·) is a mapping from a low-dimensional input space to a high-dimensional space, so that the kernel function κ(x, x′) = φ(x)^T φ(x′) represents the cross-correlation in the high-dimensional space. The filter template coefficient w above is a linear combination of the samples generated by cyclic shifts, i.e. w = Σ_i α_i φ(x_i).
The problem of solving w is thus converted into solving the coefficient vector α = [α_0, …, α_(n−1)]^T of the kernel regularized least-squares classifier:
α = (K + λI)^(-1) y (4)
where K is the kernel matrix with elements K_ij = κ(P^i x, P^j x). If the kernel satisfies κ(x, x′) = κ(Px, Px′), then the kernel matrix is circulant.
The above equation can be converted into a frequency-domain calculation based on the properties of the circulant matrix:

α̂ = ŷ / (k̂^{xx} + λ)

where ^ denotes the DFT, the division is element-wise, and k^{xx} is the first row of the kernel matrix K = C(k^{xx}); i.e., the training process of the filter only requires computing the autocorrelation of the base sample in the high-dimensional space.
During testing, K^z denotes the kernel matrix between the training samples and the test samples. The training samples are generated by cyclic shifts of the training base sample, and the test samples by cyclic shifts of the candidate base sample. Finally, the response of every test sample is obtained as

f(z) = F^(-1)( k̂^{xz} ⊙ α̂ )

and the position of the maximum of f(z) gives the tracking target.
d. Model update and iteration strategy: aiming at the continuous-tracking requirement, a self-discrimination mechanism based on system confidence is proposed to judge whether mistracking is caused by target similarity or occlusion. The self-discrimination mechanism is as follows:
Here δ is a very small number that prevents AS → ∞; f_max, f_ave and f_xy denote, respectively, the maximum, the average, and the element in row x and column y of the current frame's response. For a sharp, low-noise response map the AS index becomes large, and the tracked target is considered accurate; otherwise the AS index drops markedly. When AS ≤ ζ_1·AS_ave, the current frame is considered occluded and the tracker is reinitialized by the detector. The mechanism judges whether the target is occluded and stops updating while it is, thereby improving tracking robustness.
Fig. 1 is the tracking flow chart. First, the observation model is initialized in the first frame (by manual calibration or by a detection algorithm). The tracking box is expanded 2.5 times into a search box, and a feature vector (matrix) is extracted in the search box as the base sample. Cyclic shifts of the base sample generate a large number of virtual samples as training samples, and a fast closed-form calculation yields the parameter vector α. Features are then extracted from the target region predicted in the previous frame as a base sample, the sample is cyclically shifted to generate test samples, and the trained classifier's response is computed by the formula above to obtain the detection confidence map; the maximum response gives the most probable target position. A new search box is formed at the current position, features are extracted for the next input frame, and training and detection continue to complete the tracking task.
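The 2.5× search-box expansion in the flow above can be sketched as a simple crop helper; the function name is hypothetical, and clamping to the frame borders is one reasonable choice among several:

```python
import numpy as np

def search_window(frame, cx, cy, w, h, padding=2.5):
    """Crop the padded search region around the target box centered at
    (cx, cy), clamping the crop to the frame borders."""
    sw, sh = int(w * padding), int(h * padding)
    x0 = max(0, cx - sw // 2)
    y0 = max(0, cy - sh // 2)
    x1 = min(frame.shape[1], x0 + sw)
    y1 = min(frame.shape[0], y0 + sh)
    return frame[y0:y1, x0:x1]

frame = np.zeros((100, 100))
win = search_window(frame, 50, 50, 20, 20)  # 20x20 box -> 50x50 search window
```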
As shown in fig. 2, the underwater robot first identifies the scallop target in the pool and then tracks and grabs it: a small-target recognition algorithm detects and recognizes the scallop, which is then tracked by the KCF-based tracking algorithm.
The mechanism judges from the feedback result whether tracking needs to be reinitialized; to that end, a self-discrimination mechanism based on system confidence is provided.
Here δ is a very small number that prevents AS → ∞; f_max, f_ave and f_xy denote, respectively, the maximum, the average, and the element in row x and column y of the current frame's response. For a sharp, low-noise response map the AS index becomes large, and the tracked target is considered accurate; otherwise the AS index drops markedly. When AS ≤ ζ_1·AS_ave, the current frame is considered occluded and the tracker is reinitialized by the detector. The mechanism judges whether the target is occluded and stops updating while it is, thereby improving tracking robustness.
To further enhance the anti-occlusion behavior of the algorithm, if the change in the pixel position of the target center stays below a given threshold for N consecutive frames, we consider that tracking drift may have occurred and reinitialize the tracker; w and h are the width and height of the picture, respectively. Because the robot keeps moving relative to the target during grasping, a target center that barely moves for N consecutive frames most likely means the tracker has locked onto the gripper or the robot body. The target recognition module is then recalled to re-identify and re-track the target.
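The frozen-center check described above can be sketched as follows; the frame count N = 25 and the pixel threshold are illustrative values, not taken from the patent:

```python
def likely_misanchored(centers, n=25, eps=2.0):
    """True if the target center moved less than eps pixels (L1 distance)
    between every pair of consecutive frames over the last n frames.
    During grasping the robot moves relative to the target, so a frozen
    center suggests the tracker locked onto the gripper or robot body."""
    if len(centers) < n + 1:
        return False
    recent = centers[-(n + 1):]
    return all(abs(x2 - x1) + abs(y2 - y1) < eps
               for (x1, y1), (x2, y2) in zip(recent, recent[1:]))

static = [(10, 10)] * 30                  # center frozen -> re-identify target
moving = [(i, 2 * i) for i in range(30)]  # center moving -> tracking is fine
```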
Fig. 3 further demonstrates the effect of the method for continuous tracking over a long period. The experiment tracked a scallop target continuously for 128 seconds, 3200 frames in total at 25 frames per second, and completed the grab. During tracking the target was occluded by the gripper; the method detected the occlusion, re-identified and re-tracked the target, and realized continuous tracking.
In summary, the invention provides a continuous tracking method suitable for underwater robot target grabbing that not only ensures stable tracking of the underwater target but also autonomously detects occlusion and mistracking, re-identifies and re-tracks the target, and thereby completes continuous tracking and reliable grabbing. Based on kernel correlation filtering theory, the invention realizes dense sampling through cyclic shifts of the base sample to extract a richer training sample set; extracts HOG features and establishes an appearance model of the tracked target, introducing a kernel function to solve the nonlinear regression problem and improve computational efficiency; and judges from the feedback result whether tracking must be reinitialized, providing a self-discrimination mechanism based on system confidence to realize continuous tracking of the target.
Claims (1)
1. The continuous tracking method suitable for grabbing the underwater robot target is characterized by comprising the following steps of:
step 1, establishing a kernel correlation motion model based on the correlation-filtering KCF method; inputting a video sequence, initializing the model in the first frame, expanding the tracking box 2.5 times into a search box, and extracting a feature vector in the search box as the base sample; generating virtual samples from the base sample by cyclic shifts to serve as training samples;
dense sampling is realized through cyclic shifts of the base sample, generating a sample circulant matrix during sampling: the dense sampling cyclically shifts the region of interest, using the properties of the circulant matrix, to provide training samples for the filter; by the kernel function theorem, the sample circulant matrix remains circulant after being mapped into the kernel space;
step 2, extracting HOG features of the training samples, and training the kernel-correlation-filtering tracker with a regularized least-squares classifier to establish the appearance model;
the specific method for extracting the HOG features comprises the following steps:
step 2.1, dividing the sample region into cell units and extracting a 32-dimensional feature, namely 3×n_bins+5, from each cell, where n_bins=9 is the number of bins into which the gradient orientations are divided; three features are extracted per orientation bin (two contrast-sensitive and one contrast-insensitive), another 4 features describe apparent texture, and the last one is zero, a truncation feature;
step 2.2, concatenating the 31 retained features of each cell unit; assuming the cell-division result is m×n, the f-HOG extraction result is m×n×31, and the 31 dimensions are called channels;
step 2.3, generating samples through cyclic shifts of the cell units, the corresponding positions of all channels shifting together, so that the i-th channel of every sample is obtained by cyclically shifting the i-th channel of the original image; the specific method for establishing the appearance model is as follows:
the tracker based on the kernel correlation filtering is trained by adopting a regularized least square classifier, and the regularized risk is minimized by training the classifier f (z) based on the generated sample set:
where y_i denotes the expected regression output for the training sample, the overall expectation of y follows a Gaussian distribution, and λ is a regularization parameter preventing overfitting; the objective function is f(z) = w^T z, and the minimization above has a closed-form solution, namely the correlation-filter template:
w = (X^H X + λI)^(-1) X^H y
where I is the identity matrix;
for the nonlinear problem, a kernel function method is introduced: assume φ(·) is a mapping from the low-dimensional input space to a high-dimensional space, so that the kernel function κ(x, x′) = φ(x)^T φ(x′) represents the cross-correlation in the high-dimensional space; the filter template coefficient w above is then a linear combination of the samples generated by cyclic shifts, i.e. w = Σ_i α_i φ(x_i);
The problem of solving w is thus converted into solving the coefficient vector α = [α_0, …, α_(n−1)]^T of the kernel regularized least-squares classifier:
α = (K + λI)^(-1) y
where K is the kernel matrix with elements K_ij = κ(P^i x, P^j x); if the kernel satisfies κ(x, x′) = κ(Px, Px′), then the kernel matrix is circulant;
converting the above equation into a frequency-domain calculation based on the properties of the circulant matrix:

α̂ = ŷ / (k̂^{xx} + λ)

where ^ denotes the DFT, the division is element-wise, and k^{xx} is the first row of the kernel function matrix K = C(k^{xx}); that is, the training process of the filter only needs to compute the autocorrelation of the base sample in the high-dimensional space;
during testing, K^z denotes the kernel matrix between the training samples and the test samples; the training samples are generated by cyclic shifts of the training base sample, and the test samples by cyclic shifts of the candidate base sample; finally, the response of every test sample is obtained as

f(z) = F^(-1)( k̂^{xz} ⊙ α̂ )

and the position of the maximum of f(z) is taken as the tracking target;
step 3: performing self-discrimination on the pixel-position variation of the target center over N consecutive frames; if the variation of the target-center pixel position over N consecutive frames is smaller than a given threshold, tracking drift is considered to have occurred and the tracker is reinitialized;
aiming at the continuous-tracking requirement, a self-discrimination mechanism based on system confidence is proposed to judge whether mistracking is caused by target similarity or occlusion; the self-discrimination mechanism is as follows:
where δ is a very small number that prevents AS → ∞; f_max, f_ave and f_xy denote, respectively, the maximum, the average, and the element in row x and column y of the current frame's response; for a sharp, low-noise response map the AS index becomes large, and the tracked target is considered accurate at that moment; otherwise the AS index drops markedly; when AS ≤ ζ_1·AS_ave, the current frame is considered occluded and the tracker is reinitialized through the detector; the mechanism judges whether the target is occluded and stops updating while it is, thereby improving tracking robustness.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911408004.2A CN111105444B (en) | 2019-12-31 | 2019-12-31 | Continuous tracking method suitable for grabbing underwater robot target |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911408004.2A CN111105444B (en) | 2019-12-31 | 2019-12-31 | Continuous tracking method suitable for grabbing underwater robot target |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111105444A CN111105444A (en) | 2020-05-05 |
CN111105444B true CN111105444B (en) | 2023-07-25 |
Family
ID=70424777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911408004.2A Active CN111105444B (en) | 2019-12-31 | 2019-12-31 | Continuous tracking method suitable for grabbing underwater robot target |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111105444B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112100067B (en) * | 2020-09-10 | 2023-08-25 | 北京完美赤金科技有限公司 | Regression analysis-based test method, system and storage medium |
CN114371784B (en) * | 2022-01-14 | 2023-11-03 | 天津大学 | Brain-computer interface decoding method oriented to steady-state visual evoked potential |
CN114708307B (en) * | 2022-05-17 | 2022-11-01 | 北京航天晨信科技有限责任公司 | Target tracking method, system, storage medium and device based on correlation filter |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103246896A (en) * | 2013-05-24 | 2013-08-14 | 成都方米科技有限公司 | Robust real-time vehicle detection and tracking method |
CN108090919A (en) * | 2018-01-02 | 2018-05-29 | 华南理工大学 | Improved kernel correlation filtering tracking method based on super-pixel optical flow and adaptive learning factor |
CN109360225A (en) * | 2018-10-16 | 2019-02-19 | 北京理工大学 | A kind of optimization system and method for motion model |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006318350A (en) * | 2005-05-16 | 2006-11-24 | Sony Corp | Object tracking method, program for object tracking method, recording medium where program for object tracking method is recorded, and object tracking device |
US8089407B2 (en) * | 2005-12-16 | 2012-01-03 | Alcatel Lucent | System and method for model-free position estimation and tracking |
WO2011102416A1 (en) * | 2010-02-19 | 2011-08-25 | 株式会社 東芝 | Moving object tracking system and moving object tracking method |
RU2014113049A (en) * | 2014-04-03 | 2015-10-10 | ЭлЭсАй Корпорейшн | Image processor containing a gesture recognition system with object tracking on the basis of computing signs of circuits for two or more objects |
JP6349272B2 (en) * | 2015-02-27 | 2018-06-27 | セコム株式会社 | Moving object tracking device |
EP3118814A1 (en) * | 2015-07-15 | 2017-01-18 | Thomson Licensing | Method and apparatus for object tracking in image sequences |
CN106557774B (en) * | 2015-09-29 | 2019-04-30 | 南京信息工程大学 | Real-time tracking method based on multi-channel kernel correlation filtering |
CN105760826B (en) * | 2016-02-03 | 2020-11-13 | 歌尔股份有限公司 | Face tracking method and device and intelligent terminal |
US9874931B1 (en) * | 2016-02-22 | 2018-01-23 | Rockwell Collins, Inc. | Head-tracking system and method |
KR102366779B1 (en) * | 2017-02-13 | 2022-02-24 | 한국전자통신연구원 | System and method for tracking multiple objects |
JP2018139086A (en) * | 2017-02-24 | 2018-09-06 | 三菱電機株式会社 | Correlation tracking device, correlation tracking method and correlation tracking program |
CN107644430A (en) * | 2017-07-27 | 2018-01-30 | 孙战里 | Target tracking based on adaptive feature fusion |
CN109753846A (en) * | 2017-11-03 | 2019-05-14 | 北京深鉴智能科技有限公司 | Hardware implementation system and method for target tracking |
CN108010067B (en) * | 2017-12-25 | 2018-12-07 | 北京航空航天大学 | Visual target tracking method based on a combined decision strategy |
WO2019129255A1 (en) * | 2017-12-29 | 2019-07-04 | 华为技术有限公司 | Target tracking method and device |
CN108288062B (en) * | 2017-12-29 | 2022-03-01 | 中国电子科技集团公司第二十七研究所 | Target tracking method based on kernel correlation filtering |
EP3534250B1 (en) * | 2018-01-18 | 2021-09-15 | Autel Robotics Co., Ltd. | Target detection method and unmanned aerial vehicle |
CN108573217B (en) * | 2018-03-21 | 2021-12-07 | 南京邮电大学 | Compression tracking method combined with local structured information |
CN108765458B (en) * | 2018-04-16 | 2022-07-12 | 上海大学 | Scale-adaptive sea-surface target tracking method for unmanned vessels in high sea states based on correlation filtering |
CN108830879A (en) * | 2018-05-29 | 2018-11-16 | 上海大学 | Correlation filtering target tracking method for unmanned surface vessels at sea, suitable for occlusion scenes |
CN109308713B (en) * | 2018-08-02 | 2021-11-19 | 哈尔滨工程大学 | Improved kernel correlation filtering underwater target tracking method based on forward-looking sonar |
CN109685073A (en) * | 2018-12-28 | 2019-04-26 | 南京工程学院 | Scale-adaptive target tracking algorithm based on kernel correlation filtering |
CN110569723A (en) * | 2019-08-02 | 2019-12-13 | 西安工业大学 | Target tracking method combining feature fusion and model updating |
CN110533690B (en) * | 2019-08-08 | 2022-02-11 | 河海大学 | Kernel correlation filtering underwater target tracking method based on feature fusion and learning rate optimization |
Also Published As
Publication number | Publication date |
---|---|
CN111105444A (en) | 2020-05-05 |
Similar Documents
Publication | Title |
---|---|
CN110232350B | Real-time water surface multi-moving-object detection and tracking method based on online learning |
Huang et al. | Faster R-CNN for marine organisms detection and recognition using data augmentation |
CN111105444B | Continuous tracking method suitable for grabbing underwater robot target |
CN107909082B | Sonar image target identification method based on deep learning technology |
CN108154118B | Target detection system and method based on adaptive combined filter and multistage detection |
CN107358623B | Correlation filtering tracking method based on saliency detection and robust scale estimation |
CN108765458B | Scale-adaptive sea-surface target tracking method for unmanned vessels in high sea states based on correlation filtering |
CN107369166B | Target tracking method and system based on multi-resolution neural network |
CN109101897A | Object detection method, system and related device for underwater robots |
CN108109162B | Multi-scale target tracking method using adaptive feature fusion |
CN110334762B | Feature matching method based on quadtree combined with ORB and SIFT |
CN109977912B | Video human body key point detection method and device, computer equipment and storage medium |
CN107316316A | Kernel correlation filtering target tracking method based on adaptive multi-feature fusion |
Zhang et al. | Multitask learning for ship detection from synthetic aperture radar images |
CN112285712A | Method for improving detection precision of ships near shore in SAR images |
CN110647802A | Remote sensing image ship target detection method based on deep learning |
CN109242019B | Rapid detection and tracking method for small optical targets on the water surface |
CN109166139B | Scale-adaptive target tracking method combined with rapid background suppression |
Mathias et al. | Underwater object detection based on bi-dimensional empirical mode decomposition and Gaussian mixture model approach |
CN113705331B | SAR ship detection method based on quaternary feature pyramid network |
CN108665484A | Hazard source identification method and system based on deep learning |
CN113822352A | Infrared dim target detection method based on multi-feature fusion |
CN112489088A | Siamese network visual tracking method based on memory unit |
CN115375737A | Target tracking method and system based on adaptive time and serialized spatio-temporal features |
CN112509014B | Robust interpolation optical flow computation method with pyramid-based occlusion detection and block matching |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |