CN107886048A - Method for tracking target and system, storage medium and electric terminal - Google Patents


Info

Publication number
CN107886048A
CN107886048A (application CN201710952448.7A); granted as CN107886048B
Authority
CN
China
Prior art keywords
target
tracking
frame image
current frame
candidate
Prior art date
Legal status
Granted
Application number
CN201710952448.7A
Other languages
Chinese (zh)
Other versions
CN107886048B (en)
Inventor
韩雪云 (Han Xueyun)
胡锦龙 (Hu Jinlong)
Current Assignee
XIAN TIANHE DEFENCE TECHNOLOGY Co Ltd
Original Assignee
XIAN TIANHE DEFENCE TECHNOLOGY Co Ltd
Priority date
Filing date
Publication date
Application filed by XIAN TIANHE DEFENCE TECHNOLOGY Co Ltd
Priority to CN201710952448.7A
Publication of CN107886048A
Application granted
Publication of CN107886048B
Status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/40: Scenes; Scene-specific elements in video content
    • G06V20/46: Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

This disclosure relates to the field of image processing, and in particular to a tracking method, a tracking system, a storage medium, and an electronic terminal. The method includes: acquiring a current frame image; selecting candidate samples within a preset range on the current frame image, centered on the coordinates of the tracking target in the previous frame image; calculating the nearest neighbor similarity of each candidate sample and selecting the candidate sample with the largest nearest neighbor similarity value as the candidate target; judging whether the candidate target is the tracking target according to multiple frame images consecutive with the current frame image; and, when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target. By evaluating the candidate samples in each frame image to choose a candidate target, and then verifying that candidate target, the disclosure accurately obtains the tracking target in each frame image and thereby achieves long-term tracking of the tracking target.

Description

Target tracking method and system, storage medium and electronic terminal
Technical Field
The present disclosure relates to the field of image processing, and in particular, to a tracking method, a tracking system, a storage medium, and an electronic terminal.
Background
In an infrared search and track system, tracking a small target under low signal-to-noise-ratio conditions is an important research topic. Such a target is strongly affected by background noise and clutter, and a low-altitude target is additionally affected by ground objects, which makes target tracking very challenging.
In addition, most existing target tracking methods can only track a target for a short time, while stable long-term tracking has rarely been studied. In practical engineering applications, however, long-term stable tracking of targets is of greater concern.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide a target tracking method, a target tracking system, a storage medium, and an electronic terminal, thereby overcoming, at least to some extent, one or more of the problems due to the limitations and disadvantages of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a target tracking method, including:
acquiring a current frame image;
selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as a center;
calculating the nearest neighbor similarity of each candidate sample, and selecting the candidate sample with the maximum nearest neighbor similarity as a candidate target;
judging whether the candidate target is the tracking target or not according to a plurality of frame images continuous with the current frame image;
and when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target.
In an exemplary embodiment of the present disclosure, the acquiring the current frame image includes:
receiving an initial image containing the tracking target information and a tracking video containing the initial image;
and intercepting a current frame image of the tracking video.
In an exemplary embodiment of the present disclosure, the calculating the nearest neighbor similarity value of each candidate sample includes:
sequentially calculating the nearest neighbor similarity value of each candidate sample according to a nearest neighbor classifier; wherein the establishing of the nearest neighbor classifier comprises:
selecting a positive sample and a negative sample within a preset range by taking the tracking target in the initial image as a center;
and establishing the nearest neighbor classifier according to the positive samples and the negative samples.
In an exemplary embodiment of the present disclosure, the selecting the positive sample within the preset range includes:
taking the central point of the tracking target as a center on the initial image, and sliding a window to select a first window in a preset neighborhood range in a mode that the step length is 1 and the size of the window is equal to that of the tracking target;
calculating the overlapping rate of each first window and the tracking target;
the first window with the overlap ratio larger than a first threshold is taken as the positive sample.
In an exemplary embodiment of the present disclosure, the selecting the negative sample within the preset range includes:
randomly selecting a preset number of second windows with the same size as the tracking target in a preset area by taking the central point of the tracking target as a center on the initial image;
calculating the overlapping rate of each second window and the tracking target;
and taking the second window with the overlapping rate smaller than a second threshold value as the negative sample.
In an exemplary embodiment of the present disclosure, the overlap rate is calculated as:
IoU = area(R_t ∩ R_c) / area(R_t ∪ R_c)
where IoU is the overlap rate, R_t is the tracking target region, and R_c is the first window or second window region.
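The overlap-rate computation above can be sketched as follows; the (x, y, w, h) box convention is an assumption, not taken from the patent.

```python
def iou(box_a, box_b):
    """Overlap rate (IoU) of two rectangles given as (x, y, w, h) tuples."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = min(ax + aw, bx + bw) - max(ax, bx)   # intersection width
    ih = min(ay + ah, by + bh) - max(ay, by)   # intersection height
    if iw <= 0 or ih <= 0:
        return 0.0
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union
```

Two identical boxes give an overlap rate of 1.0, disjoint boxes give 0.0, and partially overlapping boxes fall in between.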
In an exemplary embodiment of the present disclosure, the establishing the nearest neighbor classifier based on the positive and negative examples comprises:
normalizing the positive sample and the negative sample;
respectively calculating mean vectors of the positive samples and the negative samples after normalization processing so as to establish the nearest neighbor classifier;
wherein NNS represents the nearest neighbor similarity value, and p⁺(x_i) and p⁻(x_i) respectively represent the probability that the i-th candidate sample x_i belongs to the positive samples and the negative samples.
In an exemplary embodiment of the present disclosure, the acquiring the current frame image further includes:
judging whether the current frame image is an (m×n)-th frame image;
when the current frame image is judged to be an (m×n)-th frame image, re-acquiring the negative samples from that frame image so as to update the nearest neighbor classifier;
wherein m and n are both positive integers.
In an exemplary embodiment of the present disclosure, the determining whether the candidate target is the tracking target according to a plurality of frame images consecutive to the current frame image includes:
judging whether the frame number of the current frame image is less than a preset value;
when the frame number of the current frame image is judged to be smaller than the preset value, judging whether the difference value of the nearest neighbor similarity of the candidate target in the current frame image and the candidate target in the previous frame image is larger than a third threshold value;
when the difference value is judged to be larger than the third threshold value for each of n consecutive frame images ending with the current frame image, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
In an exemplary embodiment of the present disclosure, the determining whether the candidate target is the tracking target according to a plurality of frame images consecutive to the current frame image further includes:
judging whether the frame number of the current frame image is less than a preset value;
when the frame number of the current frame image is judged to be not less than the preset value, acquiring the minimum value of the nearest neighbor similarity of the candidate target in each frame image before the current frame image;
judging whether the nearest neighbor similarity value of the candidate target in the current frame image is smaller than the minimum value;
when the nearest neighbor similarity value is judged to be smaller than the minimum value for each of n consecutive frame images ending with the current frame image, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
In an exemplary embodiment of the present disclosure, the determining whether the candidate target is the target according to a plurality of frame images consecutive to the current frame image includes:
judging whether the nearest neighbor similarity of the candidate target in the current frame image is smaller than the nearest neighbor similarity of the candidate target in the previous frame image;
when, for each of n consecutive frame images ending with the current frame image, the nearest neighbor similarity is judged to be smaller than that of the previous frame image, and the total decrease of the nearest neighbor similarity over the n consecutive frame images is larger than a fourth threshold value, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
In an exemplary embodiment of the present disclosure, the determining whether the candidate target is the target according to a plurality of frame images consecutive to the current frame image includes:
calculating a first mean value and a first fluctuation value of the nearest neighbor similarity of the candidate target over the frame images before the current frame image;
calculating a second mean value and a second fluctuation value of the nearest neighbor similarity of the candidate target over n consecutive frame images ending with the current frame image; wherein n > 0;
and when the difference between the first mean value and the second mean value is judged to be larger than a fifth threshold value, and the difference between the first fluctuation value and the second fluctuation value is judged to be smaller than a sixth threshold value, judging that the candidate target in the current frame image is not the tracking target.
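The last criterion above (comparing the mean and the fluctuation, or "floating change", of the nearest neighbor similarity before and during the last n frames) can be sketched as follows. Treating the fluctuation value as a standard deviation is an assumption, as are the threshold values.

```python
import statistics

def lost_by_similarity_stats(nns_history, n, mean_drop_thr, fluct_diff_thr):
    """Loss-of-tracking test based on the mean/fluctuation criterion.

    nns_history holds the nearest neighbor similarity of the candidate
    target for every frame so far; the last n entries end with the
    current frame.
    """
    if len(nns_history) <= n:
        return False
    earlier, recent = nns_history[:-n], nns_history[-n:]
    mean_drop = statistics.mean(earlier) - statistics.mean(recent)
    fluct_diff = abs(statistics.pstdev(earlier) - statistics.pstdev(recent))
    # Declare loss when the similarity mean dropped sharply while its
    # fluctuation stayed comparable: the tracker is stably locked onto
    # something that is no longer the target.
    return mean_drop > mean_drop_thr and fluct_diff < fluct_diff_thr
```

A sharp, stable drop in similarity (rather than a noisy dip) is what distinguishes a genuine loss of the target from a transient disturbance.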
In an exemplary embodiment of the present disclosure, the tracking method further includes:
when the candidate target in the current frame image is judged not to be the tracking target, calculating and acquiring the position offset of the candidate target in the current frame image and the previous frame image, and judging whether the position offset is smaller than a seventh threshold value or not;
and when the position offset is judged to be smaller than the seventh threshold value, determining that the candidate target in the current frame image is the tracking target after all.
In an exemplary embodiment of the present disclosure, the tracking method further includes:
acquiring the position offset of the candidate target in the current frame image and the previous frame image;
and when the position offset is judged to be larger than an eighth threshold value, judging that the candidate target in the current frame image is not the target.
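The two position-offset checks above can be combined into one small helper; the threshold values and the (x, y) point convention are illustrative assumptions.

```python
import math

def judge_by_offset(p_cur, p_prev, keep_thr, lose_thr):
    """Position-offset double check sketched from the text.

    keep_thr and lose_thr correspond to the seventh and eighth
    threshold values mentioned above.
    """
    offset = math.hypot(p_cur[0] - p_prev[0], p_cur[1] - p_prev[1])
    if offset < keep_thr:
        # Small jump between frames: keep the candidate as the target.
        return True
    if offset > lose_thr:
        # Implausibly large jump: the candidate is not the target.
        return False
    return None  # leave the earlier similarity-based judgment unchanged
```

The offset thus serves both as a rescue (a candidate rejected by the similarity tests but barely moving is likely still the target) and as a veto (a candidate that jumped far cannot be the target).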
In an exemplary embodiment of the present disclosure, the tracking method further includes:
and when the candidate target in the current frame image is judged not to be the tracking target, establishing a search area on the current frame or a next frame image of the current frame by taking the position of the tracking target in a previous frame image as a center to detect the tracking target again.
In an exemplary embodiment of the present disclosure, the establishing a search area on a current frame or a next frame image of the current frame with a position of the tracking target in a previous frame image as a center to re-detect the tracking target includes:
on the current frame image or the next frame image of the current frame image, establishing a search area by taking the position of the tracking target in the previous frame image of the current frame image as the center, and performing morphology Top-Hat transformation on the search area so as to obtain a binary image containing a candidate target area;
performing cluster analysis on the binary image so as to obtain the number and marks of the candidate target regions;
calculating the local contrast of each candidate target area and screening according to a preset rule;
taking the screened candidate target area as the central point of the candidate target, and calculating the nearest neighbor similarity value of each candidate target according to the nearest neighbor classifier; judging whether each candidate target is the tracking target according to a preset rule;
and when a candidate target is judged to be the tracking target, tracking according to that candidate target.
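The re-detection steps above (Top-Hat transformation of the search area, binarization, and cluster analysis of the resulting regions) can be sketched as follows. The use of SciPy, the function name, and all parameter values are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from scipy import ndimage

def redetect_candidates(gray, center, search_half=20, k=5, bin_thr=30):
    """Return centroids of candidate target regions in the search area.

    center is the (row, col) position of the tracking target in the
    previous frame image.
    """
    cy, cx = center
    h, w = gray.shape
    y0, x0 = max(cy - search_half, 0), max(cx - search_half, 0)
    y1, x1 = min(cy + search_half, h), min(cx + search_half, w)
    roi = gray[y0:y1, x0:x1].astype(float)

    # White Top-Hat: image minus its grey opening; highlights small
    # bright structures such as a dim point target.
    tophat = roi - ndimage.grey_opening(roi, size=(k, k))

    # Binarize, then label connected regions ("cluster analysis").
    binary = tophat > bin_thr
    labels, n = ndimage.label(binary)
    centroids = ndimage.center_of_mass(binary, labels, range(1, n + 1))

    # Shift centroids back to full-image coordinates.
    return [(y0 + r, x0 + c) for r, c in centroids]
```

Each returned centroid would then be scored with the nearest neighbor classifier (and the local-contrast screening the text describes) to decide whether the tracking target has been reacquired.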
In an exemplary embodiment of the present disclosure, the method further comprises:
and when the tracking target is not detected in the search area within a preset time, sending instruction information so that a detection module can detect the tracking target again according to the instruction information.
According to a second aspect of the present disclosure, there is provided a target tracking system comprising:
the detection module is used for acquiring tracking target position information on the initial frame image;
the tracking module is used for tracking the tracking target according to the initial frame image;
and the re-detection module is used for establishing a search area on the current frame or the next frame of image of the current frame by taking the position of the tracking target in the previous frame of image as the center so as to detect the tracking target again when the tracking loss is judged in the current frame of image.
In an exemplary embodiment of the present disclosure, the tracking module includes:
the image acquisition module is used for acquiring a current frame image;
the candidate sample acquisition module is used for selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as the center;
the candidate target selection module is used for calculating the nearest neighbor similarity of each candidate sample and selecting the candidate sample with the maximum nearest neighbor similarity as a candidate target;
the target judgment module is used for judging whether the candidate target is the tracking target according to a multi-frame image continuous with the current frame image;
and the tracking execution module is used for tracking according to the coordinate information of the tracking target when the candidate target is judged to be the tracking target.
In an exemplary embodiment of the present disclosure, the tracking module further includes:
and the result judging module is used for judging whether the judgment result of the tracking target is correct or not according to the position offset of the candidate target in the current frame image and the previous frame image.
According to a third aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described object tracking method.
According to a fourth aspect of the present disclosure, there is provided an electronic terminal comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the following via execution of the executable instructions:
acquiring a current frame image;
selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as a center;
calculating the nearest neighbor similarity of each candidate sample, and selecting the candidate sample with the largest nearest neighbor similarity as a candidate target;
judging whether the candidate target is the tracking target or not according to a plurality of frame images continuous with the current frame image;
and when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target.
In the target tracking method provided by an embodiment of the present disclosure, candidate samples are selected from a current frame image, the nearest neighbor similarity of each candidate sample is calculated, the candidate sample with the highest nearest neighbor similarity is selected as a candidate target in the current frame image, and the candidate target is determined, so as to accurately determine a tracking target in the current frame image. The candidate target is selected by judging the candidate sample in each frame of image, and the candidate target is judged, so that the tracking target in each frame of image is accurately obtained, and the long-time tracking of the tracking target is realized.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It should be apparent that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived by those of ordinary skill in the art without inventive effort.
FIG. 1 schematically illustrates a target tracking method in an exemplary embodiment of the disclosure;
fig. 2 schematically illustrates a nearest neighbor similarity calculation method in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow chart of a method of confirming whether a candidate target is a tracking target in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates another schematic diagram of a target tracking method in an exemplary embodiment of the disclosure;
FIG. 5 is a diagram schematically illustrating a re-detection method in a target tracking method according to an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a diagram of an initial image received in an exemplary embodiment of the present disclosure;
FIG. 7 is a schematic diagram illustrating an initial state of detection of a tracking target in an exemplary embodiment of the present disclosure;
FIG. 8 is a diagram schematically illustrating a tracking result of one frame in a stable tracking process in an exemplary embodiment of the present disclosure;
FIG. 9 is a schematic diagram illustrating an impending loss of a tracking target in an exemplary embodiment of the disclosure;
FIG. 10 is a schematic diagram illustrating a state of reacquiring a tracked target in an exemplary embodiment of the present disclosure;
fig. 11 is a diagram schematically illustrating a tracking result of one frame in the re-stabilization tracking process in the exemplary embodiment of the present disclosure;
FIG. 12 is a schematic diagram illustrating an acknowledgment tracking loss state in an exemplary embodiment of the disclosure;
fig. 13 schematically illustrates a state diagram of reacquiring a tracking target in an exemplary embodiment of the present disclosure;
FIG. 14 schematically illustrates yet another state diagram during steady tracking in an exemplary embodiment of the disclosure;
FIG. 15 schematically illustrates a component schematic of a target tracking system in an exemplary embodiment of the disclosure;
FIG. 16 schematically illustrates a component schematic of a detection module in an exemplary embodiment of the disclosure;
FIG. 17 schematically illustrates a workflow diagram of a target tracking system in an exemplary embodiment of the present disclosure;
FIG. 18 schematically illustrates a schematic view of a target tracking device in an exemplary embodiment of the disclosure;
FIG. 19 is a schematic illustration of a further exemplary object tracking device in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The exemplary embodiment first provides a target tracking method, which can be applied to long-term tracking or early warning of targets in the military or civil field. The target tracking method can achieve long-term stable tracking of a target with a long detection distance, low imaging contrast, and small imaging size, that is, long-term stable tracking of a low, small and slow target, for example an aircraft such as a drone, a helicopter, or a reconnaissance aircraft. Referring to fig. 1, the target tracking method may include the following steps:
S1, acquiring a current frame image;
S2, selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as a center;
S3, calculating the nearest neighbor similarity of each candidate sample, and selecting the candidate sample with the maximum nearest neighbor similarity as a candidate target;
S4, judging whether the candidate target is the tracking target or not according to a plurality of frame images continuous with the current frame image;
and S5, when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target.
In the target tracking method provided by the present exemplary embodiment, candidate samples are selected from the current frame image, the nearest neighbor similarity of each candidate sample is calculated, the candidate sample with the highest nearest neighbor similarity is selected as the candidate target in the current frame image, and the candidate target is determined, so as to accurately determine the tracking target in the current frame image. The candidate target is selected by judging the candidate sample in each frame of image, and the candidate target is judged, so that the tracking target in each frame of image is accurately obtained, and the long-time tracking of the low, small and slow target is realized.
Hereinafter, each step of the target tracking method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
Step S1, acquiring a current frame image.
In this exemplary embodiment, the above-mentioned target tracking method may receive an initial image including tracking target information and a tracking video including the initial image at the time of initial tracking. The initial image may include coordinate information of the tracking target and information of the tracking target, such as coordinates of a center point of the tracking target and information of a width and a height of the tracking target.
And tracking the tracking target according to the received information of the tracking target contained in the initial image, and using the tracking target as an initialization image for judging the tracking target position information in each subsequent frame image. The information of the tracking target in the initial image can be automatically acquired through a detection algorithm, and can also be identified in the initial image in a manual framing mode. Referring to fig. 6, the tracked objects are identified by boxes. Meanwhile, each frame of image in the tracking video is sequentially intercepted, and the position of the tracking target is sequentially judged for each frame of image. The above-mentioned acquisition mode of the tracking target in the initial image is not particularly limited by the present disclosure.
And S2, selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as the center.
In the present exemplary embodiment, after tracking has started, when tracking is performed on the n-th frame image (n > 1), the position of the central point of the tracking target in the (n-1)-th frame image is first obtained. Taking that central point as the center on the current frame image, windows are selected as candidate samples by sliding within a neighborhood of preset size, with a step size of 1 and a window size equal to the size of the tracking target in the (n-1)-th frame image.
For example, in the initial tracking, the tracking is started according to the initial image shown in fig. 6, and in the second frame image, the candidate samples are selected in the neighborhood range of 21 × 21 on the second frame image with the coordinates of the central point of the tracking target on the initial image as the center, so that a total of 441 candidate samples can be obtained. In other exemplary embodiments of the present disclosure, the neighborhood range with different sizes may also be selected according to specific situations, and the window may be selected in other selection manners, which are not specifically limited by the present disclosure.
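The candidate-sample selection of step S2 can be sketched as follows. The function signature is an assumption, radius=10 gives the 21 × 21 neighborhood (441 samples) of the example, and min-max normalization of each patch to the range 0 to 1 is one plausible reading of the normalization the embodiment applies.

```python
import numpy as np

def candidate_windows(frame, center, tw, th, radius=10):
    """Slide a tw x th window with step 1 over a (2*radius+1) square
    neighbourhood of `center` (row, col) and return the candidate
    patches together with their top-left corners."""
    cy, cx = center
    h, w = frame.shape
    samples = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y = cy + dy - th // 2   # top-left corner of the window
            x = cx + dx - tw // 2
            if 0 <= y and y + th <= h and 0 <= x and x + tw <= w:
                patch = frame[y:y + th, x:x + tw].astype(float)
                # Normalize each patch to [0, 1] for illumination robustness.
                rng = patch.max() - patch.min()
                if rng > 0:
                    patch = (patch - patch.min()) / rng
                samples.append(((y, x), patch))
    return samples
```

Windows that would fall outside the image border are simply skipped, so near the image edge fewer than 441 candidates may be produced.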
Based on the above, after the candidate samples are selected within the preset range, normalization processing may be performed on each candidate sample so that its pixel values are mapped to the range 0 to 1. Normalizing the candidate samples makes their selection insensitive to illumination changes, so the tracking method provided by this embodiment is robust to illumination changes.
And S3, calculating the nearest neighbor similarity of each candidate sample, and selecting the candidate sample with the maximum nearest neighbor similarity as a candidate target.
In this example embodiment, as shown in fig. 2, the calculating the nearest neighbor similarity specifically includes:
Step S31, selecting positive samples and negative samples within a preset range, taking the tracking target in the initial image as the center;
step S32, establishing the nearest neighbor classifier according to the positive sample and the negative sample;
and step S33, sequentially calculating the nearest neighbor similarity value of each candidate sample according to the nearest neighbor classifier.
For example, the selecting the positive sample in the preset range in the step S31 may specifically include:
taking the central point of the tracking target as a center on the initial image, and sliding a window to select a first window in a preset neighborhood range in a mode that the step length is 1 and the size of the window is equal to that of the tracking target;
calculating the overlapping rate of each first window and the tracking target;
the first window with the overlap ratio larger than a first threshold is taken as the positive sample.
The selecting the negative sample within the preset range may specifically include:
randomly selecting a preset number of second windows with the same size as the tracking target in a preset area by taking the central point of the tracking target as a center on the initial image;
calculating the overlapping rate of each second window and the tracking target;
and taking the second window with the overlapping rate smaller than a second threshold value as the negative sample.
The overlap rate IoU may be defined as:

IoU = Area(R_t ∩ R_c) / Area(R_t ∪ R_c)

wherein IoU is the overlap rate, R_t is the tracking target region, and R_c is the first window region or the second window region.
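As a minimal sketch, the overlap rate of two axis-aligned windows can be computed as intersection area over union area (the `(x, y, w, h)` box convention is an assumption made here):

```python
def iou(box_a, box_b):
    """Overlap rate (IoU) of two axis-aligned boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = max(0, min(ax + aw, bx + bw) - max(ax, bx))   # intersection width
    ih = max(0, min(ay + ah, by + bh) - max(ay, by))   # intersection height
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

# Two 10x10 boxes offset by 5 px horizontally: inter = 50, union = 150
r = iou((0, 0, 10, 10), (5, 0, 10, 10))
```
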
For example, 121 first windows are selected by sliding in a neighborhood of 11 × 11 size with the center point of the target as the center on the initial frame image in a manner that the step size is 1 and the window size is equal to the target size. Then, the overlapping rate of each first window and the target area is calculated, and the first window with the overlapping rate larger than 0.8 is taken as a final positive sample.
And randomly picking out 300 second windows with the same size as the target in an area with the target center point as the center and the radius between 5 and 30 on the initial frame image. And then, calculating the overlapping rate of each second window and the target, and taking the second window with the overlapping rate less than 0.2 as a final negative sample.
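The sample-selection procedure above can be sketched as follows. Because all windows share the target's size, the overlap rate depends only on the shift `(dx, dy)`; the 16×16 target size, the thresholds and the distance test are illustrative parameters:

```python
import random

def select_samples(cx, cy, w, h, pos_thresh=0.8, neg_thresh=0.2,
                   radius=5, n_neg=300, r_min=5, r_max=30, seed=0):
    """Collect positive windows by sliding over an 11x11 neighborhood
    (offsets -5..+5, step 1) and negative windows at random offsets whose
    distance from the target center lies in [r_min, r_max]."""
    def overlap(dx, dy):
        # IoU of the target box with a same-size box shifted by (dx, dy)
        iw = max(0, w - abs(dx))
        ih = max(0, h - abs(dy))
        inter = iw * ih
        return inter / (2 * w * h - inter)

    positives = [(cx + dx, cy + dy)
                 for dx in range(-radius, radius + 1)
                 for dy in range(-radius, radius + 1)
                 if overlap(dx, dy) > pos_thresh]

    rng = random.Random(seed)
    negatives = []
    while len(negatives) < n_neg:
        dx = rng.randint(-r_max, r_max)
        dy = rng.randint(-r_max, r_max)
        if not r_min <= (dx * dx + dy * dy) ** 0.5 <= r_max:
            continue
        if overlap(dx, dy) < neg_thresh:
            negatives.append((cx + dx, cy + dy))
    return positives, negatives

# 16x16 target centered at (100, 100): of the 121 slid windows only those
# shifted by at most one pixel along a single axis overlap more than 0.8
pos, neg = select_samples(100, 100, 16, 16)
```
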
The step S32 may specifically include:
s321, normalizing the positive sample and the negative sample.
Step S322, respectively calculating the mean vectors of the normalized positive samples and negative samples, and establishing the nearest neighbor classifier from them.
In this exemplary embodiment, considering that the tracking target may be affected by illumination changes during flight, all the obtained positive and negative samples are normalized so that their pixel values lie between 0 and 1 before the nearest neighbor classifier is established. The pixel values of the positive and negative samples and of the candidate samples then all lie between 0 and 1 and correspond directly when the nearest neighbor similarity of a candidate sample is solved, thereby ensuring the stability of the tracking method.
Specifically, the solving of the above-mentioned normalized mean vectors of the positive samples and the negative samples may specifically include the following:
let Pex = { px) in normalized positive sample set 1 ,px 2 ,...,px m The normalized negative sample set is Nex = { nx = 1 ,nx 2 ,...,nx k And f, the normalized mean vector of the positive sample and the negative sample is:
wherein, the first and the second end of the pipe are connected with each other,respectively representing mean vectors of the normalized positive sample and the normalized negative sample; m and k respectively represent the number of the positive and negative samples after normalization, px i 、nx j Respectively representing the ith positive sample and the jth negative sample after normalization.
Based on the above, the nearest neighbor classifier is established as:

NNS(x_i) = p_i+ / (p_i+ + p_i−)

wherein NNS represents the nearest neighbor similarity value, and p_i+ and p_i− respectively represent the probability that the ith candidate sample x_i belongs to the positive samples and to the negative samples.
The quantities p_i+ and p_i− are solved as:

p_i+ = NCC(x_i, p̄),  p_i− = NCC(x_i, n̄)

wherein NCC(x_i, p̄) and NCC(x_i, n̄) respectively represent the normalized cross-correlation coefficient (NCC) values between the candidate sample x_i and the mean vectors of the normalized positive and negative samples, denoted p̄ and n̄.
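Putting the pieces together, the classifier can be sketched as below. The relative-similarity form of the score and the clamping of negative NCC values to zero are assumptions made here for the sketch; the class and function names are illustrative:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-size patches."""
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

class NearestNeighborClassifier:
    def __init__(self, positives, negatives):
        # steps S321-S322: samples are assumed already normalized to [0, 1]
        self.p_mean = np.mean(positives, axis=0)   # positive mean vector
        self.n_mean = np.mean(negatives, axis=0)   # negative mean vector

    def nns(self, candidate):
        # clamp negative NCC to 0 so the score stays in [0, 1]
        # (an implementation choice, not prescribed by the text)
        p_plus = max(0.0, ncc(candidate, self.p_mean))
        p_minus = max(0.0, ncc(candidate, self.n_mean))
        total = p_plus + p_minus
        return p_plus / total if total else 0.0

# Toy patches: a candidate resembling the positive mean scores near 1,
# one resembling the negative mean scores near 0
pos = [np.array([[1.0, 0.0], [0.0, 0.0]]), np.array([[0.9, 0.1], [0.0, 0.0]])]
neg = [np.array([[0.0, 0.0], [0.0, 1.0]]), np.array([[0.0, 0.1], [0.1, 0.9]])]
clf = NearestNeighborClassifier(pos, neg)
target_like = clf.nns(np.array([[1.0, 0.0], [0.0, 0.1]]))
clutter_like = clf.nns(np.array([[0.0, 0.0], [0.1, 1.0]]))
```
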
When the positive and negative samples are selected, the sample windows are first gathered one by one, and unsuitable windows are then removed using the overlap rate between each window and the target. This reduces computational complexity and improves the quality of the positive and negative samples, thereby ensuring the quality and effectiveness of the nearest neighbor classifier.
The Nearest Neighbor Similarity (NNS) value of each candidate sample is calculated by the nearest neighbor classifier, and the candidate sample with the maximum NNS value is selected. Its position is the position where the target is most likely to appear in the current frame image, so it is taken as the candidate target in the current frame image.
For example, the 441 candidate samples are respectively input into the nearest neighbor classifier established in the above step, and the nearest neighbor similarity value of each candidate sample is calculated. A larger NNS value indicates a greater likelihood that the candidate sample belongs to the target. Therefore, the candidate sample with the largest NNS value is selected from all the candidate samples, and at this time, the candidate sample is most likely to be the tracking target, that is, the position of the candidate sample is the most likely position of the current frame target.
Further, in order to ensure that the nearest neighbor classifier can adapt to the change of a complex background, thereby ensuring the accuracy of the judgment of the tracking target on each frame of image, the tracking method may further include:
judging whether the current frame image is the (m × n)th frame image;
when the current frame image is judged to be the (m × n)th frame image, re-acquiring the negative samples from the (m × n)th frame image so as to update the nearest neighbor classifier;
wherein m and n are both positive integers.
Specifically, in an actual scene the target is far away from the detector, so its image is blurred, small in size and shape, and low in contrast. To avoid drift during tracking, the nearest neighbor classifier can therefore be updated by refreshing only the negative samples while leaving the positive samples unchanged. For example, the negative samples are updated once every 5, 8, 10 or 15 frames and the positive samples are never updated, which preserves tracking accuracy while avoiding drift. After the nearest neighbor classifier is updated, steps S1-S4 are repeated until tracking stops.
For example, let m = 5, 8, 10 or 15. When n = 1, i.e., when the 5th, 8th, 10th or 15th frame image is being tracked, the current frame image is the last frame of the current cycle and still uses the current nearest neighbor classifier. At this point, however, negative samples can be re-selected on the current frame image and the nearest neighbor classifier re-established from the existing positive samples and the newly generated negative samples; the updated classifier is then applied in the next cycle, i.e., from frame 6, 9, 11 or 16.
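The update schedule described above can be sketched as a driver loop that refreshes only the negative model at the end of each m-frame cycle; the classifier hook `update_negatives` and the stand-in classifier are hypothetical names introduced for the sketch:

```python
def track_sequence(frames, clf, select_negatives, m=10):
    """Run per-frame tracking and refresh only the negative model at the
    end of every m-frame cycle; the positive samples stay fixed."""
    updated_at = []
    for idx, frame in enumerate(frames, start=1):
        # ... steps S1-S4 would run on `frame` here ...
        if idx % m == 0:                        # last frame of the cycle
            clf.update_negatives(select_negatives(frame))
            updated_at.append(idx)              # classifier applies from idx+1
    return updated_at

class _RecordingClassifier:
    """Stand-in classifier that just counts negative-model refreshes."""
    def __init__(self):
        self.refreshes = 0
    def update_negatives(self, negatives):
        self.refreshes += 1

clf = _RecordingClassifier()
updated = track_sequence([None] * 25, clf, lambda frame: [], m=10)
```
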
Based on the above, in order to adapt to changes in the motion of the tracking target and in its background area, the sampling range of the candidate samples may be adaptively corrected when the tracked images are determined to have completed a cycle. For example, the sampling range of the candidate target may be expanded or reduced according to the motion of the tracking target, or the boundary of the sampling range may be changed to a circular, rectangular, elliptical or irregular boundary according to changes in the background. The specific correction can be adjusted to the real-time situation, and this disclosure places no particular limitation on it.
And S4, judging whether the candidate target is the tracking target or not according to a plurality of frame images continuous with the current frame image.
In the present exemplary embodiment, after the candidate object is determined in step S3, the correctness of the candidate object needs to be determined. Specifically, the determination may be made by the following first determination method:
step S41-1, judging whether the frame number of the current frame image is less than a preset value;
step S41-2, when the frame number of the current frame image is judged to be less than the preset value, judging whether the difference value of the nearest neighbor similarity of the candidate target in the current frame image and the candidate target in the previous frame image is greater than a third threshold value;
Step S41-3, when the consecutive n frame images ending at the current frame image are all judged to be larger than the third threshold, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
After the candidate target is obtained, the number of frames of the current frame image may be determined. For example, counting starts from the initial image; if the frame number of the current frame image is judged to be less than 10, the first judgment method may be used. Assume the current frame image is the nth frame and the nearest neighbor similarity of its candidate target is NNS_n. It can then be judged whether the difference between NNS_n and NNS_{n-1}, the nearest neighbor similarity value of the candidate target in the (n-1)th frame image (i.e., the tracking target of that frame), is greater than the third threshold. If the third threshold is 0.2 and the current frame is the 9th frame image, it is judged whether the difference between the nearest neighbor similarity values of the candidate targets of the 8th and 9th frames is greater than 0.2. When the nearest neighbor similarity differences between the 8th and 9th, 7th and 8th, 6th and 7th, 5th and 6th, and 4th and 5th frames are all judged to be greater than 0.2, it is judged that the candidate target in the 9th frame image is not the tracking target. That is, when all of the consecutive 5 frame images ending at the current frame image are judged to exceed the third threshold, the current frame image is judged to have lost tracking.
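The first judgment method can be sketched as a check for a sustained drop of the NNS value over the last n frame-to-frame transitions; the history lists and parameter defaults below are illustrative:

```python
def lost_by_consecutive_drops(nns_history, n=5, third_threshold=0.2):
    """Declare the track lost when the NNS value dropped by more than
    `third_threshold` between every pair of adjacent frames over the
    last n transitions, with the current frame as the last one."""
    if len(nns_history) < n + 1:
        return False                     # not enough frames to decide
    recent = nns_history[-(n + 1):]
    return all(prev - cur > third_threshold
               for prev, cur in zip(recent, recent[1:]))

steady = [0.9, 0.88, 0.9, 0.89, 0.9]
collapsing = [0.95, 0.7, 0.45, 0.2]
```
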
On the contrary, if the situation does not exist through judgment, the candidate target in the current frame image is judged to be the tracking target, and the current frame image is successfully tracked.
While the current frame image is being judged, all frame images before it can be considered to have been tracked successfully and to be in a normal tracking state. At this time, the candidate target in each frame image before the current frame image is the tracking target of that frame.
In addition, in the present exemplary embodiment, the step S4 may further include a second determination method, specifically including:
s42-1, judging whether the frame number of the current frame image is less than a preset value;
Step S42-2, when the frame number of the current frame image is judged to be larger than the preset value, acquiring the minimum value NNS_min-n of the nearest neighbor similarity of the candidate targets in the frame images before the current frame image;
Step S42-3, judging whether the nearest neighbor similarity value NNS_n of the candidate target in the current frame image is smaller than the minimum value NNS_min-n;
Step S42-4, when the consecutive n frame images ending at the current frame image are all judged to be smaller, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
For example, when n = 5, the preset value is 10 frames. If the current frame image is the 20th frame counted from the initial image, the nearest neighbor similarity of its candidate target is NNS_20, and the minimum nearest neighbor similarity during normal tracking is NNS_min-20. NNS_20 can now be compared with NNS_min-20. If NNS_20 < NNS_min-20, and likewise NNS_19 < NNS_min-19, NNS_18 < NNS_min-18, NNS_17 < NNS_min-17 and NNS_16 < NNS_min-16, i.e., NNS_n < NNS_min-n holds for the 5 consecutive frames ending at the current frame image, it is determined that the candidate target in the 20th frame is not the tracking target and that the frame has lost tracking.
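The second judgment method can be sketched as a running-minimum test: the track is declared lost when each of the last n frames falls below the minimum NNS seen over all of its preceding frames (the sample histories are illustrative):

```python
def lost_by_below_running_min(nns_history, n=5):
    """Lost when, for each of the last n frames, the frame's NNS lies
    below the minimum NNS observed over all frames preceding it."""
    if len(nns_history) <= n:
        return False
    for i in range(len(nns_history) - n, len(nns_history)):
        if i == 0 or nns_history[i] >= min(nns_history[:i]):
            return False
    return True

normal = [0.9, 0.8, 0.85, 0.7, 0.9, 0.5, 0.4, 0.3]   # recovers mid-sequence
fading = [0.9, 0.8, 0.85, 0.7, 0.6, 0.5, 0.4, 0.3]   # keeps undercutting
```
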
On the contrary, if the situation does not exist, the candidate target in the current frame image is judged to be the tracking target, and the frame image is successfully tracked.
In other exemplary embodiments of the present disclosure, the first and second determination methods may not determine the number of frames of the current image, that is, the first and second determination methods may also be applied to the determination of each frame image.
In addition, in this exemplary embodiment, the step S4 may further include a third determination method, specifically including:
step S43-1, judging whether the nearest neighbor similarity of the candidate target in the current frame image is smaller than the nearest neighbor similarity of the candidate target in the previous frame image;
Step S43-2, when the consecutive n frame images ending at the current frame image are all judged to be smaller, and the decline of the nearest neighbor similarity over these n frame images is larger than a fourth threshold, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
When the current frame image is determined to be successfully tracked by the second determination method, the current frame image may be determined again by the third determination method. For example, when n =5, the fourth threshold value is 1.5 times the average floating variation of the NNS value at the time of normal tracking. Wherein, the floating change can be the difference value of NNS values of two adjacent frames of images.
Assume that the current frame is frame 35. If NNS is judged 35 <NNS 34 And determining NNS 34 <NNS 33 、NNS 33 <NNS 32 、NNS 32 <NNS 31 And NNS 31 <NNS 30 I.e. NNS of successive 5-frame images n And if the maxNNS-minNNS is more than 1.5 times of the average floating change of the NNS during normal tracking, judging that the candidate target in the current frame image is not the tracking target and the current frame image is lost in tracking.
On the contrary, if the situation does not exist, the candidate target in the current frame image is judged to be the tracking target, and the frame image is successfully tracked.
Further, in the present exemplary embodiment, the step S4 may further include a fourth determination method, including:
Step S44-1, calculating a first mean value mean_NNS_n and a first floating variation value of the nearest neighbor similarity of the candidate targets in the frame images before the current frame image;
step S44-2, calculating a second mean value and a second floating change value of the candidate target nearest neighbor similarity in the continuous n frame images taking the current frame image as the last frame; wherein n >0;
and step S44-3, when the difference value between the first average value and the second average value is judged to be larger than a fifth threshold value, and the first floating change value is smaller than the second floating change value, judging that the candidate target in the current frame image is not the tracking target.
When the third determination method determines that the current frame image is tracked correctly, the determination may be performed once more by the fourth determination method. For example, when n = 5 and the fifth threshold = 0.2, the mean of the NNS_n values of the current frame and the 5 consecutive preceding frame images is calculated as the second mean value, and the differences between the NNS_n values of adjacent pairs among these 6 frame images give the second floating variation value. The second mean value is subtracted from the first mean value mean_NNS_n; if the difference is greater than 0.2 and the first floating variation value is smaller than the second floating variation value, it is judged that the candidate target in the current frame image is not the tracking target, i.e., the current frame image has lost tracking.
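The fourth judgment method can be sketched as a mean-shift test, following the condition stated in step S44-4 (earlier fluctuation smaller than recent fluctuation); the window split and sample histories are illustrative:

```python
def lost_by_mean_shift(nns_history, n=5, fifth_threshold=0.2):
    """Lost when the mean NNS over the last n+1 frames falls more than
    `fifth_threshold` below the mean of the earlier frames, while the
    earlier fluctuation is smaller than the recent fluctuation."""
    if len(nns_history) < 2 * (n + 1):
        return False
    recent, earlier = nns_history[-(n + 1):], nns_history[:-(n + 1)]

    def float_change(xs):
        # average absolute NNS difference between adjacent frames
        return sum(abs(a - b) for a, b in zip(xs, xs[1:])) / (len(xs) - 1)

    mean_drop = sum(earlier) / len(earlier) - sum(recent) / len(recent)
    return (mean_drop > fifth_threshold
            and float_change(earlier) < float_change(recent))

stable = [0.9, 0.88, 0.9, 0.89, 0.9, 0.9]     # calm, high NNS values
erratic = [0.5, 0.6, 0.4, 0.55, 0.35, 0.5]    # low mean, large swings
```
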
On the contrary, if the judgment result does not exist, the candidate target in the current frame image is judged to be the tracking target, and the frame image is successfully tracked.
Based on the above, in order to further ensure the accuracy of the determination of the tracking target in the current frame image, after the determination by the above determination method, the determination result may be finally determined according to the position offset, and whether the determination result is misjudged may be determined. Specifically, the step S4 may further include a fifth determining method, which specifically includes:
step S45-1, when the candidate target in the current frame image is judged not to be the tracking target, calculating and acquiring the position offset of the candidate target in the current frame image and the previous frame image, and judging whether the position offset is smaller than a seventh threshold value or not;
and when the position offset is judged to be smaller than a seventh threshold value, judging that the candidate target in the current frame image is determined to be the tracking target.
S45-2, acquiring the position offset of the candidate target in the current frame image and the previous frame image;
and when the position offset is judged to be larger than an eighth threshold value, judging that the candidate target in the current frame image is not the target.
Considering that a real scene is complex and changeable, the tracking target may encounter many similar objects during flight. When the target is occluded and the candidate area contains a similar object, that similar object is likely to be chosen as the candidate sample. Judging whether the candidate sample is the tracking target by the NNS value alone is then prone to misjudgment, causing tracking errors. In addition, when the background around the target changes, for example near a high-voltage line, the solved NNS value may be strongly affected, resulting in tracking failure. To address both problems, the offset between the position of the candidate sample in the current frame image and the target position in the previous frame may be calculated. If the judgment methods of the previous steps judged that the current frame image lost tracking, i.e., that the candidate target is not the tracking target, but the offsets in both the X and Y directions are smaller than the threshold, the earlier loss judgment is considered a misjudgment: the candidate target in the current frame image is in fact correct and the current frame image is tracked normally. Conversely, if the offset in either the X or Y direction is larger than the threshold, loss of tracking in the current frame image is confirmed directly, regardless of the previous judgment result.
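This position-offset veto can be sketched as follows; the two thresholds and the function name are illustrative values, not taken from the source:

```python
def final_decision(prev_pos, cur_pos, lost_by_nns,
                   small_thresh=5.0, large_thresh=20.0):
    """Combine the NNS-based loss decision with the position offset.
    Returns True when the frame is finally judged as lost."""
    dx = abs(cur_pos[0] - prev_pos[0])
    dy = abs(cur_pos[1] - prev_pos[1])
    if lost_by_nns and dx < small_thresh and dy < small_thresh:
        return False       # small offset: earlier loss call was a misjudgment
    if dx > large_thresh or dy > large_thresh:
        return True        # large jump in either axis: confirmed lost
    return lost_by_nns     # otherwise keep the NNS-based decision
```
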
When the current frame image tracking is determined to be lost by the above determination method, the determination result may be finally confirmed by the above step S45-1 to determine whether a misdetermination occurs. If the current frame image tracking is determined to be normal by the above determination method, the determination result may be finally confirmed in step S45-2.
Referring to fig. 3, after the candidate target is determined in step S3, the number of frames of the current frame image may be determined, and when the number of frames is smaller than the preset value, the first determination method is used for determining; when the value is larger than the preset value, the second judgment method can be used for judging, and multi-level confirmation can be performed through the third judgment method and the fourth judgment method in sequence. After the judgment of each method is completed, the fifth judgment method can be used to detect whether the judgment result is misjudged and generate a final judgment result. By carrying out multi-level judgment on the candidate targets by using various judgment methods, accurate identification of the tracked targets in the image can be effectively realized.
In addition, in other exemplary embodiments of the present disclosure, the determination result of the tracking target may also be confirmed by using any one of the above determination methods alone or in combination of any several methods. The present disclosure is not limited thereto.
And S5, when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target.
After the tracking target is confirmed by the various judgment methods, stable tracking in the current frame image can be realized. Meanwhile, the current frame image may be applied for tracking of the next frame image.
Further, in order to further ensure the tracking effect and realize stable tracking for a long time, referring to fig. 4, the tracking method may further include:
and S6, when the candidate target in the current frame image is judged not to be the tracking target, establishing a search area on the current frame or the next frame image of the current frame by taking the position of the tracking target in the previous frame image as the center to detect the tracking target again.
Specifically, referring to fig. 5, step S6 may include:
step S61, establishing a search area on the current frame image or the next frame image of the current frame image by taking the position of the tracking target in the previous frame image of the current frame image as the center, and performing morphological Top-Hat transformation on the search area so as to obtain a binary image containing a candidate target area;
step S62, carrying out cluster analysis on the binary image so as to obtain the number and the marks of the candidate target areas;
s63, calculating the local contrast of each candidate target area and screening according to a preset rule;
step S64, taking the screened candidate target area as the central point of the candidate target, and calculating the nearest neighbor similarity value of each candidate target according to the nearest neighbor classifier; judging whether each candidate target is the tracking target according to a preset rule;
and step S65, when the candidate sample is judged to be the tracking target, tracking according to the candidate target.
In the exemplary embodiment, in the tracking process, the local contrast of the tracking target, the coordinate information and the center position of the tracking target and the corresponding nearest neighbor similarity value NNS in each frame of image in the tracking process can be determined n And (5) storing.
When a certain frame image is judged to have lost tracking, a search area of a certain size is selected on the current frame image or on the next frame image, with the central position of the tracking target in the previous frame image as the center point of the search area. Morphological Top-Hat transformation is performed on the search area to obtain a binary image containing candidate target areas, and cluster analysis is performed on the binary image to obtain the number and labels of the candidate target areas. For example, when the nth frame is judged to have lost tracking, a search area can be established on the nth frame image or on the (n+1)th frame image, with the position of the tracking target in the (n-1)th frame image as the center point.
Then, the local contrast of each candidate target area in the original image is calculated and compared with the local contrast of the tracking target in a continuous multi-frame image before the current frame image, and the candidate target with the contrast smaller than a preset value can be removed. The NNS value of each candidate sample can be calculated by taking the residual candidate target point as the center point of the candidate sample, and compared with the NNS of the tracking target in the continuous multi-frame image before the current frame image, and the candidate sample with the NNS value larger than another preset value can be taken as the tracking target to track the tracking target. If the tracking target is not detected, the steps S61 to S65 may be repeated to perform the detection again, and finally the tracking target is obtained.
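The Top-Hat-plus-clustering stage of the re-detection can be sketched with plain NumPy stand-ins (in practice a library routine such as OpenCV's morphological transform would be used); the structuring-element size, threshold, and flood-fill labeling are illustrative choices:

```python
import numpy as np

def top_hat(img, k=3):
    """White Top-Hat: img minus its morphological opening with a k x k
    square, which highlights small bright blobs against the background."""
    pad = k // 2
    def erode(a):
        p = np.pad(a, pad, mode='edge')
        return np.min([p[i:i + a.shape[0], j:j + a.shape[1]]
                       for i in range(k) for j in range(k)], axis=0)
    def dilate(a):
        p = np.pad(a, pad, mode='edge')
        return np.max([p[i:i + a.shape[0], j:j + a.shape[1]]
                       for i in range(k) for j in range(k)], axis=0)
    return img - dilate(erode(img))          # opening = dilation of erosion

def candidate_regions(img, thresh, k=3):
    """Binarize the Top-Hat response and label 4-connected components,
    returning (label_image, number_of_candidate_regions)."""
    binary = top_hat(np.asarray(img, dtype=float), k) > thresh
    labels = np.zeros(binary.shape, dtype=int)
    count = 0
    for y, x in zip(*np.nonzero(binary)):
        if labels[y, x]:
            continue
        count += 1
        stack = [(y, x)]
        while stack:                          # flood-fill one component
            cy, cx = stack.pop()
            if (0 <= cy < binary.shape[0] and 0 <= cx < binary.shape[1]
                    and binary[cy, cx] and not labels[cy, cx]):
                labels[cy, cx] = count
                stack += [(cy + 1, cx), (cy - 1, cx),
                          (cy, cx + 1), (cy, cx - 1)]
    return labels, count

# Two isolated bright points survive the Top-Hat and form two regions
img = np.zeros((20, 20))
img[5, 5] = 10.0
img[14, 12] = 10.0
labels, count = candidate_regions(img, thresh=5.0)
```
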
Through setting the re-detection step, after the current frame image tracking loss is judged, the current frame image can be re-detected, and the position of a tracking target in the current frame image is determined, so that the continuous tracking of each frame image is realized. Similarly, the redetection can be carried out on the next frame image of the current frame to obtain the tracking target, so that the excessive time consumption on the current frame image is avoided, the tracking real-time performance is further ensured, and the tracking result can be provided in real time; thereby realizing long-time stable tracking.
In other exemplary embodiments of the present disclosure, when the above-mentioned re-detection step is performed, if the position offset amount of the tracking target is small, the center point coordinates of the candidate target in the current frame image may be used as the center point of the search area. Although the candidate target is determined not to be the tracking target, because the position offset is not large, the difference between the actual position of the tracking target and the position of the candidate target is not large, and the tracking target can still be quickly searched in the current frame image or the next frame image of the current frame.
Further, on the basis of the above, in the present exemplary embodiment, the above target tracking method may further include:
and S7, when the tracking target is not detected in the search area within a preset time, sending instruction information so that a detection module can detect the tracking target again according to the instruction information.
When the tracking target is not found after the current frame image is redetected through the step S6 within a preset time length, an instruction message may be sent at this time, so that a detection module reacquires the tracking video or image according to the instruction message and redetects the tracking target.
Of course, in other exemplary embodiments of the present disclosure, after it is determined that the current frame image is lost in tracking in step S4, the subsequent frame image may also be directly re-detected in step S7, so as to avoid consuming more time in the current frame image detection, thereby effectively achieving tracking of the target.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 15, in the present exemplary embodiment, there is also provided a target tracking system 1, including: a detection module 11, a tracking module 12 and a re-detection module 13. Wherein:
the detection module 11 may be configured to obtain tracking target position information on an initial frame image.
The tracking module 12 may be configured to track the tracking target according to the initial frame image.
The re-detection module 13 may be configured to, when it is determined that the tracking is lost in the current frame image, establish a search area on the current frame image by taking the position of the tracking target in the previous frame image as a center, so as to re-detect the tracking target.
As shown in fig. 17, when the target tracking system is in operation, the detection module may first detect a tracking target of the received video data, and after detecting the tracking target, send information of the tracking target to the tracking module so that the tracking module can track the target for a long time. In the tracking process, the tracking state can be judged through the tracking module; and when the tracking failure of a certain frame of image is judged, the re-detection module is used for detecting the tracking target of the current frame of image again. If the tracking target is detected within the preset time, the information of the tracking target is sent to the tracking module so that the tracking module can track continuously; if the tracking target is not detected within the preset time length, an instruction message can be sent to the detection module, so that the detection module detects the video information again to obtain the position information of the tracking target.
In the present exemplary embodiment, the tracking module 12 described above may include: an image acquisition module 121, a candidate sample acquisition module 122, a candidate target selection module 123, a target judgment module 124, and a tracking execution module 125. Wherein:
the image obtaining module 121 may be configured to obtain a current frame image.
The candidate sample collection module 122 may be configured to select a candidate sample within a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as a center.
The candidate target selection module 123 may be configured to calculate a nearest neighbor similarity of each candidate sample, and select the candidate sample with the largest nearest neighbor similarity as the candidate target.
The target determination module 124 may be configured to determine whether the candidate target is the tracking target according to a multi-frame image consecutive to the current frame image.
The tracking performing module 125 may be configured to perform tracking according to the coordinate information of the tracking target when the candidate target is determined to be the tracking target.
Furthermore, in other exemplary embodiments of the present disclosure, the tracking module 12 described above further includes: and a result determination module 126.
The result determining module 126 may be configured to determine whether the determination result of the tracking target is correct according to the position offset of the candidate target in the current frame image and the previous frame image.
The specific details of each module in the target tracking system are already described in detail in the corresponding target tracking method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
In order to further verify the accuracy of the target tracking method when tracking weak, small and slow targets in different low-altitude scenes over a long time, tests were run on actually recorded data, with an unmanned aerial vehicle as the tracked target. Figs. 7-14 are schematic diagrams of a set of tracking states during long-time tracking of a target in the same scene. Referring to fig. 7, the target identified by the detection module may be used as the initial frame image for tracking. Fig. 8 shows the tracking result of a certain frame after a period of tracking. Fig. 9 is a schematic diagram of the state in which loss of tracking is confirmed as the tracking target is about to be blocked by a building. Referring to fig. 10, after 81 frames the target reappears, and the re-detection module detects the tracking target and resumes tracking. Fig. 11 shows the state of one frame image during renewed stable tracking.
Fig. 12 is a schematic diagram illustrating the state in which tracking loss is confirmed when the tracking target is again occluded by the building. Referring to fig. 13, when the detection module detects the tracking target again, that frame image may be used as a new initial tracking image. Fig. 14 shows the tracking result of a certain frame after tracking has resumed. As the whole image sequence shows, the tracking method and system provided by the present disclosure can stably track a weak, small, and slow target against a complex low-altitude background over a long period of time.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, a method, or a program product. Accordingly, various aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," a "module," or a "system."
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 18. The electronic device 600 shown in fig. 18 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 18, the electronic device 600 takes the form of a general-purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, a bus 630 connecting different system components (including the storage unit 620 and the processing unit 610), and a display unit 640.
The storage unit stores program code executable by the processing unit 610, such that the processing unit 610 performs the steps according to various exemplary embodiments of the present invention described in the "exemplary methods" section of this specification. For example, the processing unit 610 may perform step S1 as shown in fig. 1: acquiring a current frame image; step S2: selecting candidate samples within a preset range on the current frame image with the coordinate of the tracking target in the previous frame image as the center; step S3: calculating the nearest neighbor similarity of each candidate sample and selecting the candidate sample with the maximum nearest neighbor similarity as the candidate target; step S4: judging whether the candidate target is the tracking target according to a plurality of frame images consecutive to the current frame image; and step S5: when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target.
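As a non-authoritative illustration of steps S2, S3, and S5 above, the following Python sketch enumerates candidate positions around the previous-frame target coordinate and keeps the most similar one. The helper names and the stand-in similarity callable are assumptions for illustration, not part of the disclosed implementation.

```python
def candidate_samples(center, radius):
    """Step S2: candidate centers within a preset range around the
    previous-frame target coordinate."""
    cx, cy = center
    return [(cx + dx, cy + dy)
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)]

def track_step(prev_center, similarity, radius=2):
    """Steps S3/S5: score every candidate with the (assumed) nearest
    neighbor similarity callable and keep the best-scoring candidate."""
    candidates = candidate_samples(prev_center, radius)
    best = max(candidates, key=similarity)
    return best, similarity(best)

# Toy usage: a similarity that peaks at (5, 5) pulls the track there.
sim = lambda p: -((p[0] - 5) ** 2 + (p[1] - 5) ** 2)
```

In the actual method, `similarity` would be the nearest neighbor classifier of claim 3, and the accepted candidate would still pass through the multi-frame judgment of step S4 before tracking continues.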
The storage unit 620 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 6201 and/or a cache storage unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The bus 630 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions for causing a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 19, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this respect, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily appreciated that the processes illustrated in the above figures are not intended to indicate or limit the temporal order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (22)

1. A target tracking method, comprising:
acquiring a current frame image;
selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as a center;
calculating the nearest neighbor similarity of each candidate sample, and selecting the candidate sample with the maximum nearest neighbor similarity as a candidate target;
judging whether the candidate target is a tracking target or not according to a plurality of frame images continuous with the current frame image;
and when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target.
2. The object tracking method of claim 1, wherein the obtaining the current frame image comprises:
receiving an initial image containing the tracking target information and a tracking video containing the initial image;
and intercepting a current frame image of the tracking video.
3. The method of claim 2, wherein the calculating the nearest neighbor similarity value for each of the candidate samples comprises:
sequentially calculating the nearest neighbor similarity value of each candidate sample according to a nearest neighbor classifier; wherein the establishing of the nearest neighbor classifier comprises:
selecting a positive sample and a negative sample within a preset range by taking the tracking target in the initial image as a center;
and establishing the nearest neighbor classifier according to the positive samples and the negative samples.
4. The method according to claim 3, wherein the selecting the positive sample within the preset range comprises:
on the initial image, sliding a window within a preset neighborhood range centered on the central point of the tracking target, with a step length of 1 and a window size equal to the size of the tracking target, so as to select first windows;
calculating the overlapping rate of each first window and the tracking target;
and taking a first window whose overlap rate is larger than a first threshold as the positive sample.
5. The method of claim 3, wherein the selecting negative examples within the preset range comprises:
randomly selecting a preset number of second windows with the same size as the tracking target in a preset area by taking the central point of the tracking target as a center on the initial image;
calculating the overlapping rate of each second window and the tracking target;
and taking the second window with the overlapping rate smaller than a second threshold value as the negative sample.
6. The target tracking method according to claim 4 or 5, wherein the overlap rate is calculated as:
IoU = area(R_t ∩ R_c) / area(R_t ∪ R_c)
wherein IoU is the overlap rate, R_t is the tracking target region, and R_c is the first window region or the second window region.
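The overlap rate above is the standard intersection-over-union measure. A minimal Python sketch, assuming axis-aligned rectangles given as (x, y, w, h) tuples (the tuple format is an illustrative assumption, not part of the claim):

```python
def iou(r_t, r_c):
    """Overlap rate of claim 6: area of intersection over area of union
    of the tracking target region r_t and the window region r_c."""
    x1, y1, w1, h1 = r_t
    x2, y2, w2, h2 = r_c
    ix = max(0, min(x1 + w1, x2 + w2) - max(x1, x2))  # intersection width
    iy = max(0, min(y1 + h1, y2 + h2) - max(y1, y2))  # intersection height
    inter = ix * iy
    union = w1 * h1 + w2 * h2 - inter
    return inter / union if union else 0.0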
7. The method of claim 3, wherein the building the nearest neighbor classifier from the positive and negative examples comprises:
normalizing the positive sample and the negative sample;
respectively calculating mean vectors of the positive samples and the negative samples after normalization processing so as to establish the nearest neighbor classifier;
wherein NNS represents the nearest neighbor similarity value, and the two probability terms in the formula respectively represent the probability that the i-th candidate sample x_i belongs to the positive samples and the negative samples.
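One plausible reading of the classifier in claim 7 (the exact formula is not reproduced in this text) is to normalize the samples, keep one mean vector per class, and score a candidate by how much closer it lies to the positive mean than to the negative mean. The sketch below follows that reading; the L2-normalization choice and the difference-of-similarities form of NNS are assumptions, not the disclosed formula.

```python
def normalize(v):
    """One plausible normalization for claim 7: scale to unit L2 norm."""
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v] if n else list(v)

def mean_vector(samples):
    """Mean of the normalized sample vectors of one class."""
    samples = [normalize(s) for s in samples]
    dim = len(samples[0])
    return [sum(s[i] for s in samples) / len(samples) for i in range(dim)]

class NearestNeighborClassifier:
    def __init__(self, positives, negatives):
        self.mu_pos = mean_vector(positives)
        self.mu_neg = mean_vector(negatives)

    @staticmethod
    def _sim(a, b):
        # Dot product of unit vectors: a cosine-style similarity.
        return sum(x * y for x, y in zip(a, b))

    def nns(self, x):
        """Assumed NNS form: similarity to the positive mean minus
        similarity to the negative mean."""
        x = normalize(x)
        return self._sim(x, self.mu_pos) - self._sim(x, self.mu_neg)
```

With this form, a larger NNS means the candidate sample resembles the positive (target) samples more than the negative (background) samples, matching the selection rule of claim 1.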
8. The object tracking method of claim 3, wherein the obtaining the current frame image further comprises:
judging whether the current frame image is the (m×n)-th frame image;
when the current frame image is judged to be the (m×n)-th frame image, re-acquiring the negative samples by using the (m×n)-th frame image so as to update the nearest neighbor classifier;
wherein m and n are both positive integers.
9. The target tracking method according to claim 1, wherein the determining whether the candidate target is the tracking target from a plurality of frame images that are consecutive to the current frame image comprises:
judging whether the frame number of the current frame image is less than a preset value;
when the frame number of the current frame image is judged to be smaller than the preset value, judging whether the difference between the nearest neighbor similarity of the candidate target in the current frame image and that of the candidate target in the previous frame image is larger than a third threshold;
when the difference is judged to be larger than the third threshold for each of n consecutive frames of images taking the current frame image as the last frame, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
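A compact sketch of the claim-9 rule, under the assumption that the per-frame NNS values are kept in a list with the current frame last (the list representation and function name are illustrative):

```python
def lost_by_similarity_drop(nns_history, n, third_threshold):
    """Claim-9 style check: declare the target lost only when the
    frame-to-frame drop in nearest neighbor similarity exceeds the
    third threshold for n consecutive frames ending at the current
    frame. A single bad frame is not enough, which suppresses
    transient occlusions and noise."""
    if len(nns_history) < n + 1:
        return False
    recent = nns_history[-(n + 1):]
    drops = [recent[i] - recent[i + 1] for i in range(n)]
    return all(d > third_threshold for d in drops)
```

Requiring n consecutive over-threshold drops is what makes the judgment a multi-frame decision rather than a per-frame one.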
10. The target tracking method according to claim 1, wherein said determining whether the candidate target is the tracking target from a plurality of frame images consecutive to the current frame image further comprises:
judging whether the frame number of the current frame image is less than a preset value;
when the frame number of the current frame image is judged to be larger than the preset value, acquiring the minimum value of the nearest neighbor similarity of the candidate target in each frame image before the current frame image;
judging whether the nearest neighbor similarity value of the candidate target in the current frame image is smaller than the minimum value;
when the nearest neighbor similarity value is judged to be smaller than the minimum value for each of n consecutive frames of images taking the current frame image as the last frame, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
11. The target tracking method according to claim 1 or 10, wherein the determining whether the candidate target is the tracking target according to a plurality of frame images consecutive to the current frame image comprises:
judging whether the nearest neighbor similarity of the candidate target in the current frame image is smaller than the nearest neighbor similarity of the candidate target in the previous frame image;
when the nearest neighbor similarity is judged to be smaller for each of n consecutive frames of images taking the current frame image as the last frame, and the decrease in the nearest neighbor similarity over the n consecutive frames is larger than a fourth threshold, judging that the candidate target in the current frame image is not the tracking target; wherein n > 0.
12. The target tracking method according to claim 1 or 11, wherein the determining whether the candidate target is the tracking target according to a plurality of frame images consecutive to the current frame image comprises:
calculating a first mean value and a first floating change value of the nearest neighbor similarity of the candidate target in each frame image before the current frame image;
calculating a second mean value and a second floating change value of the nearest neighbor similarity of the candidate target in the continuous n frame images taking the current frame image as the last frame; wherein n >0;
and when the difference value between the first mean value and the second mean value is judged to be larger than a fifth threshold value, and the difference value between the first floating change value and the second floating change value is judged to be smaller than a sixth threshold value, judging that the candidate target in the current frame image is not the tracking target.
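The claim-12 rule compares two statistics of the NNS sequence before and during the last n frames. The sketch below uses the max-min spread as the "floating change value", which is one plausible interpretation of that term (variance would be another); the function name and signature are illustrative.

```python
def lost_by_statistics(nns_history, n, fifth_threshold, sixth_threshold):
    """Claim-12 style check: the target is judged lost when the mean
    NNS of the last n frames has dropped by more than the fifth
    threshold relative to the earlier frames, while the fluctuation
    stayed comparable (difference below the sixth threshold) --
    i.e. a stable settling onto a different, lower similarity level."""
    earlier, recent = nns_history[:-n], nns_history[-n:]
    if not earlier:
        return False
    mean = lambda xs: sum(xs) / len(xs)
    spread = lambda xs: max(xs) - min(xs)  # assumed "floating change value"
    mean_drop = mean(earlier) - mean(recent)
    spread_diff = abs(spread(earlier) - spread(recent))
    return mean_drop > fifth_threshold and spread_diff < sixth_threshold
```

The fluctuation condition distinguishes a genuine switch to a wrong, stable candidate from mere noisy scoring, where the spread would change markedly.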
13. The target tracking method according to any one of claims 9 to 12, wherein the tracking method further comprises:
when the candidate target in the current frame image is judged not to be the tracking target, calculating the position offset of the candidate target between the current frame image and the previous frame image, and judging whether the position offset is smaller than a seventh threshold;
and when the position offset is judged to be smaller than the seventh threshold, determining that the candidate target in the current frame image is the tracking target.
14. The target tracking method according to any one of claims 9 to 12, wherein the tracking method further comprises:
acquiring the position offset of the candidate target between the current frame image and the previous frame image;
and when the position offset is judged to be larger than an eighth threshold, judging that the candidate target in the current frame image is not the tracking target.
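Claims 13 and 14 both hinge on the inter-frame position offset of the candidate target. A small sketch, assuming (x, y) pixel coordinates and a Euclidean offset (the coordinate and distance conventions are assumptions):

```python
def position_offset(p_cur, p_prev):
    """Euclidean offset of the candidate target between consecutive frames."""
    return ((p_cur[0] - p_prev[0]) ** 2 + (p_cur[1] - p_prev[1]) ** 2) ** 0.5

def confirm_by_offset(p_cur, p_prev, seventh_threshold):
    """Claim 13: a candidate flagged as lost by the similarity rules is
    still accepted as the target if it barely moved between frames."""
    return position_offset(p_cur, p_prev) < seventh_threshold

def reject_by_offset(p_cur, p_prev, eighth_threshold):
    """Claim 14: a candidate is rejected outright if it jumped farther
    than a plausible inter-frame motion of a slow target."""
    return position_offset(p_cur, p_prev) > eighth_threshold
```

For the slow-moving targets this disclosure addresses, a small offset is strong evidence that the similarity-based loss judgment was a false alarm, which is why claim 13 overrides it.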
15. The target tracking method according to claim 1 or 14, characterized in that the tracking method further comprises:
and when the candidate target in the current frame image is judged not to be the tracking target, establishing a search area on the current frame or a next frame image of the current frame by taking the position of the tracking target in a previous frame image as a center to detect the tracking target again.
16. The target tracking method of claim 15, wherein the re-detecting the tracking target by establishing a search area on a current frame or a next frame image of the current frame centered on a position of the tracking target on a previous frame image comprises:
on the current frame image or the next frame image of the current frame image, establishing a search area by taking the position of the tracking target in the previous frame image of the current frame image as the center, and performing morphological Top-Hat transformation on the search area so as to obtain a binary image containing a candidate target area;
performing cluster analysis on the binary image so as to obtain the number and marks of the candidate target regions;
calculating the local contrast of each candidate target area and screening according to a preset rule;
calculating the nearest neighbor similarity value of each candidate target by taking the screened candidate target area as the central point of the candidate target; judging whether each candidate target is the tracking target according to a preset rule;
and when a candidate target is judged to be the tracking target, tracking according to the candidate target.
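The Top-Hat and cluster-analysis steps of claim 16 can be sketched in pure Python as follows; a real implementation would typically use a library such as OpenCV, and the 3×3 structuring element, the fixed threshold, and the 4-connectivity are illustrative choices, not specified by the claim.

```python
def _morph(img, op):
    """3x3 grayscale erosion (op=min) or dilation (op=max) on a
    list-of-lists image, clamping at the borders."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = op(vals)
    return out

def top_hat_candidates(img, thresh):
    """Morphological Top-Hat (image minus its opening) followed by
    thresholding, yielding a binary candidate map: small bright blobs
    are removed by the opening, so they survive the subtraction."""
    opened = _morph(_morph(img, min), max)  # opening = erosion then dilation
    return [[1 if img[y][x] - opened[y][x] > thresh else 0
             for x in range(len(img[0]))] for y in range(len(img))]

def label_regions(binary):
    """Simple 4-connected labeling standing in for the cluster analysis
    of claim 16; returns the number of candidate target regions."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    regions = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                regions += 1
                stack = [(y, x)]
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return regions
```

Each labeled region would then be scored by local contrast and NNS, as the claim describes, before one of them is accepted as the re-detected tracking target.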
17. The target tracking method of claim 15 or 16, further comprising:
and when the tracking target is not detected in the search area within a preset time, sending instruction information so that a detection module can detect the tracking target again according to the instruction information.
18. An object tracking system, comprising:
the detection module is used for acquiring tracking target position information on the initial frame image;
the tracking module is used for tracking the tracking target according to the initial frame image;
and the re-detection module is used for establishing a search area on the current frame or the next frame of image of the current frame by taking the position of the tracking target in the previous frame of image as the center so as to detect the tracking target again when the tracking loss is judged in the current frame of image.
19. The target tracking system of claim 18, wherein the tracking module comprises:
the image acquisition module is used for acquiring a current frame image;
the candidate sample acquisition module is used for selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as the center;
the candidate target selection module is used for calculating the nearest neighbor similarity of each candidate sample and selecting the candidate sample with the maximum nearest neighbor similarity as a candidate target;
the target judgment module is used for judging whether the candidate target is the tracking target according to a plurality of frames of images continuous with the current frame of image;
and the tracking execution module is used for tracking according to the coordinate information of the tracking target when the candidate target is judged to be the tracking target.
20. The target tracking system of claim 19, wherein the tracking module further comprises:
and the result judging module is used for judging whether the judgment result of the tracking target is correct or not according to the position offset of the candidate target in the current frame image and the previous frame image.
21. A storage medium having stored thereon a computer program which, when executed by a processor, implements the object tracking method according to any one of claims 1 to 17.
22. An electronic terminal, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the following via execution of the executable instructions:
acquiring a current frame image;
selecting a candidate sample in a preset range on the current frame image by taking the coordinate of the tracking target in the previous frame image as a center;
calculating the nearest neighbor similarity of each candidate sample, and selecting the candidate sample with the largest nearest neighbor similarity as a candidate target;
judging whether the candidate target is the tracking target or not according to a plurality of frame images continuous with the current frame image;
and when the candidate target is judged to be the tracking target, tracking according to the coordinate information of the tracking target.
CN201710952448.7A 2017-10-13 2017-10-13 Target tracking method and system, storage medium and electronic terminal Active CN107886048B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710952448.7A CN107886048B (en) 2017-10-13 2017-10-13 Target tracking method and system, storage medium and electronic terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710952448.7A CN107886048B (en) 2017-10-13 2017-10-13 Target tracking method and system, storage medium and electronic terminal

Publications (2)

Publication Number Publication Date
CN107886048A true CN107886048A (en) 2018-04-06
CN107886048B CN107886048B (en) 2021-10-08

Family

ID=61781572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710952448.7A Active CN107886048B (en) 2017-10-13 2017-10-13 Target tracking method and system, storage medium and electronic terminal

Country Status (1)

Country Link
CN (1) CN107886048B (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596957A (en) * 2018-04-26 2018-09-28 北京小米移动软件有限公司 Object tracking methods and device
CN108694724A (en) * 2018-05-11 2018-10-23 西安天和防务技术股份有限公司 A kind of long-time method for tracking target
CN108765452A (en) * 2018-05-11 2018-11-06 西安天和防务技术股份有限公司 A kind of detection of mobile target in complex background and tracking
CN109671103A (en) * 2018-12-12 2019-04-23 易视腾科技股份有限公司 Method for tracking target and device
CN109670462A (en) * 2018-12-24 2019-04-23 北京天睿空间科技股份有限公司 Continue tracking across panorama based on the aircraft of location information
CN109727268A (en) * 2018-12-29 2019-05-07 西安天和防务技术股份有限公司 Method for tracking target, device, computer equipment and storage medium
CN110211158A (en) * 2019-06-04 2019-09-06 海信集团有限公司 Candidate region determines method, apparatus and storage medium
CN110278484A (en) * 2019-05-15 2019-09-24 北京达佳互联信息技术有限公司 Video is dubbed in background music method, apparatus, electronic equipment and storage medium
CN110458861A (en) * 2018-05-04 2019-11-15 佳能株式会社 Object detection and tracking and equipment
CN110555405A (en) * 2019-08-30 2019-12-10 北京迈格威科技有限公司 Target tracking method and device, storage medium and electronic equipment
CN110619254A (en) * 2018-06-19 2019-12-27 海信集团有限公司 Target tracking method and device based on disparity map and terminal
CN111369590A (en) * 2020-02-27 2020-07-03 北京三快在线科技有限公司 Multi-target tracking method and device, storage medium and electronic equipment
CN111428663A (en) * 2020-03-30 2020-07-17 北京百度网讯科技有限公司 Traffic light state identification method and device, electronic equipment and storage medium
CN111487998A (en) * 2020-04-13 2020-08-04 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) Automatic target capturing method and device for two-axis four-frame photoelectric tracking equipment
CN111507999A (en) * 2019-01-30 2020-08-07 北京四维图新科技股份有限公司 FDSST algorithm-based target tracking method and device
CN111539986A (en) * 2020-03-25 2020-08-14 西安天和防务技术股份有限公司 Target tracking method and device, computer equipment and storage medium
CN112215869A (en) * 2020-10-12 2021-01-12 华中科技大学 Group target tracking method and system based on graph similarity constraint
CN112258553A (en) * 2020-09-21 2021-01-22 中国人民解放军战略支援部队航天工程大学 All-day-time target tracking method based on multi-source image fusion
WO2021036373A1 (en) * 2019-08-27 2021-03-04 北京京东尚科信息技术有限公司 Target tracking method and device, and computer readable storage medium
CN112489085A (en) * 2020-12-11 2021-03-12 北京澎思科技有限公司 Target tracking method, target tracking device, electronic device, and storage medium
CN112508962A (en) * 2020-09-27 2021-03-16 绍兴文理学院 Target image region subsequence separation method based on time correlation image sequence
WO2021208252A1 (en) * 2020-04-15 2021-10-21 上海摩象网络科技有限公司 Tracking target determination method, device, and hand-held camera
CN114511792A (en) * 2020-11-17 2022-05-17 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle ground detection method and system based on frame counting
WO2024012371A1 (en) * 2022-07-11 2024-01-18 影石创新科技股份有限公司 Target tracking method and apparatus, and device and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102163280A (en) * 2011-04-12 2011-08-24 华中科技大学 Method for identifying, tracking and converting target based on confidence degree and multi-frame judgement
US20130070105A1 (en) * 2011-09-15 2013-03-21 Kabushiki Kaisha Toshiba Tracking device, tracking method, and computer program product
CN103914853A (en) * 2014-03-19 2014-07-09 华南理工大学 Method for processing target adhesion and splitting conditions in multi-vehicle tracking process
CN105069813A (en) * 2015-07-20 2015-11-18 阔地教育科技有限公司 Stable moving target detection method and device
CN105374050A (en) * 2015-10-12 2016-03-02 浙江宇视科技有限公司 Moving target tracking recovery method and device
CN105931269A (en) * 2016-04-22 2016-09-07 海信集团有限公司 Tracking method for target in video and tracking device thereof
CN106204649A (en) * 2016-07-05 2016-12-07 西安电子科技大学 A kind of method for tracking target based on TLD algorithm
CN106326924A (en) * 2016-08-23 2017-01-11 武汉大学 Object tracking method and object tracking system based on local classification
CN107146240A (en) * 2017-05-05 2017-09-08 西北工业大学 The video target tracking method of taking photo by plane detected based on correlation filtering and conspicuousness


Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596957B (en) * 2018-04-26 2022-07-22 北京小米移动软件有限公司 Object tracking method and device
CN108596957A (en) * 2018-04-26 2018-09-28 北京小米移动软件有限公司 Object tracking methods and device
CN110458861B (en) * 2018-05-04 2024-01-26 佳能株式会社 Object detection and tracking method and device
CN110458861A (en) * 2018-05-04 2019-11-15 佳能株式会社 Object detection and tracking and equipment
CN108694724A (en) * 2018-05-11 2018-10-23 西安天和防务技术股份有限公司 A kind of long-time method for tracking target
CN108765452A (en) * 2018-05-11 2018-11-06 西安天和防务技术股份有限公司 A kind of detection of mobile target in complex background and tracking
CN110619254B (en) * 2018-06-19 2023-04-18 海信集团有限公司 Target tracking method and device based on disparity map and terminal
CN110619254A (en) * 2018-06-19 2019-12-27 海信集团有限公司 Target tracking method and device based on disparity map and terminal
CN109671103A (en) * 2018-12-12 2019-04-23 易视腾科技股份有限公司 Target tracking method and device
CN109670462A (en) * 2018-12-24 2019-04-23 北京天睿空间科技股份有限公司 Cross-panorama continuous aircraft tracking based on location information
CN109670462B (en) * 2018-12-24 2019-11-01 北京天睿空间科技股份有限公司 Cross-panorama continuous aircraft tracking based on location information
CN109727268A (en) * 2018-12-29 2019-05-07 西安天和防务技术股份有限公司 Target tracking method, device, computer equipment and storage medium
CN111507999A (en) * 2019-01-30 2020-08-07 北京四维图新科技股份有限公司 Target tracking method and device based on FDSST algorithm
CN111507999B (en) * 2019-01-30 2023-07-18 北京四维图新科技股份有限公司 Target tracking method and device based on FDSST algorithm
CN110278484A (en) * 2019-05-15 2019-09-24 北京达佳互联信息技术有限公司 Video soundtrack method, apparatus, electronic device and storage medium
CN110211158B (en) * 2019-06-04 2023-03-28 海信集团有限公司 Candidate area determination method, device and storage medium
CN110211158A (en) * 2019-06-04 2019-09-06 海信集团有限公司 Candidate region determination method, apparatus and storage medium
WO2021036373A1 (en) * 2019-08-27 2021-03-04 北京京东尚科信息技术有限公司 Target tracking method and device, and computer readable storage medium
CN110555405B (en) * 2019-08-30 2022-05-06 北京迈格威科技有限公司 Target tracking method and device, storage medium and electronic equipment
CN110555405A (en) * 2019-08-30 2019-12-10 北京迈格威科技有限公司 Target tracking method and device, storage medium and electronic equipment
CN111369590A (en) * 2020-02-27 2020-07-03 北京三快在线科技有限公司 Multi-target tracking method and device, storage medium and electronic equipment
CN111539986A (en) * 2020-03-25 2020-08-14 西安天和防务技术股份有限公司 Target tracking method and device, computer equipment and storage medium
CN111539986B (en) * 2020-03-25 2024-03-22 西安天和防务技术股份有限公司 Target tracking method, device, computer equipment and storage medium
CN111428663B (en) * 2020-03-30 2023-08-29 阿波罗智能技术(北京)有限公司 Traffic light state identification method and device, electronic equipment and storage medium
CN111428663A (en) * 2020-03-30 2020-07-17 北京百度网讯科技有限公司 Traffic light state identification method and device, electronic equipment and storage medium
CN111487998A (en) * 2020-04-13 2020-08-04 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) Automatic target capturing method and device for two-axis four-frame photoelectric tracking equipment
CN111487998B (en) * 2020-04-13 2023-07-25 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) Automatic target capturing method and device for two-axis four-frame photoelectric tracking equipment
WO2021208252A1 (en) * 2020-04-15 2021-10-21 上海摩象网络科技有限公司 Tracking target determination method, device, and hand-held camera
CN112258553A (en) * 2020-09-21 2021-01-22 中国人民解放军战略支援部队航天工程大学 All-day target tracking method based on multi-source image fusion
CN112508962A (en) * 2020-09-27 2021-03-16 绍兴文理学院 Target image region subsequence separation method based on time correlation image sequence
CN112215869A (en) * 2020-10-12 2021-01-12 华中科技大学 Group target tracking method and system based on graph similarity constraint
CN114511792A (en) * 2020-11-17 2022-05-17 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle ground detection method and system based on frame counting
CN114511792B (en) * 2020-11-17 2024-04-05 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle ground detection method and system based on frame counting
CN112489085A (en) * 2020-12-11 2021-03-12 北京澎思科技有限公司 Target tracking method, target tracking device, electronic device, and storage medium
WO2024012371A1 (en) * 2022-07-11 2024-01-18 影石创新科技股份有限公司 Target tracking method and apparatus, and device and storage medium

Also Published As

Publication number Publication date
CN107886048B (en) 2021-10-08

Similar Documents

Publication Publication Date Title
CN107886048B (en) Target tracking method and system, storage medium and electronic terminal
CN109035304B (en) Target tracking method, medium, computing device and apparatus
US10984556B2 (en) Method and apparatus for calibrating relative parameters of collector, device and storage medium
CN107992790B (en) Long-term target tracking method and system, storage medium and electronic terminal
EP2660753B1 (en) Image processing method and apparatus
CN111209978B (en) Three-dimensional visual repositioning method and device, computing equipment and storage medium
CN110349212B (en) Optimization method and device for instant positioning and map construction, medium and electronic equipment
CN110706262B (en) Image processing method, device, equipment and storage medium
CN111931864B (en) Method and system for multiple optimization of target detector based on vertex distance and cross-over ratio
CN110874853B (en) Method, device, equipment and storage medium for determining target movement
KR20210012012A (en) Object tracking methods and apparatuses, electronic devices and storage media
CN108010052A (en) Target tracking method and system in complex scenes, storage medium and electronic terminal
CN113420682A (en) Target detection method and device in vehicle-road cooperation and road side equipment
CN111476814B (en) Target tracking method, device, equipment and storage medium
CN112528927A (en) Confidence determination method based on trajectory analysis, roadside equipment and cloud control platform
CN115049954A (en) Target identification method, device, electronic equipment and medium
US20230306615A1 (en) Target tracking method, apparatus, device and storage medium
CN113052019A (en) Target tracking method and device, intelligent equipment and computer storage medium
CN113763466A (en) Loop detection method and device, electronic equipment and storage medium
CN109934185B (en) Data processing method and device, medium and computing equipment
CN111640134A (en) Face tracking method and device, computer equipment and storage device thereof
CN113869163B (en) Target tracking method and device, electronic equipment and storage medium
CN114429631B (en) Three-dimensional object detection method, device, equipment and storage medium
CN113160258B (en) Method, system, server and storage medium for extracting building vector polygon
CN112819859B (en) Multi-target tracking method and device applied to intelligent security

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant