CN112184772A - Target tracking method and device
- Publication number: CN112184772A (published 2021-01-05); granted as CN112184772B on 2024-07-09
- Application: CN202011061084.1A, filed 2020-09-30
- Authority: CN (China)
- Prior art keywords: tracking, target, frames, current frame, target image
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06F18/23—Clustering techniques
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections, by matching or filtering
- G06T2207/10016—Video; image sequence
- G06T2207/10028—Range image; depth image; 3D point clouds
- G06V2201/07—Target detection
Abstract
The invention provides a target tracking method and a target tracking device. The method comprises the following steps: performing target detection on the current frame 3D target image to obtain detection frames of N target objects in the current frame 3D target image; acquiring tracking frames of the N target objects in the current frame; judging whether the detection frames and the tracking frames of the N target objects in the current frame are successfully matched; if matching fails for the tracking frames of m target objects in the current frame, calculating a homography matrix between the tracking frames of the m target objects in the current frame 3D target image and the detection frames of the same m target objects in the corresponding previous frame; acquiring detection frames of the m target objects in the current frame according to the homography matrix, and judging whether the tracking frames of the m target objects in the current frame match these detection frames; and if the tracking frames of the m target objects in the current frame successfully match the detection frames, outputting the corresponding 3D tracking tracks. Missed detection and false detection of the target object can thereby be effectively avoided, improving the reliability of the system.
Description
Technical Field
The invention relates to the technical field of target tracking, in particular to a target tracking method and a target tracking device.
Background
In the related art, 3D target tracking algorithms are based on the Kalman algorithm and the Hungarian algorithm in a lidar 3D coordinate system: a tracking trajectory is established for target tracking by IOU (Intersection-over-Union) matching between each target's predicted position and its actual position across consecutive frames.
In practical applications, however, small target objects are sensitive to position deviation: even small-range position jitter reduces the IOU between the actual target and the predicted target, which easily causes missed detection and false detection of small target objects and greatly reduces the reliability of the system.
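To make this sensitivity concrete, the following minimal sketch (with hypothetical box sizes; none of the values are from the patent) computes the axis-aligned 3D IOU of a 0.5 m target and of a 5 m target under the same 0.2 m position jitter:

```python
def iou_3d(a, b):
    """Axis-aligned 3D IOU; boxes are (x_min, y_min, z_min, x_max, y_max, z_max)."""
    inter = 1.0
    for i in range(3):
        lo, hi = max(a[i], b[i]), min(a[i + 3], b[i + 3])
        if hi <= lo:
            return 0.0
        inter *= hi - lo
    vol = lambda c: (c[3] - c[0]) * (c[4] - c[1]) * (c[5] - c[2])
    return inter / (vol(a) + vol(b) - inter)

small = (0.0, 0.0, 0.0, 0.5, 0.5, 0.5)
print(iou_3d(small, (0.2, 0.0, 0.0, 0.7, 0.5, 0.5)))  # ~0.43: a 0.2 m jitter nearly halves the IOU
large = (0.0, 0.0, 0.0, 5.0, 5.0, 5.0)
print(iou_3d(large, (0.2, 0.0, 0.0, 5.2, 5.0, 5.0)))  # ~0.92: the same jitter barely affects a large target
```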
Disclosure of Invention
To solve the above technical problem, the invention provides a target tracking method that can effectively avoid missed detection and false detection of a target object, thereby greatly improving the reliability of the system.
The technical scheme adopted by the invention is as follows:
a target tracking method comprises: performing target detection on a current frame 3D target image to obtain detection frames of N target objects in the current frame 3D target image, wherein N is a positive integer greater than or equal to 1; acquiring tracking frames of the N target objects in the current frame 3D target image; judging whether the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched; if matching fails for the tracking frames of m target objects in the current frame 3D target image, calculating a homography matrix between the tracking frames of the m target objects in the current frame 3D target image and the detection frames of the same m target objects in the corresponding previous frame 3D target image, wherein m is a positive integer greater than or equal to 1 and less than or equal to N; acquiring detection frames of the m target objects in the current frame 3D target image according to the homography matrix, and judging whether the tracking frames of the m target objects in the current frame 3D target image match these detection frames; and if the tracking frames of the m target objects in the current frame 3D target image successfully match the detection frames, outputting the corresponding 3D tracking tracks.
Acquiring the tracking frames of the N target objects in the current frame 3D target image includes: acquiring the tracking frames of the N target objects in the previous frame of 3D target image; and predicting the tracking frames of the N target objects in the current frame 3D target image from the tracking frames of the N target objects in the previous frame 3D target image by adopting a target tracking algorithm.
The target tracking method further comprises the following steps: if matching fails for the detection frames of n target objects in the current frame 3D target image, establishing a new 3D tracking track, and recording the categories of the detection frames of the n target objects in the current frame 3D target image in a category attribute list of the new 3D tracking track, wherein n is a positive integer greater than or equal to 1 and less than or equal to N.
The target tracking method further comprises the following steps: acquiring detection frames and tracking frames of the N target objects in the current frame 2D target image; judging whether the detection frames and the tracking frames of the N target objects in the current frame 2D target image are successfully matched; and if the matching is successful, outputting a corresponding 2D tracking track, and combining the 2D tracking track and the 3D tracking track.
The target tracking method further comprises the following steps: if the matching of the tracking frames of k target objects in the current frame 2D target image fails, predicting the tracking frames of the k target objects in the next frame 2D target image according to the tracking frames of the k target objects in the current frame 2D target image, wherein k is a positive integer which is greater than or equal to 1 and less than or equal to N; acquiring detection frames of the k target objects in the next frame of 2D target image; judging whether the detection frames and the tracking frames of the k target objects in the next frame of 2D target image are successfully matched; and if the matching is successful, outputting a corresponding 2D tracking track, and combining the 2D tracking track and the 3D tracking track.
The target tracking method further comprises the following steps: if matching fails for the detection frames of p target objects in the current frame 2D target image, establishing a new 2D tracking track, and recording the categories of the detection frames of the p target objects in the current frame 2D target image in a category attribute list of the new 2D tracking track, wherein p is a positive integer greater than or equal to 1 and less than or equal to N.
A target tracking device comprises: a first acquisition module, configured to perform target detection on the current frame 3D target image to obtain detection frames of N target objects in the current frame 3D target image, wherein N is a positive integer greater than or equal to 1; a second acquisition module, configured to acquire tracking frames of the N target objects in the current frame 3D target image; a first judgment module, configured to judge whether the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched; a calculation module, configured to calculate a homography matrix between the tracking frames of m target objects in the current frame 3D target image and the detection frames of the same m target objects in the corresponding previous frame 3D target image when matching of the tracking frames of the m target objects fails, wherein m is a positive integer greater than or equal to 1 and less than or equal to N; a second judgment module, configured to acquire detection frames of the m target objects in the current frame 3D target image according to the homography matrix and to judge whether the tracking frames of the m target objects match these detection frames; and an output module, configured to output the corresponding 3D tracking tracks when the tracking frames of the m target objects successfully match the detection frames.
A computer device comprises a memory, a processor, and a computer program stored on the memory and executable on the processor; when executing the computer program, the processor implements the above target tracking method.
A non-transitory computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements the target tracking method described above.
A computer program product is provided in which instructions, when executed by a processor, perform the above-described target tracking method.
The invention has the beneficial effects that:
the invention can effectively avoid missed detection and false detection of the target object, thereby greatly improving the reliability of the system.
Drawings
FIG. 1 is a flow chart of a target tracking method of an embodiment of the present invention;
FIG. 2 is a block schematic diagram of a target tracking device according to an embodiment of the present invention;
FIG. 3 is a block schematic diagram of a target tracking device according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
FIG. 1 is a flow chart of a target tracking method according to an embodiment of the invention.
As shown in fig. 1, the target tracking method of the embodiment of the present invention may include the following steps:
s1, performing target detection on the current frame 3D target image to obtain detection frames of N target objects in the current frame 3D target image. Wherein N is a positive integer greater than or equal to 1.
Specifically, a point cloud clustering algorithm and a 3D detection network may be used to perform target detection on the current frame 3D target image to obtain the maximum circumscribed rectangular frame of each of the N target objects (for example, small target objects). For each target object, it is then judged whether the length, width and height of its maximum circumscribed rectangular frame satisfy a preset condition, for example, whether they are all less than a set threshold. If so, an inflation operation is performed: the three-dimensional coordinates of the point cloud corresponding to the maximum circumscribed rectangular frame are multiplied by a preset expansion coefficient. The detection frame of each target object in the current frame 3D target image is thereby obtained.
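A minimal sketch of this inflation step follows, assuming concrete values for the set threshold and the preset expansion coefficient (neither value is specified in the text):

```python
import numpy as np

SIZE_THRESH = 1.0   # metres; hypothetical value of the "set threshold"
EXPAND_COEF = 1.5   # hypothetical value of the "preset expansion coefficient"

def inflate_if_small(points: np.ndarray, box_dims) -> np.ndarray:
    """points: (N, 3) point cloud inside one target's maximum circumscribed
    rectangular frame; box_dims: (length, width, height) of that frame.
    If all three dimensions are below the threshold, the point coordinates
    are multiplied by the expansion coefficient, as the text describes; the
    detection frame is then taken around the inflated points."""
    if all(d < SIZE_THRESH for d in box_dims):
        return points * EXPAND_COEF
    return points
```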
S2, acquiring tracking frames of N target objects in the current frame 3D target image.
According to one embodiment of the present invention, acquiring the tracking frames of the N target objects in the current frame 3D target image includes: acquiring the tracking frames of the N target objects in the previous frame 3D target image; and predicting the tracking frames of the N target objects in the current frame 3D target image from the tracking frames of the N target objects in the previous frame 3D target image by adopting a target tracking algorithm.
Specifically, the time interval between frames may be preset, and the tracking frames of the N target objects in the current frame 3D target image are then predicted from the tracking frames of the N target objects in the previous frame 3D target image by adopting a target tracking algorithm, for example, the Kalman algorithm together with the Hungarian algorithm.
It should be noted that, if the current frame is the initial frame, the detection frames of the N target objects in the current frame 3D target image may be used directly as the tracking frames of the N target objects in the current frame 3D target image.
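As a sketch of the prediction half of this step (the Hungarian algorithm handles the matching in step S3 below), a constant-velocity Kalman predict over the preset inter-frame interval might look as follows; the state layout, interval and noise values are assumptions:

```python
import numpy as np

DT = 0.1  # preset time interval between frames; hypothetical value

F = np.eye(6)               # state: [x, y, z, vx, vy, vz] of a tracking frame's centre
F[:3, 3:] = DT * np.eye(3)  # constant-velocity motion model

def predict(state: np.ndarray, cov: np.ndarray, q: float = 1e-2):
    """One Kalman predict step: advances a previous-frame tracking frame
    to its expected position in the current frame."""
    state = F @ state
    cov = F @ cov @ F.T + q * np.eye(6)
    return state, cov
```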
S3, judging whether the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched.
Specifically, after the detection frames of the N target objects in the current frame 3D target image and the corresponding tracking frames of the N target objects in the current frame 3D target image are obtained, the intersection ratio between each detection frame and the corresponding tracking frame may be calculated and compared with a preset threshold. If the intersection ratio is greater than or equal to the preset threshold, it is judged that the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched; at this time, the categories of the detection frames of the N target objects may be recorded in the category attribute list of the tracking tracks to which the corresponding tracking frames belong, and the corresponding 3D tracking tracks are output. If the intersection ratio is less than the preset threshold, the matching is judged to have failed, and the matching result is analyzed.
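A sketch of this matching step, reusing the iou_3d helper from the Background section above; assigning detections to tracking frames with the Hungarian algorithm (here SciPy's linear_sum_assignment) and the threshold value are assumptions consistent with, but not dictated by, the text:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

IOU_THRESH = 0.3  # hypothetical value of the "preset threshold"

def match_frames(detections, tracks):
    """detections, tracks: lists of axis-aligned 3D boxes.
    Returns matched (detection, track) index pairs plus the unmatched sets,
    whose members are then handled as described in steps S4 to S6 below."""
    if not detections or not tracks:
        return [], list(range(len(detections))), list(range(len(tracks)))
    cost = np.array([[1.0 - iou_3d(d, t) for t in tracks] for d in detections])
    det_idx, trk_idx = linear_sum_assignment(cost)
    matches = []
    unmatched_det = set(range(len(detections)))
    unmatched_trk = set(range(len(tracks)))
    for d, t in zip(det_idx, trk_idx):
        if 1.0 - cost[d, t] >= IOU_THRESH:  # intersection ratio at or above threshold
            matches.append((d, t))
            unmatched_det.discard(d)
            unmatched_trk.discard(t)
    return matches, sorted(unmatched_det), sorted(unmatched_trk)
```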
S4, if matching fails for the tracking frames of m target objects in the current frame 3D target image, calculating the homography matrix between the tracking frames of the m target objects in the current frame 3D target image and the detection frames of the same m target objects in the corresponding previous frame 3D target image, where m is a positive integer greater than or equal to 1 and less than or equal to N.
S5, acquiring detection frames of m target objects in the current frame 3D target image according to the homography matrix, and judging whether the tracking frames of the m target objects in the current frame 3D target image are matched with the detection frames.
S6, if the tracking frames of the m target objects in the current frame 3D target image successfully match the detection frames, outputting the corresponding 3D tracking tracks.
Specifically, if the analysis shows that matching of the tracking frames of m target objects in the current frame 3D target image has failed, the findHomography function in the OpenCV library may be used to calculate a homography matrix between the tracking frames of the m target objects in the current frame 3D target image and the detection frames of the same m target objects in the corresponding previous frame 3D target image. According to the homography matrix, the point clouds in the detection frames of the m target objects in the previous frame 3D target image are mapped into the current frame 3D target image and divided by the corresponding expansion coefficients to predict the detection frames of the m target objects in the current frame 3D target image; the point clouds in these detection frames are thereby restored to their original sizes.
Then, it is judged whether the detection frames of the m target objects in the current frame 3D target image successfully match the tracking frames of the m target objects in the current frame 3D target image. If the matching succeeds, the corresponding 3D tracking tracks are output; if the matching fails, the method returns to the above steps, that is, the matching result is analyzed again and the corresponding tracking strategy is executed, until the matching succeeds.
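The following sketch shows one way steps S4 to S6 might be realized with OpenCV's findHomography. Using the bird's-eye-view (x, y) corners of the frames as point correspondences and leaving z unchanged are assumptions; the text says only that a homography is computed between the tracking frames and the previous-frame detection frames and applied to the point clouds:

```python
import cv2
import numpy as np

def recover_detection(prev_corners, cur_corners, prev_points, expand_coef):
    """prev_corners, cur_corners: (K, 2) matching (x, y) corner coordinates of a
    target's previous-frame detection frame and current-frame tracking frame;
    prev_points: (N, 3) point cloud inside the previous-frame detection frame."""
    H, _ = cv2.findHomography(prev_corners.astype(np.float32),
                              cur_corners.astype(np.float32))
    xy1 = np.hstack([prev_points[:, :2], np.ones((len(prev_points), 1))])
    mapped = (H @ xy1.T).T
    mapped = mapped[:, :2] / mapped[:, 2:3]  # map the points into the current frame
    mapped = mapped / expand_coef            # divide by the expansion coefficient to
                                             # restore the inflated points' original size
    return np.hstack([mapped, prev_points[:, 2:3]])  # z kept unchanged (assumption)
```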
It should be noted that if the tracking frame with a given ID fails to match for a preset number of consecutive frames (for example, ten consecutive frames), the corresponding 3D tracking track may be deleted.
Therefore, in the embodiment of the invention, when tracking-frame matching fails during target tracking detection, the homography matrix is used to predict the detection frame of the current frame so that matching can continue. The situations of missed detection and false detection of the target object are thereby effectively avoided, and the reliability of the system is greatly improved.
According to an embodiment of the present invention, the target tracking method further includes: if matching fails for the detection frames of n target objects in the current frame 3D target image, establishing a new 3D tracking track, and recording the categories of the detection frames of the n target objects in the current frame 3D target image in a category attribute list of the new 3D tracking track, wherein n is a positive integer greater than or equal to 1 and less than or equal to N.
Specifically, as another possible case, the analysis of the matching result may show that detection frames have failed to match. If it is judged that the detection frames of n target objects in the current frame 3D target image have failed to match, a new 3D tracking track (composed of at least two tracking frames with the same ID) may be established, and the categories of the detection frames of the n target objects in the current frame 3D target image are recorded in the category attribute list of the new 3D tracking track. In the new 3D tracking track, the detection frames of the n target objects in the current frame 3D target image may be used as the tracking frames of the n target objects in the current frame 3D target image; the tracking frames of the n target objects in the next frame 3D target image are predicted by adopting a target tracking algorithm and matched with the detection frames of the n target objects in the next frame 3D target image, and after the matching succeeds, the corresponding 3D tracking tracks are output.
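A sketch of the tracking-track structure this paragraph implies (an ID, a category attribute list, and the tracking frames sharing that ID); the field names are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Track3D:
    track_id: int
    frames: list = field(default_factory=list)      # tracking frames sharing this ID
    categories: list = field(default_factory=list)  # the category attribute list
    misses: int = 0                                 # consecutive failed matches; the track
                                                    # is deleted once this reaches the preset
                                                    # count (e.g. ten consecutive frames)

def start_track(next_id: int, detection_frame, category) -> Track3D:
    """An unmatched detection frame seeds a new track: it becomes the
    current-frame tracking frame and its category is recorded."""
    return Track3D(track_id=next_id, frames=[detection_frame], categories=[category])
```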
According to an embodiment of the present invention, the target tracking method further includes: acquiring detection frames and tracking frames of the N target objects in the current frame 2D target image; judging whether the detection frames and the tracking frames of the N target objects in the current frame 2D target image are successfully matched; and if the matching is successful, outputting a corresponding 2D tracking track and combining the 2D tracking track and the 3D tracking track.
Specifically, in practical applications, the detection frames of the N target objects in the current frame 3D target image may be projected into the 2D target image, and the detection frames and tracking frames of the N target objects in the current frame 2D target image are then acquired; the process is analogous to that of acquiring the detection frames and tracking frames of the N target objects in the 3D target image.
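A sketch of this projection under a pinhole camera model; the intrinsic matrix K and the lidar-to-camera extrinsics R, t are assumptions, since the text does not specify the camera model:

```python
import numpy as np

def project_box_to_image(corners_3d, K, R, t):
    """corners_3d: (8, 3) corners of a 3D detection frame in lidar coordinates;
    K: (3, 3) camera intrinsic matrix; R, t: lidar-to-camera rotation and
    translation. Returns the axis-aligned 2D detection frame in pixels."""
    cam = (R @ corners_3d.T).T + t   # lidar -> camera coordinates
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]      # perspective division
    x_min, y_min = uv.min(axis=0)
    x_max, y_max = uv.max(axis=0)
    return x_min, y_min, x_max, y_max  # bounding box of the projected corners
```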
Then, the intersection ratio between the detection frames of the N target objects in the current frame 2D target image and the corresponding tracking frames of the N target objects in the current frame 2D target image is calculated, and it is judged whether the intersection ratio reaches a preset threshold. If the intersection ratio is greater than or equal to the preset threshold, it can be judged that the detection frames and the tracking frames of the N target objects in the current frame 2D target image are successfully matched; at this time, the categories of the detection frames of the N target objects in the current frame 2D target image may be recorded in the category attribute list of the tracking tracks to which the corresponding tracking frames belong, and the corresponding 2D tracking tracks are output.
Finally, the 2D tracking track and the 3D tracking track may be merged; that is, the 2D and 3D tracking tracks with the same ID are combined, and the detection-frame category that occurs most frequently within a preset number of frames (for example, ten frames) across the 2D and 3D tracking tracks with the same ID is selected as the category of the track.
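A sketch of this category vote follows; the ten-frame window matches the example in the text, while treating the two category attribute lists symmetrically is an assumption:

```python
from collections import Counter

def merged_category(cats_2d, cats_3d, window=10):
    """For a 2D and a 3D tracking track sharing the same ID, return the
    detection-frame category that occurs most often across both category
    attribute lists within the preset window of frames."""
    votes = Counter(cats_2d[-window:]) + Counter(cats_3d[-window:])
    return votes.most_common(1)[0][0] if votes else None
```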
Therefore, in one embodiment of the invention, the 3D tracking algorithm is used for target tracking while a 2D tracking method is fused in, so that not only the point cloud information of the lidar is used but also the image information of the matching camera is incorporated. This suits scenes with many small target objects and further improves the accuracy of overall target tracking.
According to an embodiment of the present invention, the target tracking method further includes: if the matching of the tracking frames of k target objects in the current frame 2D target image fails, predicting the tracking frames of k target objects in the next frame 2D target image according to the tracking frames of k target objects in the current frame 2D target image, wherein k is a positive integer which is greater than or equal to 1 and less than or equal to N; acquiring detection frames of k target objects in a next frame of 2D target image; judging whether the detection frames and the tracking frames of the k target objects in the next frame of 2D target image are successfully matched; and if the matching is successful, outputting a corresponding 2D tracking track, and combining the 2D tracking track and the 3D tracking track.
Specifically, as a possible implementation, if the analysis shows that the tracking frames of k target objects in the current frame 2D target image have failed to match, the tracking frames of the k target objects in the next frame 2D target image are predicted from the tracking frames of the k target objects in the current frame 2D target image, and the detection frames of the k target objects in the next frame 2D target image are acquired. The tracking frames of the k target objects in the next frame 2D target image are matched with the detection frames of the k target objects in the next frame 2D target image; after the matching succeeds, the corresponding 2D tracking tracks are output, and the 2D tracking tracks and the 3D tracking tracks are combined.
It should be noted that if the tracking frame with a given ID fails to match for a preset number of consecutive frames (for example, ten consecutive frames), the corresponding 2D tracking track may be deleted.
According to another embodiment of the present invention, the target tracking method further includes: if matching fails for the detection frames of p target objects in the current frame 2D target image, establishing a new 2D tracking track, and recording the categories of the detection frames of the p target objects in the current frame 2D target image in a category attribute list of the new 2D tracking track, wherein p is a positive integer greater than or equal to 1 and less than or equal to N.
As another possible implementation, if it is judged that the detection frames of p target objects in the current frame 2D target image have failed to match, a new 2D tracking track (composed of at least two tracking frames with the same ID) may be established, and the categories of the detection frames of the p target objects in the current frame 2D target image are recorded in the category attribute list of the new 2D tracking track. In the new 2D tracking track, the detection frames of the p target objects in the current frame 2D target image may be used as the tracking frames of the p target objects in the current frame 2D target image; the tracking frames of the p target objects in the next frame 2D target image are predicted by adopting a target tracking algorithm and matched with the detection frames of the p target objects in the next frame 2D target image, and after the matching succeeds, the corresponding 2D tracking tracks are output.
To sum up, according to the target tracking method of the embodiment of the present invention, target detection is performed on the current frame 3D target image to obtain the detection frames of N target objects, and the tracking frames of the N target objects in the current frame 3D target image are acquired. Whether the detection frames and the tracking frames are successfully matched is then judged. When matching fails for the tracking frames of m target objects in the current frame 3D target image, a homography matrix between those tracking frames and the detection frames of the same m target objects in the previous frame 3D target image is calculated; detection frames of the m target objects in the current frame 3D target image are obtained according to the homography matrix, and it is judged whether the tracking frames of the m target objects match them. When the tracking frames of the m target objects in the current frame 3D target image successfully match the detection frames, the corresponding 3D tracking tracks are output. Missed detection and false detection of the target object can thereby be effectively avoided, greatly improving the reliability of the system.
Corresponding to the target tracking method of the above embodiment, the invention further provides a target tracking device.
As shown in fig. 2, the target tracking apparatus according to an embodiment of the present invention may include: the device comprises a first acquisition module 100, a second acquisition module 200, a first judgment module 300, a calculation module 400, a second judgment module 500 and an output module 600.
The first acquisition module 100 is configured to perform target detection on the current frame 3D target image to obtain detection frames of N target objects in the current frame 3D target image, where N is a positive integer greater than or equal to 1; the second acquisition module 200 is configured to acquire tracking frames of the N target objects in the current frame 3D target image; the first judgment module 300 is configured to judge whether the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched; the calculation module 400 is configured to calculate a homography matrix between the tracking frames of m target objects in the current frame 3D target image and the detection frames of the same m target objects in the corresponding previous frame 3D target image when matching of the tracking frames of the m target objects in the current frame 3D target image fails, where m is a positive integer greater than or equal to 1 and less than or equal to N; the second judgment module 500 is configured to acquire detection frames of the m target objects in the current frame 3D target image according to the homography matrix and to judge whether the tracking frames of the m target objects in the current frame 3D target image match these detection frames; and the output module 600 is configured to output the corresponding 3D tracking tracks when the tracking frames of the m target objects in the current frame 3D target image successfully match the detection frames.
According to an embodiment of the present invention, the second acquisition module 200 is specifically configured to: acquire the tracking frames of the N target objects in the previous frame 3D target image; and predict the tracking frames of the N target objects in the current frame 3D target image from the tracking frames of the N target objects in the previous frame 3D target image by adopting a target tracking algorithm.
According to an embodiment of the invention, the output module 600 is further configured to: if matching fails for the detection frames of n target objects in the current frame 3D target image, establish a new 3D tracking track, and record the categories of the detection frames of the n target objects in the current frame 3D target image in a category attribute list of the new 3D tracking track, where n is a positive integer greater than or equal to 1 and less than or equal to N.
According to an embodiment of the present invention, as shown in fig. 3, the target tracking apparatus further includes a trajectory merging module 700, which is configured to: acquire detection frames and tracking frames of the N target objects in the current frame 2D target image; judge whether the detection frames and the tracking frames of the N target objects in the current frame 2D target image are successfully matched; and if the matching is successful, output a corresponding 2D tracking track and combine the 2D tracking track and the 3D tracking track.
According to an embodiment of the present invention, the trajectory merging module 700 is further configured to: if matching fails for the tracking frames of k target objects in the current frame 2D target image, predict the tracking frames of the k target objects in the next frame 2D target image according to the tracking frames of the k target objects in the current frame 2D target image, where k is a positive integer greater than or equal to 1 and less than or equal to N; acquire the detection frames of the k target objects in the next frame 2D target image; judge whether the detection frames and the tracking frames of the k target objects in the next frame 2D target image are successfully matched; and if the matching is successful, output a corresponding 2D tracking track and combine the 2D tracking track and the 3D tracking track.
According to an embodiment of the present invention, the trajectory merging module 700 is further configured to: if matching fails for the detection frames of p target objects in the current frame 2D target image, establish a new 2D tracking track, and record the categories of the detection frames of the p target objects in the current frame 2D target image in a category attribute list of the new 2D tracking track, where p is a positive integer greater than or equal to 1 and less than or equal to N.
It should be noted that, for a more specific implementation of the target tracking apparatus according to the embodiment of the present invention, reference may be made to the above-mentioned embodiment of the target tracking method, and details are not described here again.
According to the target tracking device of the embodiment of the present invention, the first acquisition module performs target detection on the current frame 3D target image to obtain the detection frames of N target objects, and the second acquisition module acquires the tracking frames of the N target objects in the current frame 3D target image. The first judgment module judges whether the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched. When matching fails for the tracking frames of m target objects in the current frame 3D target image, the calculation module calculates the homography matrix between those tracking frames and the detection frames of the same m target objects in the previous frame 3D target image; the second judgment module acquires detection frames of the m target objects in the current frame 3D target image according to the homography matrix and judges whether the tracking frames of the m target objects match them; and when the tracking frames of the m target objects in the current frame 3D target image successfully match the detection frames, the output module outputs the corresponding 3D tracking tracks. Missed detection and false detection of the target object can thereby be effectively avoided, greatly improving the reliability of the system.
The invention further provides a computer device corresponding to the embodiment.
The computer device of the embodiment of the invention comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, and when the processor executes the program, the target tracking method of the embodiment is realized.
According to the computer device of the embodiment of the invention, missed detection and false detection of the target object can be effectively avoided, thereby greatly improving the reliability of the system.
The invention also provides a non-transitory computer readable storage medium corresponding to the above embodiment.
A non-transitory computer-readable storage medium of an embodiment of the present invention has stored thereon a computer program that, when executed by a processor, implements the above-described target tracking method.
According to the non-transitory computer-readable storage medium of the embodiment of the invention, missed detection and false detection of the target object can be effectively avoided, thereby greatly improving the reliability of the system.
The present invention also provides a computer program product corresponding to the above embodiments.
The instructions in the computer program product, when executed by a processor, may perform the target tracking method of the above-described embodiments.
According to the computer program product of the embodiment of the invention, missed detection and false detection of the target object can be effectively avoided, thereby greatly improving the reliability of the system.
In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. The meaning of "plurality" is two or more unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, the first feature "on" or "under" the second feature may be directly contacting the first and second features or indirectly contacting the first and second features through an intermediate. Also, a first feature "on," "over," and "above" a second feature may be directly or diagonally above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the first feature, or may simply mean that the first feature is at a lesser elevation than the second feature.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.

Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (10)
1. A target tracking method, comprising the steps of:
performing target detection on the current frame 3D target image to obtain detection frames of N target objects in the current frame 3D target image, wherein N is a positive integer greater than or equal to 1;
acquiring tracking frames of N target objects in the current frame 3D target image;
judging whether the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched;
if the tracking frames of m target objects in the current frame 3D target image fail to be matched, calculating a homography matrix between the tracking frames of the m target objects in the current frame 3D target image and the detection frames of the m target objects in the corresponding previous frame 3D target image, wherein m is a positive integer which is greater than or equal to 1 and less than or equal to N;
acquiring detection frames of the m target objects in the current frame 3D target image according to the homography matrix, and judging whether tracking frames of the m target objects in the current frame 3D target image are matched with the detection frames;
and if the tracking frames of the m target objects in the current frame 3D target image are successfully matched with the detection frames, outputting corresponding 3D tracking tracks.
2. The target tracking method according to claim 1, wherein acquiring the tracking frames of the N target objects in the current frame 3D target image comprises:
acquiring the tracking frames of the N target objects in the previous frame 3D target image;
and predicting the tracking frames of the N target objects in the current frame 3D target image by adopting a target tracking algorithm according to the tracking frames of the N target objects in the previous frame 3D target image.
3. The target tracking method of claim 1, further comprising:
if matching fails for the detection frames of n target objects in the current frame 3D target image, establishing a new 3D tracking track, and recording the categories of the detection frames of the n target objects in the current frame 3D target image in a category attribute list of the new 3D tracking track, wherein n is a positive integer greater than or equal to 1 and less than or equal to N.
4. The target tracking method of any one of claims 1-3, further comprising:
acquiring detection frames and tracking frames of the N target objects in the current frame 2D target image;
judging whether the detection frames and the tracking frames of the N target objects in the current frame 2D target image are successfully matched;
and if the matching is successful, outputting a corresponding 2D tracking track, and combining the 2D tracking track and the 3D tracking track.
5. The target tracking method of claim 4, further comprising:
if the matching of the tracking frames of k target objects in the current frame 2D target image fails, predicting the tracking frames of the k target objects in the next frame 2D target image according to the tracking frames of the k target objects in the current frame 2D target image, wherein k is a positive integer which is greater than or equal to 1 and less than or equal to N;
acquiring detection frames of the k target objects in the next frame of 2D target image;
judging whether the detection frames and the tracking frames of the k target objects in the next frame of 2D target image are successfully matched;
and if the matching is successful, outputting a corresponding 2D tracking track, and combining the 2D tracking track and the 3D tracking track.
6. The target tracking method of claim 4, further comprising:
if matching fails for the detection frames of p target objects in the current frame 2D target image, establishing a new 2D tracking track, and recording the categories of the detection frames of the p target objects in the current frame 2D target image in a category attribute list of the new 2D tracking track, wherein p is a positive integer greater than or equal to 1 and less than or equal to N.
7. A target tracking device, comprising:
the first acquisition module is used for carrying out target detection on the current frame 3D target image so as to acquire detection frames of N target objects in the current frame 3D target image, wherein N is a positive integer greater than or equal to 1;
a second acquisition module, configured to acquire tracking frames of N target objects in the current frame 3D target image;
the first judgment module is used for judging whether the detection frames and the tracking frames of the N target objects in the current frame 3D target image are successfully matched;
a calculation module, configured to calculate a homography matrix between tracking frames of m target objects in the current frame 3D target image and detection frames of the m target objects in a corresponding previous frame 3D target image when matching of the tracking frames of the m target objects in the current frame 3D target image fails, wherein m is a positive integer greater than or equal to 1 and less than or equal to N;
a second judgment module, configured to acquire, according to the homography matrix, detection frames of the m target objects in the current frame 3D target image, and to judge whether the tracking frames of the m target objects in the current frame 3D target image match the detection frames;
and an output module, configured to output corresponding 3D tracking tracks when the tracking frames of the m target objects in the current frame 3D target image are successfully matched with the detection frames.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the target tracking method according to any one of claims 1-6.
9. A non-transitory computer-readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the target tracking method according to any one of claims 1-6.
10. A computer program product, characterized in that instructions in the computer program product, when executed by a processor, perform the target tracking method according to any one of claims 1-6.