CN111512317A - Multi-target real-time tracking method and device and electronic equipment


Info

Publication number
CN111512317A
Authority
CN
China
Prior art keywords
information
target
current
matching
targets
Prior art date
2018-10-24
Legal status
Granted
Application number
CN201880083620.2A
Other languages
Chinese (zh)
Other versions
CN111512317B (en)
Inventor
孟勇 (Meng Yong)
牛昕宇 (Niu Xinyu)
蔡权雄 (Cai Quanxiong)
Current Assignee
Shenzhen Corerain Technologies Co Ltd
Original Assignee
Shenzhen Corerain Technologies Co Ltd
Priority date
2018-10-24
Filing date
2018-10-24
Publication date
2020-08-07
Application filed by Shenzhen Corerain Technologies Co Ltd filed Critical Shenzhen Corerain Technologies Co Ltd
Publication of CN111512317A
Application granted
Publication of CN111512317B
Legal status
Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition

Abstract

A multi-target real-time tracking method, a multi-target real-time tracking device, and electronic equipment are provided, wherein the method comprises the following steps: acquiring image information (101), wherein the image information comprises current frame information and previous frame information of a plurality of targets; performing primary matching on the current frame information and the previous frame information of the targets, and judging whether at least one target among the targets is successfully matched for the first time (102); if the at least one target is not successfully matched for the first time, performing secondary matching, and judging whether the at least one target is successfully matched for the second time (103); and forming output information (104) of the at least one target for which the primary matching and/or the secondary matching succeeded, wherein the output information comprises current existence information and identification information. By matching at least one target twice within one frame of image, tracking accuracy can be improved.

Description

Multi-target real-time tracking method and device and electronic equipment

Technical Field
The invention relates to the field of software development, in particular to a multi-target real-time tracking method, a multi-target real-time tracking device and electronic equipment.
Background
Tracking determines the motion trajectory of a target (an object or a person). Current single-target tracking algorithms, such as those based on kernelized correlation filters (KCF), can perform real-time single-target tracking on low-power devices. On terminal devices, however, power-consumption constraints prevent current tracking algorithms from running multi-target tracking in real time. In particular, when the total number of targets exceeds 10, combining multiple single-target tracking algorithms incurs a large amount of computation and high data-processing delay, so the tracking accuracy obtained with this approach is low.
Disclosure of Invention
In view of the above defects in the prior art, the invention aims to provide a multi-target real-time tracking method, a multi-target real-time tracking device, and electronic equipment that solve the problem of low tracking accuracy.
The purpose of the invention is realized by the following technical scheme:
in a first aspect, a multi-target real-time tracking method is provided, where the method includes:
acquiring image information, wherein the image information comprises current frame information and previous frame information of a plurality of targets;
performing primary matching according to the current frame information and the previous frame information of the plurality of targets, and judging whether at least one target in the plurality of targets is successfully matched for one time;
if the at least one target is not successfully matched for the first time, performing secondary matching, and judging whether the at least one target is successfully matched for the second time, wherein the secondary matching comprises at least one of feature matching and distance matching;
and forming output information by using the information of the at least one target successfully matched for the first time and/or successfully matched for the second time, wherein the output information comprises current existence information and identification information.
Optionally, after performing the secondary matching if the at least one target is not successfully matched for the first time and judging whether the at least one target is successfully matched for the second time, the method further includes:
and if the secondary matching is unsuccessful, regenerating the current frame information of the remaining targets, and acquiring new image information, wherein the new image information comprises the current frame information and the next frame information.
Optionally, the current frame includes current detection information of the multiple targets, and the previous frame includes historical presence information of the multiple targets and corresponding identification information;
the step of performing a primary matching between the current frame information and the previous frame information of the plurality of targets and determining whether at least one target of the plurality of targets is successfully matched at a time includes:
calculating the overlapping degree of the current detection information of at least one target of the targets and the historical existence information of at least one target of the targets to obtain the overlapping degree of the current detection information of at least one target and the historical existence information of at least one target;
and judging whether the at least one target is successfully matched once according to the overlapping degree.
Optionally, the determining whether the at least one target is successfully matched according to the overlapping degree includes:
selecting the maximum overlapping degree to compare with a preset overlapping degree threshold value, and judging whether the maximum overlapping degree is greater than the overlapping degree threshold value;
if the maximum overlapping degree is larger than the overlapping degree threshold value, the primary matching is successful, and if the maximum overlapping degree is smaller than the overlapping degree threshold value, the primary matching is unsuccessful.
Optionally, the current frame includes current detection information of the multiple targets, and the previous frame includes historical presence information of the multiple targets and corresponding identification information;
the performing the secondary matching and determining whether the at least one target is successfully matched for the second time includes:
extracting a current feature vector of current detection information of at least one target of the multiple targets, extracting a historical feature vector of historical existence information of at least one target of the multiple targets, and calculating the current feature vector and the historical feature vector to obtain cosine similarity of the at least one target;
extracting the current coordinate of the current detection information of the at least one target and the historical coordinate of the historical existence information of the at least one target, and calculating the current coordinate and the historical coordinate of the at least one target to obtain a distance value of the at least one target;
and judging whether the at least one target is successfully matched for the second time or not according to the cosine similarity and the distance value of the at least one target.
Optionally, the judging whether the at least one target is successfully matched for the second time according to the cosine similarity and the distance value of the at least one target includes:
comparing the cosine similarity with a preset cosine similarity threshold, and comparing the distance value with a preset distance threshold to obtain a comparison result;
and judging whether the at least one target is successfully matched for the second time according to the comparison result and a preset judgment rule.
Optionally, the forming output information of the at least one target with successful primary matching and/or successful secondary matching includes:
updating the current detection information of the at least one target to the current existence information, and associating the corresponding identification information of the at least one target to the current detection information;
and forming output information of the at least one target according to the current existence information and the corresponding identification information.
In a second aspect, a multi-target real-time tracking apparatus is provided, the apparatus comprising:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring image information, and the image information comprises current frame information and previous frame information of a plurality of targets;
the first matching module is used for matching the current frame information of the plurality of targets with the previous frame information once and judging whether at least one target in the plurality of targets is successfully matched once;
the second matching module is used for carrying out secondary matching if the at least one target is not successfully matched for the first time, and judging whether the at least one target is successfully matched for the second time, wherein the secondary matching comprises at least one of feature matching and distance matching;
and the output module is used for forming output information from the information of the at least one target which is successfully matched for the first time and/or successfully matched for the second time, wherein the output information comprises current existence information and identification information.
In a third aspect, an electronic device is provided, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the multi-target real-time tracking method provided by the embodiments of the present invention.
In a fourth aspect, a computer-readable storage medium is provided, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps in the multi-target real-time tracking method provided in the embodiments of the present invention.
The invention has the following beneficial effects: image information is acquired, wherein the image information comprises current frame information and previous frame information of a plurality of targets; primary matching is performed according to the current frame information and the previous frame information of the plurality of targets, and it is judged whether at least one target among the plurality of targets is successfully matched for the first time; if the at least one target is not successfully matched for the first time, secondary matching is performed, and it is judged whether the at least one target is successfully matched for the second time; and output information of the at least one target for which the primary matching and/or the secondary matching succeeded is formed, wherein the output information comprises current existence information and identification information. By matching at least one target twice within one frame of image, tracking accuracy can be improved.
Drawings
Fig. 1 is a schematic flow chart of a multi-target real-time tracking method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of current frame information in accordance with an embodiment of the present invention;
FIG. 3 is a diagram illustrating image information according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of another multi-target real-time tracking method according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart of another multi-target real-time tracking method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a multi-target real-time tracking apparatus according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of another multi-target real-time tracking apparatus according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of another multi-target real-time tracking apparatus according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of another multi-target real-time tracking apparatus according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of another multi-target real-time tracking apparatus according to an embodiment of the present invention;
fig. 11 is a schematic diagram of another multi-target real-time tracking apparatus according to an embodiment of the present invention.
Detailed Description
The following describes preferred embodiments of the present invention. On the basis of this description, those skilled in the art will be able to implement the invention using the related art and will more clearly understand the innovative features and advantages brought by the present invention.
The invention provides a multi-target real-time tracking method and device and electronic equipment.
in a first aspect, please refer to fig. 1, fig. 1 is a schematic flowchart of a multi-target real-time tracking method according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following steps:
101. acquiring image information, wherein the image information comprises current frame information and previous frame information of a plurality of targets.
In this step, the image information may be image information of a video frame captured by a camera. The image information may be identified by the time of the video frame: for example, if a frame image is acquired at 15.6789 seconds into the video, the frame image may be identified as 15S6789. Alternatively, the frame's sequence number among all video frames may be used: for example, if the frame is the 14567th frame of the video, the frame image may be identified as 14567. Of course, the embodiment of the present invention is not limited to these two identification manners; other identification manners may also be used, such as identification with a date timestamp or sequential identification with a camera number.
The current frame information includes the feature coordinate values, feature range values, and confidence values of the multiple targets in the image. The previous frame information includes the identifiers, feature coordinate values, feature range values, and confidence values of the multiple targets. The feature coordinate values and feature range values may be measured in pixels or in actual dimensions, which is not limited in the embodiments of the present invention. As shown in fig. 2, the current frame information may be obtained by performing real-time detection on the multiple targets in the original image of the current frame. When information of a target is detected in the image, a current feature frame is used to represent the target information, that is, the target, and the center coordinate information, length and width (width and height) information, and confidence of the current feature frame are obtained. The confidence is a measure of how certain the existence of the target is: the higher the confidence, the higher the possibility that the target exists and the more credible the current feature frame. The confidence may be obtained when the image is detected in real time. The feature range value includes the area occupied by the feature image within the feature frame.
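By way of illustration only, the per-target information described above might be represented as follows; this is a minimal sketch, and the Python class and field names (FeatureBox, cx, cy, ident, and so on) are assumptions introduced here, not terms from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FeatureBox:
    """One detected or tracked target in a frame (hypothetical layout)."""
    cx: float            # center x coordinate (pixels or actual units)
    cy: float            # center y coordinate
    w: float             # width of the feature frame
    h: float             # height of the feature frame
    confidence: float    # detection confidence in [0, 1]
    ident: Optional[str] = None  # identifier (ID); None until assigned

    @property
    def area(self) -> float:
        # feature range value: area occupied by the feature frame
        return self.w * self.h
```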
The previous frame information may be the identifiers, feature coordinate values, feature range values, confidence values, and other information of the multiple targets in the previous frame image. It should be noted that the identifiers of the targets in the current frame differ from, and do not overlap with, the identifiers of the targets in the previous frame. For example, if the identifiers of the targets in the current frame are A, B, C, and D, the identifiers of the targets in the previous frame may be A', B', C', and D', where A and A' may be different targets. As shown in fig. 3, the previous frame information may be obtained by real-time detection of the multiple targets in the original image of the previous frame, or in the processed image of the previous frame. The target information in the previous frame information is represented by a previous feature frame that is drawn differently from the feature frames in the current frame information; for example, the current feature frame may be a solid-line frame and the previous feature frame a dashed-line frame. Of course, the target information may also be distinguished by the identifiers of the feature frames, for example by associating the identifier of each target with its feature frame and configuring two different identifier schemes for the feature frames in the current frame information and those in the previous frame information.
Specifically, optionally, the image information includes information such as a current frame original image, a current feature frame, a previous feature frame, and the like, the current feature frame includes a current identifier, current center coordinate information, current length and width (width and height) information, and an occurrence confidence, and the previous feature frame includes a previous identifier, previous center coordinate information, and previous length and width (width and height) information.
It should be noted that the feature frame may also be referred to as a target frame, the identifier may also be referred to as an ID, the current feature frame may also be referred to as a detection frame, the previous feature frame may also be referred to as a tracking frame, the feature range value is an area value occupied by the feature image in the feature frame, the real-time detection may be performed in a tracker, or may be obtained through a tracking algorithm, and as the tracker and the tracking algorithm are known by those skilled in the art, details are not described herein.
102. And matching the current frame information of the targets with the previous frame information once, and judging whether at least one target in the targets is successfully matched once.
The current frame information comprises current feature frames of a plurality of targets or current feature vectors of the plurality of targets, the previous frame information comprises previous feature frames of the plurality of targets or previous feature vectors of the plurality of targets, and at least one target in the plurality of targets is matched by using a deep learning target detection algorithm.
In this embodiment of the present invention, the current frame information in step 102 includes multiple current feature frames corresponding to the multiple targets, and the previous frame information includes multiple previous feature frames corresponding to those targets. The information obtained by real-time detection is output to form a set of image information containing the multiple current feature frames and the multiple previous feature frames. Each previous feature frame is then placed among the current feature frames for matching: the overlapping area between the previous feature frame and each current feature frame is calculated, the overlapping degree (i.e., the Intersection-over-Union ratio) is computed from the overlapping area, and the current feature frame with the largest overlapping degree is selected to form a group with the previous feature frame, as shown in fig. 3. The maximum overlapping degree of each previous feature frame is compared with a preset overlapping degree threshold: if it meets the threshold, the pair is marked as successfully matched for the first time; otherwise it is marked as unsuccessfully matched, and the method proceeds to step 103 for secondary matching.
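A minimal sketch of this primary matching step, reusing the hypothetical FeatureBox above, might look like the following; the 0.6 default threshold merely echoes the example given later in this description and is not mandated by the patent.

```python
from typing import List, Optional

def iou(a: FeatureBox, b: FeatureBox) -> float:
    """Intersection-over-Union of two center/width/height boxes."""
    ax1, ay1 = a.cx - a.w / 2, a.cy - a.h / 2
    ax2, ay2 = a.cx + a.w / 2, a.cy + a.h / 2
    bx1, by1 = b.cx - b.w / 2, b.cy - b.h / 2
    bx2, by2 = b.cx + b.w / 2, b.cy + b.h / 2
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a.area + b.area - inter
    return inter / union if union > 0 else 0.0

def primary_match(prev: FeatureBox, current: List[FeatureBox],
                  iou_threshold: float = 0.6) -> Optional[FeatureBox]:
    """Group the previous feature frame with the current feature frame of
    maximum overlap; succeed only if that overlap exceeds the threshold."""
    if not current:
        return None
    best = max(current, key=lambda c: iou(prev, c))
    return best if iou(prev, best) > iou_threshold else None
```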
In some possible embodiments, the similarity of the feature frames may also be compared, and a current feature frame corresponding to the feature frame of the previous frame is matched. The similarity includes: area similarity, length-width (width-height) similarity, etc.
The primary matching may also be referred to as first matching, primary tracking, or the like, or may be directly referred to as tracking.
103. And if the at least one target is not successfully matched for the first time, performing secondary matching, and judging whether the at least one target is successfully matched for the second time, wherein the secondary matching comprises at least one of feature matching and distance matching.
In this step, secondary matching is performed on the targets that were not successfully matched in step 102; the secondary matching includes at least one of feature matching and distance matching. Feature matching comprises obtaining the current feature vector of a current feature frame, obtaining the previous feature vector of a previous feature frame, and calculating the similarity between the two vectors. Distance matching comprises obtaining the distance value between a previous feature frame and a current feature frame. A previous feature frame whose similarity with the current feature frame is greater than a preset similarity threshold, or whose distance value is less than a preset distance threshold, is selected for secondary matching with that current feature frame.
The secondary matching may also be referred to as re-matching, secondary tracking, or the like.
104. And forming output information by using the information of the at least one target successfully matched for the first time and/or successfully matched for the second time, wherein the output information comprises current existence information and identification information.
The current existence information includes the current feature frame. For example, if current feature frame A is successfully matched with previous feature frame A', the information of current feature frame A is output; this information includes the center coordinate information, the length and width (width and height) information, and the like. The identification information includes the identifier of the current feature frame, which is used to represent that frame; for example, if the identifier of the current feature frame is A, A is output, and the identifier is associated with the current feature frame.
In step 102, the information of targets that are successfully matched may be placed in an active set, and the information of targets that are not successfully matched may be placed in a lost set. After the targets in the lost set undergo secondary matching in step 103, the information of any target that is then successfully matched may be added to the active set and output.
The successfully matched targets are updated to obtain the current existence information and identification information. For each pair of successfully matched current and previous feature frames, the identifier of the previous feature frame is assigned to the current feature frame, unifying the two identifiers so that they represent the same target, that is, the target is tracked successfully. The successfully matched previous feature frame is then deleted, so that only the current feature frame's information remains in the active set, forming the current existence information and identification information of the target. For example: A is the identifier of a current feature frame, A' is the identifier of a previous feature frame, and A and A' are a successfully matched pair. The identifier of the current feature frame is changed from A to A', the previous feature frame is deleted from the image information, and current feature frame A' is recorded into the active set. The output information is then the identifier A' of the current feature frame together with its center coordinate information, length and width (width and height) information, and the like.
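The identifier hand-off just described can be sketched in the same hypothetical representation; adopt_identity is an illustrative name introduced here, not from the patent.

```python
from typing import List

def adopt_identity(current: FeatureBox, previous: FeatureBox,
                   active_set: List[FeatureBox]) -> None:
    """Hand the previous feature frame's identifier to the matched current
    feature frame and keep only the current frame (step 104)."""
    current.ident = previous.ident   # e.g. identifier A is relabeled A'
    active_set.append(current)       # the previous frame is discarded
```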
In the embodiment of the invention, image information is obtained, wherein the image information comprises current frame information and previous frame information of a plurality of targets; performing primary matching according to the current frame information and the previous frame information of the plurality of targets, and judging whether at least one target in the plurality of targets is successfully matched for one time; if the at least one target is not successfully matched for the first time, performing secondary matching, and judging whether the at least one target is successfully matched for the second time; and forming output information of the at least one target with successful primary matching and/or successful secondary matching, wherein the output information comprises current existence information and identification information. The efficiency of tracking can be increased by processing at least one target in one frame of image simultaneously.
It should be noted that the multi-target real-time tracking method provided by the embodiment of the present invention may be applied to a multi-target real-time tracking device, for example: a computer, a server, a mobile phone, or another device capable of performing multi-target real-time tracking.
Referring to fig. 4, fig. 4 is a schematic flow chart of another multi-target real-time tracking method according to an embodiment of the present invention, as shown in fig. 4, the method includes the following steps:
201. acquiring image information, wherein the image information comprises current frame information and previous frame information of a plurality of targets;
202. performing primary matching according to the current frame information and the previous frame information of the multiple targets, and judging whether at least one target in the multiple targets is successfully matched for one time;
203. if the at least one target is not successfully matched for the first time, performing secondary matching, and judging whether the at least one target is successfully matched for the second time, wherein the secondary matching comprises at least one of feature matching and distance matching;
204. forming output information by the information of the at least one target successfully matched for the first time and/or successfully matched for the second time, wherein the output information comprises current existing information and identification information;
205. and if the secondary matching is unsuccessful, regenerating the current frame information of the remaining targets, and acquiring new image information, wherein the new image information comprises the current frame information and the next frame information.
In step 202, the information of targets that are successfully matched may be placed in an active set, and the information of targets that are not successfully matched may be placed in a lost set. After the targets in the lost set undergo secondary matching in step 203, the information of any target that is then successfully matched may be added to the active set, and the target information in the active set is output; if the matching is unsuccessful, the process proceeds to step 205.
In step 205, for the current feature frames and previous feature frames that have not been successfully matched even by the secondary matching, the feature frame of the target is regenerated: the regenerated feature frame is recorded into the active set, and corresponding identification information is regenerated for it. For example: assume the active set contains two elements, current feature frame A' and current feature frame D'; B is a current feature frame that was not successfully matched; B' is a previous feature frame that was successfully matched; C is the current feature frame successfully matched with B'; and C' is a previous feature frame that was not successfully matched. The identifier C of the successfully matched current feature frame is changed to B' and recorded into the active set, yielding current feature frame B', so that the active set contains three elements: current feature frames A', B', and D'. The previous feature frame B' is then deleted. Next, the identifier B of the unmatched current feature frame is regenerated as E', E' is recorded into the active set, and the unmatched previous feature frame C' is discarded. The active set then contains four elements, current feature frames A', B', D', and E', from which the current frame information is obtained.
Acquiring new image information includes acquiring a new original image, performing real-time detection on it to obtain the next frame information, and adding the current feature frames in the active set to the new image information. A new tracking pass is then performed, and the process repeats in a loop to obtain all tracking results, as shown in fig. 5. In addition, steps 201 to 205 can be executed in a loop so that multiple targets are tracked continuously.
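Putting the pieces together, one possible shape for the per-frame loop of steps 201 to 205 is sketched below; primary_match is from the earlier sketch, while secondary_match stands in for the feature/distance matching of step 203 and is a hypothetical name, as is the "T0, T1, ..." identifier scheme.

```python
import itertools

_new_ids = (f"T{n}" for n in itertools.count())  # fresh illustrative IDs

def track_frame(active_set, detections):
    """One tracking pass: primary IoU matching, secondary matching on the
    lost set, then ID regeneration for leftover detections (step 205)."""
    lost, matched = [], []
    unmatched = list(detections)
    for prev in active_set:
        best = primary_match(prev, unmatched)    # primary matching
        if best is None:
            lost.append(prev)                    # goes to the lost set
            continue
        best.ident = prev.ident                  # keep the old identifier
        unmatched.remove(best)
        matched.append(best)
    for prev in lost:
        cand = secondary_match(prev, unmatched)  # assumed feature + distance
        if cand is not None:
            cand.ident = prev.ident
            unmatched.remove(cand)
            matched.append(cand)
    for det in unmatched:                        # brand-new targets
        det.ident = next(_new_ids)
        matched.append(det)
    return matched                               # the new active set
```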
It should be noted that step 205 is optional; in some embodiments, output information only needs to be formed from the information of the at least one target for which the primary matching and/or the secondary matching succeeded.
Optionally, the current frame includes current detection information of the multiple targets, and the previous frame includes historical presence information of the multiple targets and corresponding identification information;
the step of performing a primary matching between the current frame information and the previous frame information of the plurality of targets and determining whether at least one target of the plurality of targets is successfully matched at a time includes:
calculating the overlapping degree of the current detection information of at least one target of the targets and the historical existence information of at least one target of the targets to obtain the overlapping degree of the current detection information of at least one target and the historical existence information of at least one target;
and judging whether the at least one target is successfully matched once according to the overlapping degree.
The current detection information includes the current feature frame information obtained by real-time detection of the current original image and the generated identifier of each current feature frame. The current feature frame information includes center coordinate information, length and width (width and height) information, confidence, and the like; the identifier of a current feature frame may be any unique identifier, such as a unique numeric or letter identifier. The historical presence information may be the feature frame information existing in the previous frame image, and the corresponding identification information is the unique identifier of a feature frame existing in the previous frame image, that is, the unique identifier of a previous feature frame. The overlapping degree may be the overlapping degree of the current feature frame and the previous feature frame, and includes the coordinate overlapping degree in length and width (width and height), the area overlapping degree, and the like.
In some possible embodiments, the overlapping degree may also be the similarity of the feature vectors or the similarity of the feature frames.
When the overlapping degree or the similarity is larger than a preset threshold, it can be judged that the at least one target is successfully matched for one time, and when the overlapping degree or the similarity is smaller than the preset threshold, it can be judged that the at least one target is unsuccessfully matched for one time.
Optionally, the determining whether the at least one target is successfully matched according to the overlapping degree includes:
selecting the maximum overlapping degree to compare with a preset overlapping degree threshold value, and judging whether the maximum overlapping degree is greater than the overlapping degree threshold value;
if the maximum overlapping degree is larger than the overlapping degree threshold value, the primary matching is successful, and if the maximum overlapping degree is smaller than the overlapping degree threshold value, the primary matching is unsuccessful.
For each previous feature frame, the previous feature frame is placed among the multiple current feature frames for matching: the overlapping area between the previous feature frame and each current feature frame is calculated, the overlapping degree (Intersection-over-Union, IoU, also called the intersection ratio) is calculated from the overlapping area, and the current feature frame having the maximum overlapping degree with the previous feature frame is selected to form a group with it. The maximum overlapping degree of the previous feature frame is then compared with the preset overlapping degree threshold. For example: A' and B' are identifiers of previous feature frames, and A and B are identifiers of current feature frames. The overlapping degree of A' and A is 0.4 and that of A' and B is 0.8, so A' and B have the maximum overlapping degree and are recorded as a group; the overlapping degree of B' and A is 0.4 and that of B' and B is 0.5, so B' and B are recorded as a group. If the overlapping degree threshold is 0.6, the matching of A' and B (0.8) meets the threshold and is recorded as a successful primary matching, while the overlapping degree of B' and B (0.5) is less than 0.6, so the matching of B' and B is unsuccessful. Any previous feature frame whose maximum overlapping degree does not meet the threshold is recorded as unsuccessfully matched once and undergoes secondary matching in step 203.
Optionally, the current frame includes current detection information of the multiple targets, and the previous frame includes historical presence information of the multiple targets and corresponding identification information;
the performing the secondary matching and determining whether the at least one target is successfully matched for the second time includes:
extracting a current feature vector of current detection information of at least one target of the multiple targets, extracting a historical feature vector of historical existence information of at least one target of the multiple targets, and calculating the current feature vector and the historical feature vector to obtain cosine similarity of the at least one target;
extracting the current coordinate of the current detection information of the at least one target and the historical coordinate of the historical existence information of the at least one target, and calculating the current coordinate and the historical coordinate of the at least one target to obtain a distance value of the at least one target;
and judging whether the at least one target is successfully matched for the second time or not according to the cosine similarity and the distance value of the at least one target.
The current detection information comprises current feature frame information and an identifier, and the historical presence information comprises previous feature frame information and an identifier. The current feature vector may be obtained by extracting the Histogram of Oriented Gradients (HOG) of the current feature frame, yielding a current HOG feature vector; similarly, the previous feature vector may be obtained by extracting the HOG of the previous feature frame, yielding a previous HOG feature vector. The cosine similarity between the current HOG feature vector and the previous HOG feature vector is then obtained by calculation.
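For illustration, and assuming the image patches inside the two feature frames have already been cropped to grayscale arrays of equal shape, the HOG extraction and cosine similarity could be sketched with scikit-image and NumPy as follows; the HOG cell and block parameters are illustrative defaults, not values specified by the patent.

```python
import numpy as np
from skimage.feature import hog

def hog_vector(patch: np.ndarray) -> np.ndarray:
    """Histogram of Oriented Gradients descriptor of a grayscale patch."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two feature vectors."""
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(np.dot(u, v) / denom) if denom > 0 else 0.0

# similarity between the current and previous feature frame contents:
# sim = cosine_similarity(hog_vector(cur_patch), hog_vector(prev_patch))
```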
The current feature box comprises current central coordinate information and current length and width (width and height) information, the previous feature box comprises previous central coordinate information and previous length and width (width and height) information, and the distance value between the current feature box and the previous feature box can be calculated by the following formula:
D = sqrt((x1 - x2)² + (y1 - y2)²) / min(w1, w2)
wherein D is the distance value between the current feature frame and the previous feature frame; x1, y1, and w1 belong to the current feature frame, x1 and y1 being its center coordinates and w1 its width; and x2, y2, and w2 belong to the previous feature frame, x2 and y2 being its center coordinates and w2 its width.
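A direct transcription of this formula, reusing the hypothetical FeatureBox fields from the sketches above:

```python
import math

def normalized_distance(cur: FeatureBox, prev: FeatureBox) -> float:
    """Center distance normalized by the smaller of the two widths,
    per D = sqrt((x1-x2)^2 + (y1-y2)^2) / min(w1, w2)."""
    d = math.hypot(cur.cx - prev.cx, cur.cy - prev.cy)
    return d / min(cur.w, prev.w)
```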
Optionally, the judging whether the at least one target is successfully matched for the second time according to the cosine similarity and the distance value of the at least one target includes:
comparing the cosine similarity with a preset cosine similarity threshold, and comparing the distance value with a preset distance threshold to obtain a comparison result;
and judging whether the at least one target is successfully matched for the second time according to the comparison result and a preset judgment rule.
The judgment rule includes: when the cosine similarity is greater than the preset cosine similarity threshold, the second matching can be considered to be successful. In addition, when the distance value is smaller than the set distance threshold, the second matching may be considered to be successful. Of course, when the cosine similarity is greater than the preset cosine similarity threshold and the distance value is less than the set distance threshold, the second matching may be considered to be successful.
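One possible sketch of such a judgment rule follows; the threshold defaults are illustrative placeholders only, and the require_both flag selects between the "either condition" and "both conditions" variants described above.

```python
def secondary_success(sim: float, dist: float,
                      sim_threshold: float = 0.8,
                      dist_threshold: float = 1.0,
                      require_both: bool = False) -> bool:
    """Apply the preset judgment rule: cosine similarity above its threshold,
    distance below its threshold, or (optionally) both at once."""
    ok_sim = sim > sim_threshold
    ok_dist = dist < dist_threshold
    return (ok_sim and ok_dist) if require_both else (ok_sim or ok_dist)
```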
Optionally, the forming output information of the at least one target with successful primary matching and/or successful secondary matching includes:
updating the current detection information of the at least one target to the current existence information, and associating the corresponding identification information of the at least one target to the current detection information;
and forming output information of the at least one target according to the current existence information and the corresponding identification information.
The current existence information includes the current feature frame. For example, if current feature frame A is successfully matched with previous feature frame A', the information of current feature frame A is output; this information includes the center coordinate information, the length and width (width and height) information, and the like. The identification information includes the identifier of the current feature frame, which is used to represent that frame; for example, if the identifier of the current feature frame is A, A is output, and if it is A', A' is output; the identifier is associated with the current feature frame.
In step 202, the information of targets that are successfully matched may be placed in an active set, and the information of targets that are not successfully matched may be placed in a lost set. After the targets in the lost set undergo secondary matching in step 203, the information of any target that is then successfully matched may be added to the active set and output.
The successfully matched targets are updated to obtain the current existence information and identification information. For each pair of successfully matched current and previous feature frames, the identifier of the previous feature frame is assigned to the current feature frame, unifying the two identifiers so that they represent the same target, that is, the target is tracked successfully. The successfully matched previous feature frame is then deleted, so that only the current feature frame's information remains in the active set, forming the current existence information and identification information of the target. For example: A is the identifier of a current feature frame, A' is the identifier of a previous feature frame, and A and A' are a successfully matched pair. The identifier of the current feature frame is changed from A to A', the previous feature frame is deleted from the image information, and current feature frame A' is recorded into the active set. The output information is then the identifier A' of the current feature frame together with its center coordinate information, length and width (width and height) information, and the like.
In a second aspect, as shown in fig. 6, there is provided a multi-target real-time tracking apparatus, the apparatus comprising:
an obtaining module 401, configured to obtain image information, where the image information includes current frame information and previous frame information of multiple targets;
a first matching module 402, configured to perform a primary matching according to current frame information and previous frame information of the multiple targets, and determine whether at least one of the multiple targets is successfully matched at a time;
a second matching module 403, configured to perform secondary matching if the at least one target is not successfully matched for the first time, and determine whether the at least one target is successfully matched for the second time, where the secondary matching includes at least one of feature matching and distance matching;
an output module 404, configured to form output information from information of the at least one target that is successfully matched for the first time and/or successfully matched for the second time, where the output information includes current existence information and identification information.
Optionally, as shown in fig. 7, after performing a second matching if the at least one target is not successfully matched for the first time and determining whether the at least one target is successfully matched for the second time, the apparatus further includes:
a generating module 405, configured to regenerate current frame information of the remaining targets and acquire new image information if the secondary matching is unsuccessful, where the new image information includes the current frame information and next frame information.
Optionally, as shown in fig. 8, the current frame includes current detection information of the multiple targets, and the previous frame includes historical presence information of the multiple targets and corresponding identification information;
the first matching module 402 comprises:
the first processing unit 4021 is configured to perform overlap calculation on current detection information of at least one of the multiple targets and historical presence information of at least one of the multiple targets to obtain an overlap of the current detection information of the at least one target and the historical presence information of the at least one target;
a first determining unit 4022, configured to determine whether the at least one target is successfully matched according to the overlapping degree.
Optionally, as shown in fig. 9, the first judging unit 4022 includes:
a comparison subunit 40221, configured to select a maximum overlapping degree to compare with a preset overlapping degree threshold, and determine whether the maximum overlapping degree is greater than the overlapping degree threshold;
a determining subunit 40222, configured to determine that one-time matching is successful if the maximum overlapping degree is greater than the overlapping degree threshold, and determine that one-time matching is unsuccessful if the maximum overlapping degree is less than the overlapping degree threshold.
Optionally, as shown in fig. 10, the current frame includes current detection information of the multiple targets, and the previous frame includes historical presence information of the multiple targets and corresponding identification information;
the second matching module 403 includes:
a second processing unit 4031, configured to extract a current feature vector of current detection information of at least one target of the multiple targets, extract a historical feature vector of historical presence information of the at least one target of the multiple targets, and calculate the current feature vector and the historical feature vector to obtain a cosine similarity of the at least one target;
a third processing unit 4032, configured to extract a current coordinate of the current detection information of the at least one target and a historical coordinate of the historical presence information of the at least one target, and calculate the current coordinate and the historical coordinate of the at least one target to obtain a distance value of the at least one target;
a second determining unit 4033, configured to determine whether the at least one target is successfully matched twice according to the cosine similarity and the distance value of the at least one target.
Optionally, as shown in fig. 11, the output module includes:
an updating unit 4041, configured to update the current detection information of the at least one target to the current presence information, and associate the corresponding identification information of the at least one target with the current detection information;
the output unit 4042 forms output information of the at least one target according to the current existence information and the corresponding identification information.
In a third aspect, an embodiment of the present invention provides an electronic device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the multi-target real-time tracking method provided by the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements steps in the multi-target real-time tracking method provided in the embodiment of the present invention.
The foregoing is a more detailed description of the present invention in connection with specific preferred embodiments thereof, and it is not intended that the specific embodiments of the present invention be limited to these descriptions. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (10)

  1. A multi-target real-time tracking method is characterized by comprising the following steps:
    acquiring image information, wherein the image information comprises current frame information and previous frame information of a plurality of targets;
    performing primary matching according to the current frame information and the previous frame information of the plurality of targets, and judging whether at least one target in the plurality of targets is successfully matched for one time;
    if the at least one target is not successfully matched for the first time, performing secondary matching, and judging whether the at least one target is successfully matched for the second time, wherein the secondary matching comprises at least one of feature matching and distance matching;
    and forming output information by using the information of the at least one target successfully matched for the first time and/or successfully matched for the second time, wherein the output information comprises current existence information and identification information.
  2. The method of claim 1, wherein after performing a second matching if the at least one object does not have a successful first matching and determining whether the at least one object has a successful second matching, further comprising:
    and if the secondary matching is unsuccessful, regenerating the current frame information of the remaining targets, and acquiring new image information, wherein the new image information comprises the current frame information and the next frame information.
  3. The method of claim 2, wherein the current frame includes current detection information of the plurality of targets, and the previous frame information includes historical presence information and corresponding identification information of the plurality of targets;
    the step of performing a primary matching between the current frame information and the previous frame information of the plurality of targets and determining whether at least one target of the plurality of targets is successfully matched at a time includes:
    calculating the overlapping degree of the current detection information of at least one target of the targets and the historical existence information of at least one target of the targets to obtain the overlapping degree of the current detection information of at least one target and the historical existence information of at least one target;
    and judging whether the at least one target is successfully matched once according to the overlapping degree.
  4. The method of claim 3, wherein said determining whether the at least one target matches successfully based on the degree of overlap comprises:
    selecting the maximum overlapping degree to compare with a preset overlapping degree threshold value, and judging whether the maximum overlapping degree is greater than the overlapping degree threshold value;
    if the maximum overlapping degree is larger than the overlapping degree threshold value, the primary matching is successful, and if the maximum overlapping degree is smaller than the overlapping degree threshold value, the primary matching is unsuccessful.
  5. The method of claim 2, wherein the current frame includes current detection information of the plurality of targets, and the previous frame information includes historical presence information and corresponding identification information of the plurality of targets;
    the performing the secondary matching and determining whether the at least one target is successfully matched for the second time includes:
    extracting a current feature vector of current detection information of at least one target of the multiple targets, extracting a historical feature vector of historical existence information of at least one target of the multiple targets, and calculating the current feature vector and the historical feature vector to obtain cosine similarity of the at least one target;
    extracting the current coordinate of the current detection information of the at least one target and the historical coordinate of the historical existence information of the at least one target, and calculating the current coordinate and the historical coordinate of the at least one target to obtain a distance value of the at least one target;
    and judging whether the at least one target is successfully matched for the second time or not according to the cosine similarity and the distance value of the at least one target.
  6. The method of claim 5, wherein the judging whether the at least one target is successfully matched for the second time according to the cosine similarity and the distance value of the at least one target comprises:
    comparing the cosine similarity with a preset cosine similarity threshold, and comparing the distance value with a preset distance threshold to obtain a comparison result;
    and judging whether the at least one target is successfully matched for the second time according to the comparison result and a preset judgment rule.
  7. The method of claim 5, wherein the forming output information of the at least one target for which the primary matching is successful and/or the secondary matching is successful comprises:
    updating the current detection information of the at least one target to the current existence information, and associating the corresponding identification information of the at least one target to the current detection information;
    and forming output information of the at least one target according to the current existence information and the corresponding identification information.
  8. A multi-target real-time tracking apparatus, the apparatus comprising:
    the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring image information, and the image information comprises current frame information and previous frame information of a plurality of targets;
    the first matching module is used for matching the current frame information of the plurality of targets with the previous frame information once and judging whether at least one target in the plurality of targets is successfully matched once;
    the second matching module is used for carrying out secondary matching if the at least one target is not successfully matched for the first time, and judging whether the at least one target is successfully matched for the second time, wherein the secondary matching comprises at least one of feature matching and distance matching;
    and the output module is used for forming output information from the information of the at least one target which is successfully matched for the first time and/or successfully matched for the second time, wherein the output information comprises current existence information and identification information.
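Reusing the hypothetical helpers sketched after claims 4 to 7, the fragment below suggests how the four modules of claim 8 could chain per frame: primary (overlap) matching first, then the feature-plus-distance fallback, then output formation. It ignores one-to-one assignment conflicts for brevity and is a sketch under the same assumptions, not the patented implementation.

```python
def track_frame(detections, tracks,
                iou_thr=0.5, cos_thr=0.6, dist_thr=50.0):
    """detections: list of (Box, feature) pairs; tracks: list of Track."""
    outputs = []
    for det_box, det_feat in detections:
        # First matching module: primary matching on overlapping degree.
        idx = primary_match(det_box, [t.presence for t in tracks], iou_thr)
        if idx is None:
            # Second matching module: feature matching plus distance matching.
            for i, t in enumerate(tracks):
                sim = cosine_similarity(det_feat, t.feature)
                dist = centre_distance(det_box, t.presence)
                if secondary_match_ok(sim, dist, cos_thr, dist_thr):
                    idx = i
                    break
        if idx is not None:
            # Output module: current existence info plus identification info.
            outputs.append(form_output(tracks[idx], det_box, det_feat))
    return outputs
```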
  9. An electronic device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the multi-target real-time tracking method according to any one of claims 1 to 7 when executing the computer program.
  10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, implements the steps in the multi-target real-time tracking method according to any one of claims 1 to 7.
CN201880083620.2A 2018-10-24 2018-10-24 Multi-target real-time tracking method and device and electronic equipment Active CN111512317B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/111589 WO2020082258A1 (en) 2018-10-24 2018-10-24 Multi-objective real-time tracking method and apparatus, and electronic device

Publications (2)

Publication Number Publication Date
CN111512317A true CN111512317A (en) 2020-08-07
CN111512317B CN111512317B (en) 2023-06-06

Family

ID=70330247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880083620.2A Active CN111512317B (en) 2018-10-24 2018-10-24 Multi-target real-time tracking method and device and electronic equipment

Country Status (2)

Country Link
CN (1) CN111512317B (en)
WO (1) WO2020082258A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914653B (en) * 2020-07-02 2023-11-07 泰康保险集团股份有限公司 Personnel marking method and device
CN112037256A (en) * 2020-08-17 2020-12-04 中电科新型智慧城市研究院有限公司 Target tracking method and device, terminal equipment and computer readable storage medium
CN112084914B * 2020-08-31 2024-04-26 的卢技术有限公司 Multi-target tracking method integrating spatial motion and appearance feature learning
CN112101223B (en) * 2020-09-16 2024-04-12 阿波罗智联(北京)科技有限公司 Detection method, detection device, detection equipment and computer storage medium
CN112634327A (en) * 2020-12-21 2021-04-09 合肥讯图信息科技有限公司 Tracking method based on YOLOv4 model
CN113238209B (en) * 2021-04-06 2024-01-16 宁波吉利汽车研究开发有限公司 Road perception method, system, equipment and storage medium based on millimeter wave radar
CN113344975A (en) * 2021-06-24 2021-09-03 西安天和防务技术股份有限公司 Multi-target tracking method and device and electronic equipment
CN113361456B (en) * 2021-06-28 2024-05-07 北京影谱科技股份有限公司 Face recognition method and system
CN113723311B (en) * 2021-08-31 2024-09-20 浙江大华技术股份有限公司 Target tracking method
CN114155275A (en) * 2021-11-17 2022-03-08 深圳职业技术学院 IOU-Tracker-based fish tracking method and device
CN115223135B (en) * 2022-04-12 2023-11-21 广州汽车集团股份有限公司 Parking space tracking method and device, vehicle and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106033613B * 2015-03-16 2019-04-30 北京大学 Target tracking method and device
CN104778465B * 2015-05-06 2018-05-15 北京航空航天大学 Target tracking method based on feature point matching
CN106097391B * 2016-06-13 2018-11-16 浙江工商大学 Recognition-assisted multi-object tracking method based on a deep neural network
CN106203274B (en) * 2016-06-29 2020-03-31 长沙慧联智能科技有限公司 Real-time pedestrian detection system and method in video monitoring
CN107316322A * 2017-06-27 2017-11-03 上海智臻智能网络科技股份有限公司 Video tracking method and device, and object recognition method and device
CN108154118B * 2017-12-25 2018-12-18 北京航空航天大学 Target detection system and method based on adaptive combined filtering and multistage detection

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104424638A * 2013-08-27 2015-03-18 深圳市安芯数字发展有限公司 Target tracking method based on occlusion situations
EP2858008A2 (en) * 2013-09-27 2015-04-08 Ricoh Company, Ltd. Target detecting method and system
US20180040094A1 (en) * 2015-04-29 2018-02-08 Baidu Online Network Technology (Beijing) Co., Ltd. Image-based information acquisition method and apparatus
CN108664930A * 2018-05-11 2018-10-16 西安天和防务技术股份有限公司 Intelligent multi-target detection and tracking method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112070802A (en) * 2020-09-02 2020-12-11 合肥英睿系统技术有限公司 Target tracking method, device, equipment and computer readable storage medium
CN112070802B (en) * 2020-09-02 2024-01-26 合肥英睿系统技术有限公司 Target tracking method, device, equipment and computer readable storage medium
CN114185034A (en) * 2020-09-15 2022-03-15 郑州宇通客车股份有限公司 Target tracking method and system for millimeter wave radar

Also Published As

Publication number Publication date
WO2020082258A1 (en) 2020-04-30
CN111512317B (en) 2023-06-06

Similar Documents

Publication Publication Date Title
CN111512317A (en) Multi-target real-time tracking method and device and electronic equipment
US11205276B2 (en) Object tracking method, object tracking device, electronic device and storage medium
US20200116498A1 (en) Visual assisted distance-based slam method and mobile robot using the same
CN108960211A Multi-target human body pose detection method and system
CN106971401B (en) Multi-target tracking device and method
EP2660753B1 (en) Image processing method and apparatus
US20120106784A1 (en) Apparatus and method for tracking object in image processing system
CN109919002B (en) Yellow stop line identification method and device, computer equipment and storage medium
CN111382637B (en) Pedestrian detection tracking method, device, terminal equipment and medium
US11544926B2 (en) Image processing apparatus, method of processing image, and storage medium
CN112597837A (en) Image detection method, apparatus, device, storage medium and computer program product
JP5262705B2 (en) Motion estimation apparatus and program
CN115063454B (en) Multi-target tracking matching method, device, terminal and storage medium
CN112150514A (en) Pedestrian trajectory tracking method, device and equipment of video and storage medium
CN117896626B (en) Method, device, equipment and storage medium for detecting motion trail by multiple cameras
JP5441151B2 (en) Facial image tracking device, facial image tracking method, and program
CN113763466B (en) Loop detection method and device, electronic equipment and storage medium
CN113920158A (en) Training and traffic object tracking method and device of tracking model
CN115115530B (en) Image deblurring method, device, terminal equipment and medium
KR20150137698A (en) Method and apparatus for movement trajectory tracking of moving object on animal farm
CN115131691A (en) Object matching method and device, electronic equipment and computer-readable storage medium
JP2022185872A5 (en)
CN117218162B (en) Panoramic tracking vision control system based on ai
CN117670939B (en) Multi-camera multi-target tracking method and device, storage medium and electronic equipment
CN114219978B (en) Target multi-part association method and device, terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A multi-target real-time tracking method, device, and electronic equipment

Granted publication date: 20230606

Pledgee: Shenzhen hi tech investment small loan Co.,Ltd.

Pledgor: SHENZHEN CORERAIN TECHNOLOGIES Co.,Ltd.

Registration number: Y2024980027493