WO2020082258A1 - Multi-target real-time tracking method and apparatus, and electronic device - Google Patents

Multi-target real-time tracking method and apparatus, and electronic device

Info

Publication number
WO2020082258A1
WO2020082258A1 (PCT/CN2018/111589)
Authority
WO
WIPO (PCT)
Prior art keywords
information
target
current
match
feature
Prior art date
Application number
PCT/CN2018/111589
Other languages
English (en)
Chinese (zh)
Inventor
肖梦秋
Original Assignee
深圳鲲云信息科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳鲲云信息科技有限公司
Priority to CN201880083620.2A (patent CN111512317B)
Priority to PCT/CN2018/111589 (patent WO2020082258A1)
Publication of WO2020082258A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition

Definitions

  • The invention relates to the field of software development, and more specifically to a multi-target real-time tracking method, device, and electronic equipment.
  • Tracking can determine the trajectory of a target (an object or a person).
  • Current single-target tracking algorithms include, for example, those based on kernelized correlation filtering (KCF).
  • Current tracking algorithms have a real-time problem when tracking multiple targets.
  • Combining multiple single-target tracking algorithms requires a large amount of computation and incurs a high data processing delay, so the tracking accuracy obtained with this tracking method is low.
  • In view of the above defects in the prior art, the purpose of the present invention is to provide a multi-target real-time tracking method, device, and electronic equipment that solve the problem of low tracking accuracy.
  • A multi-target real-time tracking method includes:
  • acquiring image information, where the image information includes current frame information and previous frame information of multiple targets;
  • performing a first match based on the current frame information and the previous frame information of the multiple targets, to determine whether at least one of the multiple targets is matched successfully a first time;
  • if the at least one target is not matched successfully the first time, performing a second match to determine whether the at least one target is matched successfully a second time, where the second match includes at least one of feature matching and distance matching;
  • forming the information of the at least one target with a successful first match and/or a successful second match into output information, where the output information includes current presence information and identification information.
  • Performing the second match to determine whether the at least one target is matched successfully a second time further includes:
  • if the second match is unsuccessful, regenerating the current frame information of the remaining targets to obtain new image information, where the new image information includes the current frame information and the next frame information.
  • Optionally, the current frame information includes current detection information of the multiple targets, and the previous frame information includes historical existence information and corresponding identification information of the multiple targets.
  • Matching the current frame information of the multiple targets with the previous frame information a first time, to determine whether at least one of the multiple targets is matched successfully the first time, includes: calculating the degree of overlap between the current detection information of the at least one target and the historical existence information of the at least one target, and judging from the degree of overlap whether the at least one target is matched successfully the first time.
  • Judging from the degree of overlap whether the at least one target is matched successfully includes: selecting the maximum degree of overlap and comparing it with a preset overlap threshold; if the maximum degree of overlap is greater than the threshold, the first match succeeds, and if it is less than the threshold, the first match fails.
  • Optionally, the current frame information includes current detection information of the multiple targets, and the previous frame information includes historical existence information and corresponding identification information of the multiple targets.
  • Performing the second match to determine whether the at least one target is matched successfully a second time includes: extracting the current feature vector of the current detection information of the at least one target and the historical feature vector of the historical existence information of the at least one target and computing their cosine similarity; extracting the current coordinates of the current detection information and the historical coordinates of the historical existence information of the at least one target and computing their distance value; and judging from the feature value and the distance value whether the at least one target is matched successfully the second time.
  • Judging from the feature value and the distance value whether the at least one target is matched successfully the second time includes: comparing the cosine similarity with a preset cosine similarity threshold and/or comparing the distance value with a preset distance threshold.
  • Forming the information of the at least one target with a successful first match and/or a successful second match into output information includes: updating the current detection information of the at least one target to the current presence information, and associating the corresponding identification information of the at least one target with the current detection information;
  • the output information of the at least one target is then formed according to the current presence information and the corresponding identification information.
  • A multi-target real-time tracking device includes:
  • an acquisition module, for acquiring image information, where the image information includes current frame information and previous frame information of multiple targets;
  • a first matching module, configured to perform a first match based on the current frame information and the previous frame information of the multiple targets, and determine whether at least one of the multiple targets is matched successfully a first time;
  • a second matching module, used to perform a second match if the at least one target was not matched successfully the first time, and determine whether the at least one target is matched successfully a second time, where the second match includes at least one of feature matching and distance matching;
  • an output module, configured to form the information of the at least one target with a successful first match and/or a successful second match into output information, where the output information includes current presence information and identification information.
  • An electronic device is provided, including a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the computer program, the steps in the multi-target real-time tracking method provided by the embodiments of the present invention are implemented.
  • A computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the steps in the multi-target real-time tracking method provided by the embodiments of the present invention are implemented.
  • Beneficial effects brought by the present invention: image information is acquired, the image information including current frame information and previous frame information of multiple targets; a first match is performed based on the current frame information and the previous frame information of the multiple targets, to determine whether at least one of the multiple targets is matched successfully a first time; if the at least one target is not matched successfully the first time, a second match is performed to determine whether the at least one target is matched successfully a second time; and the information of the at least one target with a successful first match and/or a successful second match is formed into output information, the output information including current presence information and identification information.
  • In this way, the tracking accuracy can be increased.
  • FIG. 1 is a schematic flowchart of a multi-target real-time tracking method according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of current information according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of image information according to an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of another multi-target real-time tracking method provided by an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of another multi-target real-time tracking method according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a multi-target real-time tracking device provided by an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of another multi-target real-time tracking device provided by an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of another multi-target real-time tracking device provided by an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of another multi-target real-time tracking device provided by an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of another multi-target real-time tracking device provided by an embodiment of the present invention.
  • FIG. 11 is a schematic diagram of another multi-target real-time tracking device provided by an embodiment of the present invention.
  • The invention provides a multi-target real-time tracking method, device, and electronic equipment.
  • FIG. 1 is a schematic flowchart of a multi-target real-time tracking method according to an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
  • The above image information may be image information of video frames collected by a camera, and a frame may be identified according to its time in the video.
  • For example, if a frame is acquired at 15.6789 seconds into the video, the frame may be identified as 15S6789. The identifier may also be the frame's sequence number within the video; for example, if the frame is the 14567th frame of the video, it may be identified as 14567.
  • The embodiments of the present invention are not limited to the above two identification methods; other identification methods may also be used, such as a timestamp with a date or a sequential identifier combined with a camera number. The sketch below illustrates the two schemes described above.
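  • As an illustration only, a minimal sketch of the two identification schemes (the helper names are ours, not the patent's):

```python
def time_id(seconds: float) -> str:
    """Identify a frame by its capture time, e.g. 15.6789 s -> '15S6789'."""
    whole, frac = f"{seconds:.4f}".split(".")
    return f"{whole}S{frac}"

def sequence_id(frame_index: int) -> str:
    """Identify a frame by its sequence number in the video, e.g. 14567 -> '14567'."""
    return str(frame_index)

assert time_id(15.6789) == "15S6789"
assert sequence_id(14567) == "14567"
```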
  • The above current frame information includes the feature coordinate values, feature range values, and confidence values of the multiple targets in the image.
  • The previous frame information includes the identifiers, feature coordinate values, feature range values, and confidence levels of the multiple targets.
  • The feature coordinate value and the feature range value may be measured in pixels or in actual size, which is not specifically limited in the embodiments of the present invention.
  • The current frame information can be obtained by real-time detection of the multiple targets in the original image of the current frame. If the image contains target information, a current feature box is used to represent the target. The center coordinate information, width and height information, and confidence of appearance of the current feature box are obtained. The confidence measures the credibility of the target's existence: the higher it is, the more credible the current feature box. The confidence is produced when the image is detected in real time. A sketch of the per-box record is given below.
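  • A minimal sketch of the per-box record implied by this description (field names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class FeatureBox:
    box_id: str        # unique identifier (ID) of the feature box
    cx: float          # center x coordinate
    cy: float          # center y coordinate
    w: float           # width
    h: float           # height
    confidence: float  # credibility that the target exists, from the detector
```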
  • The above feature range value includes the area occupied by the feature image in the feature box.
  • The above previous frame information may be the identifiers, feature coordinate values, feature range values, confidence values, etc. of the multiple targets in the previous frame image.
  • The identifiers of the multiple targets in the current frame and the identifiers of the multiple targets in the previous frame are different from each other; that is, the identifiers of the multiple targets in the current frame do not overlap with the identifiers of the multiple targets in the previous frame.
  • For example, the identifiers of the multiple targets in the current frame are A, B, C, D, and the identifiers of the multiple targets in the previous frame are A', B', C', D', where A and A' may be different targets.
  • The previous frame information can be obtained by real-time detection of multiple targets in the original image of the previous frame, or by real-time detection of multiple targets in the processed image of the previous frame.
  • The target information in the previous frame information is represented by a previous feature box, which is drawn differently from the feature box in the current frame information.
  • For example, the current feature box is a solid box and the previous feature box is a dotted box.
  • The feature boxes can also be distinguished by identifier, for example by associating the target's identifier with the feature box and configuring two different identifier schemes for the feature box in the current frame information and the feature box in the previous frame information.
  • The above image information includes the original image of the current frame, the current feature box, and the previous feature box.
  • The current feature box includes the current identifier, current center coordinate information, current width and height information, and the confidence of appearance.
  • The previous feature box includes the previous identifier, the previous center coordinate information, and the previous width and height information.
  • The feature box may also be referred to as a target box.
  • The aforementioned identifier may also be referred to as an ID.
  • The aforementioned current feature box may also be referred to as a detection box.
  • The aforementioned previous feature box may also be referred to as a tracking box.
  • The aforementioned feature range value is the area occupied by the feature image in the feature box.
  • The above real-time detection can be performed by a tracker or obtained by a tracking algorithm; trackers and tracking algorithms are known to those skilled in the art and are not repeated here.
  • The current frame information includes the current feature boxes of the multiple targets or the current feature vectors of the multiple targets.
  • The previous frame information includes the previous feature boxes of the multiple targets or the previous feature vectors of the multiple targets.
  • The detection algorithm matches at least one target among the multiple targets.
  • The current frame information in step 102 includes multiple current feature boxes corresponding to the multiple targets, and the previous frame information includes multiple previous feature boxes corresponding to the multiple targets.
  • Real-time detection is performed, and a set of image information including multiple current feature boxes and multiple previous feature boxes is output.
  • Each previous feature box is placed among the multiple current feature boxes for matching: the overlapping area of the previous feature box with each current feature box is calculated, and from the overlapping areas the degree of overlap (Intersection-over-Union, IoU, also known as the intersection-over-union ratio) of the previous feature box with each current feature box is computed. The current feature box with the maximum degree of overlap forms a group with the previous feature box, as shown in FIG. 3.
  • The maximum degree of overlap of the previous feature box is then compared with a preset overlap threshold. If the maximum degree of overlap meets the threshold, the pair is recorded as a successful first match; if it does not, the match is recorded as unsuccessful and the previous feature box enters step 103 for the second match. A sketch of this first matching step is given below.
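  • A minimal sketch of this first matching step under the description above (boxes given as (cx, cy, w, h) tuples; the names iou and first_match are ours, not the patent's):

```python
def iou(a, b):
    """Intersection-over-Union of two boxes given as (cx, cy, w, h)."""
    ax1, ay1 = a[0] - a[2] / 2, a[1] - a[3] / 2
    ax2, ay2 = a[0] + a[2] / 2, a[1] + a[3] / 2
    bx1, by1 = b[0] - b[2] / 2, b[1] - b[3] / 2
    bx2, by2 = b[0] + b[2] / 2, b[1] + b[3] / 2
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))   # width of the overlap
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))   # height of the overlap
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def first_match(prev_boxes, curr_boxes, iou_threshold=0.6):
    """Pair each previous box with the current box of maximum IoU; the pair
    counts as a successful first match only if that IoU meets the threshold."""
    matched, unmatched = [], []
    for p in prev_boxes:
        overlaps = [iou(p, c) for c in curr_boxes]
        best = max(range(len(overlaps)), key=overlaps.__getitem__) if overlaps else None
        if best is not None and overlaps[best] >= iou_threshold:
            matched.append((p, curr_boxes[best]))   # successful first match
        else:
            unmatched.append(p)                     # proceeds to the second match
    return matched, unmatched
```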
  • Alternatively, the similarity of the feature boxes may be compared to match the current feature box corresponding to a previous feature box; similarity includes area similarity, width-height similarity, etc.
  • The first match may also be called one-time matching, first-time matching, one-time tracking, first-time tracking, etc., and may also be directly called tracking.
  • The second match includes at least one of feature matching and distance matching.
  • A target that was not matched successfully in step 102 is subjected to the second match, which includes at least one of feature matching and distance matching.
  • Feature matching includes obtaining the current feature vector of the current feature box and the previous feature vector of the previous feature box, and calculating the similarity between the previous feature vector and the current feature vector.
  • Distance matching includes obtaining the distance value between the previous feature box and the current feature box.
  • The second match may also be referred to as secondary matching, rematching, second tracking, retracking, and so on.
  • The above current presence information includes the current feature box. For example, if the current feature box A and the previous feature box A' match successfully, the information of the current feature box A is output.
  • The information of the current feature box includes the center coordinate information, width and height information, etc. of the current feature box.
  • The above identification information includes the identification information of the current feature box, which is associated with the current feature box and used to indicate it; for example, if the current feature box is A, A is output.
  • In step 102, if a target is matched successfully, the information of the successfully matched target may be placed in an active set, and the information of unsuccessfully matched targets may be placed in a lost set.
  • In step 103, after the targets in the lost set undergo the second match, the successfully matched targets are obtained; their information may be added to the active set, and the information of the targets in the active set is output.
  • For example, A is the identifier of the current feature box, A' is the identifier of the previous feature box, and A and A' are a pair of current and previous feature boxes that matched successfully. The identifier of the current feature box is changed from A to A', the previous feature box is then deleted from the image information, and the current feature box A' is recorded in the active set. The output information is then the identifier A' of the current feature box together with the center coordinate information, width and height information, and other information of the current feature box. A sketch of this identifier handover is given below.
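  • A minimal sketch of this identifier handover, under the example above (boxes as plain dicts; the function name is ours, not the patent's):

```python
def hand_over_ids(matches, active):
    """For each successfully matched (previous, current) pair, the current box
    inherits the previous box's identifier (e.g. A becomes A') and is recorded
    in the active set; the previous box itself is no longer kept."""
    for prev_box, curr_box in matches:
        curr_box["id"] = prev_box["id"]  # identifier changes from A to A'
        active.append(curr_box)          # record the renamed current box
    return active
```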
  • In summary, image information is obtained, the image information including current frame information and previous frame information of multiple targets; a first match is performed based on the current frame information and the previous frame information of the multiple targets, to determine whether at least one of the multiple targets is matched successfully a first time; if the at least one target is not matched successfully the first time, a second match is performed to determine whether the at least one target is matched successfully a second time; and the information of the at least one target with a successful first match and/or a successful second match is formed into output information, the output information including current presence information and identification information.
  • Processing at least one target in a frame of image simultaneously can increase the efficiency of tracking.
  • The multi-target real-time tracking method provided by an embodiment of the present invention can be applied to devices capable of multi-target real-time tracking, such as computers, servers, mobile phones, and other such devices.
  • FIG. 4 is a schematic flowchart of another multi-target real-time tracking method provided by an embodiment of the present invention. As shown in FIG. 4, the method includes the following steps:
  • The second match includes at least one of feature matching and distance matching.
  • In step 202, if a target is matched successfully, the information of the successfully matched target may be placed in an active set, and the information of unsuccessfully matched targets may be placed in a lost set.
  • In step 203, after the targets in the lost set undergo the second match, the successfully matched targets can be obtained; their information can be added to the active set, and the information of the targets in the active set can be output. If the second match is not successful, go to step 205.
  • In step 205, for the current feature boxes and previous feature boxes that were not matched successfully in the second match, the target feature boxes are regenerated, and the regenerated feature boxes are recorded in the active set with identifiers consistent with the identification information already in the active set. For example: suppose the active set contains the two elements current feature box A' and current feature box D'; B is a current feature box without a successful match, B' is a previous feature box with a successful match, C is the current feature box successfully matched with B', and C' is a previous feature box without a successful match. The identifier C of the successfully matched current feature box is changed to the identifier B' and recorded in the active set, yielding the current feature box B'.
  • The active set then contains the three elements current feature box A', current feature box B', and current feature box D', and the previous feature box B' is deleted. For the unmatched current feature box B and the unmatched previous feature box C', the previous feature box C' is deleted, the identifier B of the current feature box is regenerated as E', and the box is recorded in the active set; the active set then contains the four elements current feature box A', current feature box B', current feature box D', and current feature box E', which gives the current frame information. A sketch of this regeneration step follows.
  • Obtaining new image information includes acquiring a new original image, performing real-time detection on the new original image to obtain the next frame information, and adding the current feature boxes in the active set to the new image information; the tracking process is then performed cyclically on the new image information to obtain all tracking results, as shown in FIG. 5.
  • Steps 201 to 205 can also be executed cyclically, so that multiple targets are tracked.
  • Step 205 is optional; in some embodiments, it is only necessary to form the output information from the information of the at least one target with a successful first match and/or a successful second match.
  • Optionally, the current frame information includes current detection information of the multiple targets, and the previous frame information includes historical existence information and corresponding identification information of the multiple targets.
  • Matching the current frame information of the multiple targets with the previous frame information a first time, to determine whether at least one of the multiple targets is matched successfully the first time, includes:
  • calculating the degree of overlap between the current detection information and the historical existence information of the at least one target, and judging from the degree of overlap whether the at least one target is matched successfully the first time.
  • The above current detection information includes the current feature box information obtained by real-time detection of the current original image, together with the generated identifier of each current feature box; the current feature box information includes center coordinate information, width and height information, and confidence information.
  • The identifier of the current feature box may be a unique identifier, such as a unique number or a unique letter.
  • The above historical existence information may be the feature box information present in the previous frame image, and the corresponding identification information is the unique identifier of the feature box present in the previous frame image, which may also be called the unique identifier of the previous feature box.
  • The degree of overlap may be the degree of overlap between the current feature box and the previous feature box, including the degree of overlap of the width and height coordinates and the degree of overlap of the areas.
  • The degree of overlap may also be the similarity of the feature vectors or the similarity of the feature boxes.
  • When the above degree of overlap or similarity is greater than the preset threshold, the at least one target can be judged as matched successfully the first time; when the above degree of overlap or similarity is less than the preset threshold, the first match of the at least one target is judged unsuccessful.
  • Judging from the degree of overlap whether the at least one target is matched successfully includes the following.
  • Each previous feature box is placed among the multiple current feature boxes for matching: the overlapping area of the previous feature box with each current feature box is calculated, and from the overlapping areas the degree of overlap (Intersection-over-Union, IoU) of the previous feature box with each current feature box is computed; the current feature box with the maximum degree of overlap forms a group with the previous feature box, and the maximum degree of overlap of the previous feature box is compared with the preset overlap threshold.
  • For example: A' and B' are identifiers of previous feature boxes, and A and B are identifiers of current feature boxes. If the overlap of A' and A is 0.4 and the overlap of A' and B is 0.8, then A' and B have the maximum overlap and are recorded as a group; if the overlap of B' and A is 0.4 and the overlap of B' and B is 0.5, then B' and B have the maximum overlap and are recorded as a group. If the maximum overlap meets the overlap threshold, the pair is recorded as a successful first match: assuming the overlap threshold is 0.6, A' matches B successfully, while the overlap of B' and B is less than 0.6, so the match between B' and B is unsuccessful. If the maximum overlap does not meet the overlap threshold, the pair is recorded as an unsuccessful first match, and the previous feature box proceeds to step 203 for the second match.
  • Optionally, the current frame information includes current detection information of the multiple targets, and the previous frame information includes historical existence information and corresponding identification information of the multiple targets.
  • Performing the second match to determine whether the at least one target is matched successfully a second time includes the following.
  • The above current detection information includes the current feature box information and identifier, and the historical existence information includes the previous feature box information and identifier.
  • The current feature vector can be obtained by extracting the Histogram of Oriented Gradients (HOG) of the current feature box, and the previous feature vector can be obtained by extracting the HOG of the previous feature box; the cosine similarity between the current HOG feature vector and the previous HOG feature vector is then calculated, as sketched below.
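  • A minimal sketch of this feature-matching step, using scikit-image's HOG extractor as one possible implementation (the patent names no library; the crop size and HOG parameters here are illustrative):

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize

def hog_vector(crop_gray, size=(64, 64)):
    """HOG feature vector of a grayscale crop of a feature box."""
    patch = resize(crop_gray, size, anti_aliasing=True)  # common size for both crops
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def cosine_similarity(u, v):
    """Cosine similarity between the current and previous HOG feature vectors."""
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(np.dot(u, v) / denom) if denom > 0 else 0.0
```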
  • The current feature box includes the current center coordinate information and the current width and height information; the previous feature box includes the previous center coordinate information and the previous width and height information.
  • The distance value D between the current feature box and the previous feature box can be calculated from the following quantities:
  • D is the distance between the current feature box and the previous feature box;
  • x1, y1, w1 belong to the current feature box: x1 and y1 are the center coordinates of the current feature box, and w1 is its width;
  • x2, y2, w2 belong to the previous feature box: x2 and y2 are the center coordinates of the previous feature box, and w2 is its width.
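  • The formula itself does not survive in this text; given that only the box centers (x1, y1), (x2, y2) and the widths w1, w2 enter it, one plausible reading is a center-to-center distance normalized by the box widths. A sketch under that assumption only:

```python
import math

def distance_value(curr, prev):
    """Assumed form of D: Euclidean distance between the two box centers,
    normalized by the mean of the two box widths. This is an illustration,
    not the patent's exact formula. curr = (x1, y1, w1), prev = (x2, y2, w2)."""
    (x1, y1, w1), (x2, y2, w2) = curr, prev
    return math.hypot(x1 - x2, y1 - y2) / ((w1 + w2) / 2)
```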
  • Judging from the feature value and the distance value whether the at least one target is matched successfully a second time includes the following.
  • The judgment rule includes: when the cosine similarity is greater than the preset cosine similarity threshold, the second match can be considered successful. Alternatively, when the distance value is less than the preset distance threshold, the second match can also be considered successful. Of course, the second match can also be considered successful only when the cosine similarity is greater than the preset cosine similarity threshold and the distance value is less than the preset distance threshold. A sketch of this decision rule follows.
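  • A minimal sketch of the three variants of this judgment rule (the function name, the mode switch, and the threshold values are illustrative, not from the patent):

```python
def second_match_success(cos_sim, dist, cos_threshold=0.8, dist_threshold=1.0,
                         mode="either"):
    """Second-match decision: by cosine similarity alone, by distance alone,
    or requiring both, matching the three variants described above."""
    by_feature = cos_sim > cos_threshold   # feature matching criterion
    by_distance = dist < dist_threshold    # distance matching criterion
    if mode == "feature":
        return by_feature
    if mode == "distance":
        return by_distance
    if mode == "both":
        return by_feature and by_distance
    return by_feature or by_distance       # "either": at least one criterion holds
```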
  • Forming the information of the at least one target with a successful first match and/or a successful second match into output information includes:
  • the output information of the at least one target is formed according to the current presence information and the corresponding identification information.
  • The above current presence information includes the current feature box. For example, if the matching between the current feature box A and the previous feature box A' is successful, the information of the current feature box A is output.
  • The information of the current feature box includes the center coordinate information, width and height information, etc. of the current feature box.
  • The above identification information includes the identification information of the current feature box, which is used to represent the current feature box; for example, if the current feature box is A, A is output, and if the current feature box is A', A' is output. The identification information of the current feature box is associated with the current feature box.
  • In step 202, if a target is matched successfully, the information of the successfully matched target may be placed in an active set, and the information of unsuccessfully matched targets may be placed in a lost set.
  • In step 203, after the targets in the lost set undergo the second match, the successfully matched targets are obtained; their information can be added to the active set, and the information of the targets in the active set is output.
  • For example, A is the identifier of the current feature box, A' is the identifier of the previous feature box, and A and A' are a pair of current and previous feature boxes that matched successfully. The identifier of the current feature box is changed from A to A', the previous feature box is then deleted from the image information, and the current feature box A' is recorded in the active set. The output information is then the identifier A' of the current feature box together with the center coordinate information, width and height information, and other information of the current feature box.
  • A multi-target real-time tracking device includes:
  • an obtaining module 401, used to obtain image information, where the image information includes current frame information and previous frame information of multiple targets;
  • a first matching module 402, configured to perform a first match based on the current frame information and the previous frame information of the multiple targets, and determine whether at least one of the multiple targets is matched successfully a first time;
  • a second matching module 403, configured to perform a second match if the at least one target was not matched successfully the first time, and determine whether the at least one target is matched successfully a second time, where the second match includes at least one of feature matching and distance matching;
  • an output module 404, configured to form the information of the at least one target with a successful first match and/or a successful second match into output information, where the output information includes current presence information and identification information.
  • The device further includes:
  • a generating module 405, configured to regenerate the current frame information of the remaining targets to obtain new image information if the second match is unsuccessful, where the new image information includes the current frame information and the next frame information.
  • Optionally, the current frame information includes current detection information of the multiple targets, and the previous frame information includes historical existence information and corresponding identification information of the multiple targets.
  • The first matching module 402 includes:
  • a first processing unit 4021, configured to calculate the degree of overlap between the current detection information of at least one of the multiple targets and the historical existence information of at least one of the multiple targets, to obtain the degree of overlap between the current detection information and the historical existence information of the at least one target;
  • a first judging unit 4022, configured to judge from the degree of overlap whether the at least one target is matched successfully the first time.
  • The first judging unit 4022 includes:
  • a comparison subunit 40221, used to select the maximum degree of overlap and compare it with a preset overlap threshold, to determine whether the maximum degree of overlap is greater than the overlap threshold;
  • a judging subunit 40222, configured to record a successful first match if the maximum degree of overlap is greater than the overlap threshold, and an unsuccessful first match if the maximum degree of overlap is less than the overlap threshold.
  • Optionally, the current frame information includes current detection information of the multiple targets, and the previous frame information includes historical existence information and corresponding identification information of the multiple targets.
  • The second matching module 403 includes:
  • a second processing unit 4031, configured to extract the current feature vector of the current detection information of at least one of the multiple targets, extract the historical feature vector of the historical existence information of at least one of the multiple targets, and compute the current feature vector against the historical feature vector to obtain the cosine similarity of the at least one target;
  • a third processing unit 4032, configured to extract the current coordinates of the current detection information of the at least one target and the historical coordinates of the historical existence information of the at least one target, and compute the current coordinates against the historical coordinates to obtain the distance value of the at least one target;
  • a second judging unit 4033, configured to judge from the cosine similarity and the distance value of the at least one target whether the at least one target is matched successfully a second time.
  • The output module 404 includes:
  • an updating unit 4041, configured to update the current detection information of the at least one target to the current presence information, and associate the corresponding identification information of the at least one target with the current detection information;
  • an output unit 4042, configured to form the output information of the at least one target according to the current presence information and the corresponding identification information.
  • An embodiment of the present invention provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the computer program, the steps in the multi-target real-time tracking method provided by the embodiments of the present invention are implemented.
  • An embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored; when the computer program is executed by a processor, the steps in the multi-target real-time tracking method provided by the embodiments of the present invention are implemented.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a multi-target real-time tracking method and apparatus, and an electronic device. The method comprises: acquiring image information (101), the image information comprising current frame information and previous frame information of multiple targets; matching the current frame information of the multiple targets against the previous frame information a first time, and determining whether at least one target among the multiple targets is matched successfully a first time (102); if the at least one target has not been matched successfully the first time, performing a second match, and determining whether the at least one target is matched successfully a second time (103); and forming the information of the at least one target that is matched successfully the first time and/or matched successfully the second time into output information (104), the output information comprising current existence information and identification information. Performing matching on at least one target twice within one frame image can increase the tracking accuracy rate.
PCT/CN2018/111589 2018-10-24 2018-10-24 Multi-target real-time tracking method and apparatus, and electronic device WO2020082258A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880083620.2A CN111512317B (zh) 2018-10-24 2018-10-24 Multi-target real-time tracking method and device, and electronic device
PCT/CN2018/111589 WO2020082258A1 (fr) 2018-10-24 2018-10-24 Multi-target real-time tracking method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/111589 WO2020082258A1 (fr) 2018-10-24 2018-10-24 Multi-target real-time tracking method and apparatus, and electronic device

Publications (1)

Publication Number Publication Date
WO2020082258A1 (fr) 2020-04-30

Family

ID=70330247

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/111589 WO2020082258A1 (fr) 2018-10-24 2018-10-24 Multi-target real-time tracking method and apparatus, and electronic device

Country Status (2)

Country Link
CN (1) CN111512317B (fr)
WO (1) WO2020082258A1 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112070802B (zh) * 2020-09-02 2024-01-26 合肥英睿系统技术有限公司 Target tracking method, device, equipment, and computer-readable storage medium
CN114185034A (zh) * 2020-09-15 2022-03-15 郑州宇通客车股份有限公司 Millimeter-wave radar target tracking method and system


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104424638A (zh) * 2013-08-27 2015-03-18 深圳市安芯数字发展有限公司 Target tracking method under occlusion conditions
CN104517275A (zh) * 2013-09-27 2015-04-15 株式会社理光 Object detection method and system
CN104765886A (zh) * 2015-04-29 2015-07-08 百度在线网络技术(北京)有限公司 Image-based information acquisition method and device
CN108664930A (zh) * 2018-05-11 2018-10-16 西安天和防务技术股份有限公司 Intelligent multi-target detection and tracking method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106033613A (zh) * 2015-03-16 2016-10-19 北京大学 Target tracking method and device
CN104778465A (zh) * 2015-05-06 2015-07-15 北京航空航天大学 Target tracking method based on feature point matching
CN106097391A (zh) * 2016-06-13 2016-11-09 浙江工商大学 Recognition-assisted multi-target tracking method based on a deep neural network
CN106203274A (zh) * 2016-06-29 2016-12-07 长沙慧联智能科技有限公司 System and method for real-time pedestrian detection in video surveillance
CN107316322A (zh) * 2017-06-27 2017-11-03 上海智臻智能网络科技股份有限公司 Video tracking method and device, and object recognition method and device
CN108154118A (zh) * 2017-12-25 2018-06-12 北京航空航天大学 Target detection system and method based on adaptive combined filtering and multi-level detection

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914653A (zh) * 2020-07-02 2020-11-10 泰康保险集团股份有限公司 Personnel marking method and device
CN111914653B (zh) * 2020-07-02 2023-11-07 泰康保险集团股份有限公司 Personnel marking method and device
CN112037256A (zh) * 2020-08-17 2020-12-04 中电科新型智慧城市研究院有限公司 Target tracking method and device, terminal equipment, and computer-readable storage medium
CN112084914B (zh) * 2020-08-31 2024-04-26 的卢技术有限公司 Multi-target tracking method fusing spatial motion and appearance feature learning
CN112084914A (zh) * 2020-08-31 2020-12-15 的卢技术有限公司 Multi-target tracking method fusing spatial motion and appearance feature learning
CN112101223A (zh) * 2020-09-16 2020-12-18 北京百度网讯科技有限公司 Detection method, device, equipment, and computer storage medium
CN112101223B (zh) * 2020-09-16 2024-04-12 阿波罗智联(北京)科技有限公司 Detection method, device, equipment, and computer storage medium
CN112634327A (zh) * 2020-12-21 2021-04-09 合肥讯图信息科技有限公司 Tracking method based on the YOLOv4 model
CN113238209A (zh) * 2021-04-06 2021-08-10 宁波吉利汽车研究开发有限公司 Road perception method, system, equipment, and storage medium based on millimeter-wave radar
CN113238209B (zh) * 2021-04-06 2024-01-16 宁波吉利汽车研究开发有限公司 Road perception method, system, equipment, and storage medium based on millimeter-wave radar
CN113344975A (zh) * 2021-06-24 2021-09-03 西安天和防务技术股份有限公司 Multi-target tracking method and device, and electronic equipment
CN113361456B (zh) * 2021-06-28 2024-05-07 北京影谱科技股份有限公司 Face recognition method and system
CN113361456A (zh) * 2021-06-28 2021-09-07 北京影谱科技股份有限公司 Face recognition method and system
CN113723311A (zh) * 2021-08-31 2021-11-30 浙江大华技术股份有限公司 Target tracking method
CN114155275A (zh) * 2021-11-17 2022-03-08 深圳职业技术学院 Fish tracking method and device based on IOU-Tracker
CN115223135B (zh) * 2022-04-12 2023-11-21 广州汽车集团股份有限公司 Parking space tracking method and device, vehicle, and storage medium
CN115223135A (zh) * 2022-04-12 2022-10-21 广州汽车集团股份有限公司 Parking space tracking method and device, vehicle, and storage medium

Also Published As

Publication number Publication date
CN111512317B (zh) 2023-06-06
CN111512317A (zh) 2020-08-07

Similar Documents

Publication Publication Date Title
WO2020082258A1 Multi-target real-time tracking method and apparatus, and electronic device
CN108960211B Multi-target human pose detection method and system
CN105164700B Detecting objects in visual data using a probabilistic model
CN104866414B Application testing method, device, and system
WO2016034059A1 Target object tracking method using color-structure features
WO2019242672A1 Target tracking method, device and system
CN109426785B Human target identity recognition method and device
CN109919002B Yellow no-parking line recognition method and device, computer equipment, and storage medium
CN112016402B Unsupervised-learning-based domain adaptation method and device for pedestrian re-identification
CN111382637B Pedestrian detection and tracking method, device, terminal equipment, and medium
CN111354022B Target tracking method and system based on kernel correlation filtering
KR20120044484A Apparatus and method for object tracking in an image processing system
CN111553234A Pedestrian tracking method and device fusing face features and Re-ID feature ranking
WO2019033575A1 Electronic device, face tracking method and system, and storage medium
JP2022540101A Positioning method and device, electronic equipment, and computer-readable storage medium
CN108875506B Face shape point tracking method, device, system, and storage medium
US11594073B2 Face recognition method and face recognition device
JP2009129237A Image processing apparatus and method
JP2015204023A Subject detection apparatus, subject detection method, and program
JP5848665B2 Moving-object motion vector detection apparatus, moving-object motion vector detection method, and program
JP5931646B2 Image processing apparatus
US11741151B1 Indexing key frames for localization
JP2015007919A Program, apparatus, and method for achieving high-accuracy geometric verification between images from different viewpoints
JP2010113562A Object detection and tracking apparatus, object detection and tracking method, and object detection and tracking program
WO2017179728A1 Image recognition device, image recognition method, and image recognition program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18937754

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31/08/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18937754

Country of ref document: EP

Kind code of ref document: A1