CN111401285B - Target tracking method and device and electronic equipment

Target tracking method and device and electronic equipment

Info

Publication number: CN111401285B (application number CN202010210615.2A)
Authority: CN (China)
Prior art keywords: target, tracked, determining, angle, orientation
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN111401285A
Inventor: 徐越
Current Assignee: Beijing Megvii Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Beijing Megvii Technology Co Ltd
Application filed by Beijing Megvii Technology Co Ltd
Priority to CN202010210615.2A
Publication of CN111401285A (application publication)
Application granted
Publication of CN111401285B (grant publication)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a target tracking method, a target tracking device and an electronic device, relating to the technical field of image processing. The method includes the following steps: first, acquiring an image to be identified, where the image to be identified contains targets to be tracked; then, processing the image to be identified to determine the orientation information of each target to be tracked in the image, where the orientation information indicates the orientation of the target to be tracked; and finally, determining the tracked target to which each target to be tracked belongs according to the orientation information of that target. Determining the tracked target to which each target to be tracked belongs according to its orientation information alleviates the technical problems of frequent mismatches and low tracking precision in the prior art.

Description

Target tracking method and device and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a target tracking method, a target tracking device, and an electronic device.
Background
Multi-target tracking plays an important role in the field of security monitoring, and its tracking effect depends heavily on the performance of the pedestrian re-identification model. Existing pedestrian re-identification models judge pedestrians only by apparent features such as body color and attributes, so when a pedestrian whose appearance resembles a tracked target appears in the picture, a high similarity value can still be obtained. Therefore, in a multi-target tracking scenario, directly using an existing pedestrian re-identification model for similarity measurement is likely to associate different but similar-looking people into the same track by mistake, causing frequent mismatches and low tracking precision.
Disclosure of Invention
In view of the above, the present invention aims to provide a target tracking method, a target tracking device and an electronic device, so as to alleviate the technical problems of frequent mismatches and low tracking precision in existing target tracking methods.
In a first aspect, an embodiment of the present invention provides a target tracking method, including: acquiring an image to be identified; wherein the image to be identified comprises a target to be tracked; processing the image to be identified, and determining the orientation information of each target to be tracked in the image to be identified; wherein the orientation information indicates an orientation of the target to be tracked; and determining the tracked target to which each target to be tracked belongs according to the orientation information of each target to be tracked.
Further, determining the tracked target to which each target to be tracked belongs according to the orientation information of each target to be tracked includes: determining an estimated orientation angle of each target to be tracked based on the orientation information; and determining the tracked target to which each target to be tracked belongs based on the estimated orientation angle.
Further, the orientation information includes probabilities that the actual orientation of the target to be tracked is each of a plurality of preset orientations, and determining the estimated orientation angle of each target to be tracked based on the orientation information includes: selecting the N largest probabilities among the probabilities of the plurality of preset orientations to obtain N target probabilities; and determining the estimated orientation angle of each target to be tracked based on the N preset orientation angles corresponding to the N target probabilities.
Further, determining the estimated orientation angle of each target to be tracked based on the N preset orientation angles corresponding to the N target probabilities includes: performing a weighted summation of the N target probabilities and the angles of the N preset orientations, and determining the weighted summation result as the estimated orientation angle of each target to be tracked.
Further, determining the estimated orientation angle of each target to be tracked based on the N preset orientation angles corresponding to the N target probabilities further includes: if the N preset orientations include a pair of opposite orientations, determining the angle of the preset orientation corresponding to the maximum of the N target probabilities as the estimated orientation angle of each target to be tracked.
Further, the plurality of preset orientations include: forward, left, right and backward; the angle corresponding to the forward direction includes a first angle and a second angle, where the first angle is greater than the second angle and the difference between the first angle and the second angle is 360 degrees. The method further includes: if the N preset orientations corresponding to the N target probabilities include the forward direction and the left direction, determining the forward angle among the N preset orientations to be the first angle; and if the N preset orientations corresponding to the N target probabilities include the forward direction and the right direction, determining the forward angle among the N preset orientations to be the second angle.
Further, processing the image to be identified and determining the orientation information of each target to be tracked in the image to be identified includes: processing the image to be identified through a target re-identification model to obtain the orientation information of each target to be tracked in the image to be identified.
Further, the target re-identification model includes: at least one residual unit, a first bottleneck unit and a second bottleneck unit, where the output end of the at least one residual unit is connected to the input ends of the first bottleneck unit and the second bottleneck unit respectively. Processing the image to be identified through the target re-identification model to obtain the orientation information of each target to be tracked in the image to be identified includes the following steps: processing the image to be identified through the at least one residual unit to obtain a target feature map; and performing bottleneck processing on the target feature map through the second bottleneck unit to obtain the orientation information of each target to be tracked in the image to be identified, where the first bottleneck unit is used to perform bottleneck processing on the target feature map to obtain feature information of the image to be identified.
Further, determining, based on the estimated orientation angle, the tracked target to which each target to be tracked belongs includes: calculating the angle difference between the estimated orientation angle and the orientation angle of at least one tracked target; calculating a feature similarity score between each target to be tracked and the at least one tracked target; and determining the tracked target to which each target to be tracked belongs based on the angle difference and the feature similarity score.
Further, determining the tracked target to which each target to be tracked belongs based on the angle difference and the feature similarity score includes: calculating the product of the angle difference and the feature similarity score, and determining the tracked target to which each target to be tracked belongs based on the product result.
Further, there are a plurality of tracked targets, and determining the tracked target to which each target to be tracked belongs based on the product results includes: determining the tracked target corresponding to the maximum among the multiple product results; and determining the tracked target corresponding to the maximum product result as the tracked target to which each target to be tracked belongs.
In a second aspect, an embodiment of the present invention provides a target tracking apparatus, including: the acquisition module is used for acquiring the image to be identified; wherein the image to be identified comprises a target to be tracked; the first determining module is used for processing the image to be identified and determining the orientation information of each target to be tracked in the image to be identified; wherein the orientation information indicates an orientation of the target to be tracked; and the second determining module is used for determining the tracked target to which each target to be tracked belongs according to the orientation information of each target to be tracked.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory and a processor, where the memory stores a computer program executable on the processor, and the processor implements the steps of the above target tracking method when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a computer readable medium having non-volatile program code executable by a processor, where the program code causes the processor to perform the above target tracking method.
In the embodiment of the invention, an image to be identified is first acquired, where the image to be identified contains targets to be tracked; the image to be identified is then processed to determine the orientation information of each target to be tracked in it, where the orientation information indicates the orientation of the target to be tracked; and finally the tracked target to which each target to be tracked belongs is determined according to the orientation information of each target to be tracked.
According to the embodiments of the present application, the orientation information of each target to be tracked in the image to be identified is determined, and the tracked target to which each target to be tracked belongs is then determined according to that orientation information. This alleviates the problems of frequent mismatches and low tracking precision in existing target tracking methods, thereby reducing the number of mismatches and improving tracking precision.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 2 is a flowchart of a target tracking method according to an embodiment of the present invention;
FIG. 3 is a flowchart of a first alternative target tracking method according to an embodiment of the present invention;
FIG. 4 is a flowchart of a second alternative target tracking method according to an embodiment of the present invention;
FIG. 5 is a flow chart of a third alternative target tracking method provided by an embodiment of the present invention;
FIG. 6 is a flow chart of a fourth alternative target tracking method provided by an embodiment of the present invention;
FIG. 7 is a flowchart of a fifth alternative target tracking method according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a target tracking apparatus according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the present invention will be described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments.
In a multi-target tracking scenario, directly using an existing pedestrian re-identification model for similarity measurement is likely to associate different but similar-looking people into the same track by mistake, causing frequent mismatches and low tracking precision.
The inventor found that target tracking scenes have the following characteristic: the orientation of a human body does not change significantly within a few frames. That is, in a video, the orientation information of each target to be tracked barely changes between adjacent frames. Based on this, embodiments of the present invention provide a target tracking method, a target tracking device and an electronic device, which determine the orientation information of each target to be tracked in the image to be identified, and then determine the tracked target to which each target to be tracked belongs according to that orientation information. By doing so, the present application reduces the number of mismatches and improves tracking precision to a certain extent.
For the sake of understanding the present embodiment, first, a detailed description is given of a target tracking method disclosed in the embodiment of the present invention.
Embodiment 1:
first, an electronic device 100 for implementing an embodiment of the present invention, which can be used to run the object tracking method of the embodiments of the present invention, will be described with reference to fig. 1.
As shown in fig. 1, electronic device 100 includes one or more processors 102, one or more memories 104, an input device 106, an output device 108, and an image capture device 110, which are interconnected by a bus system 112 and/or other forms of connection mechanisms (not shown). It should be noted that the components and structures of the electronic device 100 shown in fig. 1 are exemplary only and not limiting, as the electronic device may have other components and structures as desired.
The processor 102 may be implemented in hardware as at least one of a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic array (PLA) or an application-specific integrated circuit (ASIC). The processor 102 may be a central processing unit (CPU) or another form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device 100 to perform desired functions.
The memory 104 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), hard disks, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 102 may run them to implement the client functions and/or other desired functions in the embodiments of the present invention described below. Various applications and data, such as data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input instructions and may include one or more of a keyboard, mouse, microphone, touch screen, and the like.
The output device 108 may output various information (e.g., images or sounds) to the outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
The image acquisition device 110 is configured to acquire the image to be identified, and the data it acquires is used to determine, through the target tracking method, whether a target to be tracked belongs to a tracked target.
Embodiment 2:
fig. 2 is a flowchart of a target tracking method according to an embodiment of the present invention. As shown in fig. 2, the method comprises the steps of:
step S202, obtaining an image to be identified; the image to be identified comprises an object to be tracked.
In the embodiment of the present invention, the image to be identified may be a single image, or one frame in a preset number of consecutive frames, and the target to be tracked may be a pedestrian to be tracked. If the image to be identified is one frame in a sequence of consecutive frames, the number of targets to be tracked in it may be the same as or different from the number in the previous frame. The embodiment of the present invention therefore does not specifically limit how the image to be identified is acquired, nor the number of targets to be tracked it contains.
Step S204, processing the image to be identified, and determining the orientation information of each target to be tracked in the image to be identified; wherein the orientation information indicates an orientation of the target to be tracked.
Step S206, determining the tracked target to which each target to be tracked belongs according to the orientation information of each target to be tracked.
In the embodiment of the present invention, in step S206, the target to be tracked may be associated with the tracked targets through data association in the process of determining the tracked target to which each target to be tracked belongs. The data association may compare the orientation information of each tracked target with the orientation information of each target to be tracked contained in the image to be identified. If the difference between the angle corresponding to the orientation information of a target to be tracked and the angle corresponding to the orientation information of a certain tracked target is smaller than a certain threshold, it can be preliminarily determined that the target to be tracked belongs to that tracked target. It should be understood that if the angle differences between several tracked targets and the target to be tracked are all smaller than the threshold, the target to be tracked belongs to one of those tracked targets; in this case, the similarity scores between the target to be tracked and those tracked targets can be further computed, and the target to be tracked is determined to belong to the tracked target with the highest similarity score (i.e., the closest appearance).
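This gating-then-tie-breaking logic can be sketched as follows. This is a minimal illustration rather than the patent's literal implementation; the function name `associate` and the 45-degree threshold are assumptions for the example:

```python
def associate(track_angles, det_angle, det_similarities, angle_thresh=45.0):
    """Gate candidate tracks by orientation-angle difference, then break
    ties with the appearance similarity score, as described above."""
    # Keep only tracked targets whose orientation angle is close enough.
    candidates = [i for i, a in enumerate(track_angles)
                  if abs(a - det_angle) < angle_thresh]
    if not candidates:
        return None  # no tracked target matches; a new track may be started
    # Among the remaining candidates, pick the highest similarity score.
    return max(candidates, key=lambda i: det_similarities[i])
```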
As can be seen from the above description, when the appearance of a target to be tracked is similar to that of a tracked target, comparing only the features of the tracked target with the features of each target to be tracked may lead to misjudgment. Determining the tracked target to which a target to be tracked belongs based on the orientation information of the target to be tracked reduces this possibility.
In the embodiment of the invention, an image to be identified is first acquired, where the image to be identified contains targets to be tracked; the image to be identified is then processed to determine the orientation information of each target to be tracked in it, where the orientation information indicates the orientation of the target to be tracked; and finally the tracked target to which each target to be tracked belongs is determined according to the orientation information of each target to be tracked. Determining the tracked target in this way reduces the number of mismatches and improves tracking precision.
In an alternative embodiment, as shown in fig. 3, step S206 includes the steps of:
step S301, determining an estimated orientation angle of each target to be tracked based on the orientation information.
In step S302, a tracked target to which each target to be tracked belongs is determined based on the estimated orientation angle.
In the embodiment of the invention, mapping the discrete orientation information to an estimated orientation angle and determining the tracked target to which each target to be tracked belongs based on that angle improves tracking precision.
In an alternative embodiment, the orientation information includes probabilities that the actual orientation of the target to be tracked is each of a plurality of preset orientations. As shown in fig. 4, step S301 includes the following steps:
step S401, selecting the first N maximum probabilities from a plurality of probabilities of preset orientations to obtain N target probabilities.
In the embodiment of the present invention, the preset orientations may be the four cardinal orientations of forward, left, right and backward, and may be extended with four more: a first direction between forward and right, a second direction between right and backward, a third direction between backward and left, and a fourth direction between forward and left. The present application does not specifically limit the number of preset orientations or the angle of each preset orientation; users may set them according to actual needs.
The specific value of N is not limited in the embodiment of the present invention, but N must be less than or equal to the number of preset orientations. For example, suppose the four preset orientations of forward, right, backward and left are predefined, and the probabilities of the target to be tracked in these four orientations are 0.8, 0.2, 0 and 0 respectively. If N is 1, the single target probability is 0.8. If N is 2, the 2 target probabilities are 0.8 and 0.2; since 0.8 is the forward probability and 0.2 is the right probability, the 2 corresponding preset orientations are forward and right. Here N may also take the value 3 or 4, with 4 being the maximum N can take in this example.
Step S402, determining an estimated orientation angle of each target to be tracked based on N preset orientation angles corresponding to the N target probabilities.
In an alternative embodiment, the specific operation of step S402 is as follows: performing a weighted summation of the N target probabilities and the angles of the N preset orientations, and determining the weighted summation result as the estimated orientation angle of each target to be tracked.
In the embodiment of the present invention, taking pedestrian orientation as an example, a pedestrian may face a direction between any two preset orientations, which easily causes misjudgment: for example, a pedestrian whose orientation lies between forward and right, but closer to right, is easily classified as forward by the pedestrian re-identification model. To determine the real orientation of the target to be tracked accurately, the embodiment of the present invention determines the estimated orientation angle of the target to be tracked based on the angles of the N preset orientations corresponding to the N target probabilities; this estimated orientation angle represents the real orientation of the target to be tracked.
Each preset orientation may be assigned an angle before the estimated orientation angle of each target to be tracked is determined. If the four preset orientations of forward, right, backward and left are predefined, their angles may be: 0 degrees or 360 degrees for forward, 90 degrees for right, 180 degrees for backward, and 270 degrees for left, where the first angle may be 360 degrees and the second angle may be 0 degrees.
According to the embodiment of the invention, the N target probabilities may first be normalized, the normalized target probabilities may then be weight-summed with the angles of the N preset orientations, and the weighted summation result is determined as the estimated orientation angle of each target to be tracked.
For example, suppose the probabilities of the target to be tracked in the four preset orientations of forward, right, backward and left are 0.8, 0.2, 0 and 0 respectively, and N is 2. The weighted summation of the 2 target probabilities and the angles of the 2 corresponding preset orientations gives 0° × 0.8 + 90° × 0.2 = 18°.
For another example, suppose the probabilities of the target to be tracked in the four preset orientations of forward, backward, left and right are 0.07, 1.8e-4, 4.2e-4 and 0.93 respectively, and N is 2. The 2 largest target probabilities are first normalized, and the normalized probabilities are then weight-summed with the angles of their 2 corresponding preset orientations, giving a result of 83.7°.
For another example, suppose the probabilities of the target to be tracked in the four preset orientations of forward, backward, left and right are 3.5e-3, 0.59, 0.40 and 2.2e-3 respectively, and N is 2. The 2 largest target probabilities are first normalized, and the normalized probabilities are then weight-summed with the angles of their 2 corresponding preset orientations, giving a result of approximately 216.6°.
In an alternative embodiment, step S402 further includes the following step: if the N preset orientations include a pair of opposite orientations, determining the angle of the preset orientation corresponding to the maximum of the N target probabilities as the estimated orientation angle of each target to be tracked.
For example, suppose the probabilities of the target to be tracked in the four preset orientations of forward, right, backward and left are 0.9, 0, 0.1 and 0 respectively. Since forward and backward are opposite orientations and 0.9 is greater than 0.1, the forward angle of 0° is taken as the estimated orientation angle of the target to be tracked.
For another example, suppose that among the four preset orientations of forward, backward, left and right, the probabilities of the target to be tracked in the forward and backward directions are 0.12 and 0.80 respectively with the remaining probability mass being small, and N is 2. The two preset orientations corresponding to the two largest probabilities are forward and backward, which are opposite directions; since 0.80 is greater than 0.12, the backward angle is taken as the estimated orientation angle of the target to be tracked.
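The orientation-angle estimation of steps S401 and S402, including the opposite-orientation rule above, can be sketched as below. This is a minimal sketch reproducing the worked examples, not the patent's literal implementation; the function name and argument layout are assumptions, and the forward-angle adjustment of steps S501/S502 further below is handled separately:

```python
def estimate_orientation_angle(probs, angles, n=2):
    """Sketch: top-N selection, normalization and weighted summation."""
    # Indices of the N largest probabilities (the N target probabilities).
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:n]
    # Opposite-orientation rule: if two selected orientations are 180 degrees
    # apart, fall back to the angle of the single most probable orientation.
    for i in top:
        for j in top:
            if i != j and abs(angles[i] - angles[j]) % 360 == 180:
                return angles[max(top, key=lambda k: probs[k])]
    # Otherwise normalize the selected probabilities and weight-sum the angles.
    total = sum(probs[i] for i in top)
    return sum(probs[i] / total * angles[i] for i in top)

# Reproduces the examples above (angles: forward 0, right 90, backward 180, left 270).
assert estimate_orientation_angle([0.8, 0.2, 0.0, 0.0], [0, 90, 180, 270]) == 18.0
assert estimate_orientation_angle([0.9, 0.0, 0.1, 0.0], [0, 90, 180, 270]) == 0
```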
In an alternative embodiment, the plurality of preset orientations include forward, left, right and backward, and the angle corresponding to the forward direction includes a first angle and a second angle, where the first angle is greater than the second angle and the difference between the first angle and the second angle is 360 degrees; for example, the first angle may be 360 degrees and the second angle may be 0 degrees.
In an embodiment of the present invention, as shown in fig. 5, the method further includes the following steps:
In step S501, if the N preset orientations corresponding to the N target probabilities include the forward direction and the left direction, the forward angle among the N preset orientations is determined to be the first angle.
In the embodiment of the present invention, taking the above four preset orientations as an example, all preset orientations can be mapped onto a rectangular coordinate system: the positive direction of the vertical axis denotes backward, the negative direction of the horizontal axis denotes left, the positive direction of the horizontal axis denotes right, and the negative direction of the vertical axis denotes forward. For example, if the forward angle is set to 0 degrees, then by anticlockwise rotation the right, backward and left angles are 90 degrees, 180 degrees and 270 degrees in turn, and continuing the rotation from left back to forward makes the forward angle 360 degrees; therefore, when the N preset orientations include forward and left, the forward angle is determined to be the first angle, 360°. As another example, if the forward angle is set to 30 degrees, then by anticlockwise rotation the right, backward and left angles are 120 degrees, 210 degrees and 300 degrees in turn, and continuing the rotation from left back to forward makes the forward angle 390 degrees; in that case, when the N preset orientations include forward and left, the forward angle is determined to be the first angle, 390°. Taking the first angle of 360° as an example, suppose the probabilities of the target to be tracked in the four preset orientations of forward, right, backward and left are 0.2, 0, 0 and 0.8 respectively, and N is 2. The 2 preset orientations corresponding to the 2 target probabilities (0.2 and 0.8) are forward and left, so the forward angle is taken as 360° while the left angle remains 270°, and the weighted summation gives 360° × 0.2 + 270° × 0.8 = 288°.
In step S502, if the N preset orientations corresponding to the N target probabilities include the forward direction and the right direction, the forward angle among the N preset orientations is determined to be the second angle.
In the embodiment of the present invention, when the first angle of the forward direction is 360 degrees, the second angle is 0 degrees; when the first angle is 390 degrees, the second angle is 30 degrees. Suppose the probabilities of the target to be tracked in the four preset orientations of forward, right, backward and left are 0.2, 0.8, 0 and 0 respectively, and N is 2. The 2 preset orientations corresponding to the 2 target probabilities (0.2 and 0.8) are forward and right, so the forward angle is taken as 0° while the right angle remains 90°, and the weighted summation gives 0° × 0.2 + 90° × 0.8 = 72°.
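The forward-angle selection of steps S501 and S502 can be sketched as below. A minimal sketch assuming the 360°/0° angle pair; the string labels and function name are illustrative, not from the patent:

```python
def forward_angle(selected_orientations, first_angle=360.0, second_angle=0.0):
    """Pick the forward angle depending on which neighbour was selected."""
    if "forward" in selected_orientations and "left" in selected_orientations:
        # Step S501: forward together with left uses the first (larger) angle,
        # e.g. 360 * 0.2 + 270 * 0.8 = 288 degrees in the example above.
        return first_angle
    # Step S502: forward together with right uses the second angle,
    # e.g. 0 * 0.2 + 90 * 0.8 = 72 degrees in the example above.
    return second_angle
```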
In the embodiment of the present invention, determining the estimated orientation angle of each target to be tracked from the N preset orientation angles has the following advantages: for a target to be tracked whose actual orientation lies between two preset orientations, the discrete preset orientations can be mapped to a continuous estimated orientation angle, ensuring the accuracy of the estimated orientation angle. In addition, when different body parts of the target to be tracked face different angles, orientation angles can be estimated separately for the different body parts, which reduces misjudgments to a certain extent.
In an alternative embodiment, step S204 includes the following step: processing the image to be identified through the target re-identification model to obtain the orientation information of each target to be tracked in the image to be identified.
In an alternative embodiment, the target re-identification model includes: at least one residual unit, a first bottleneck unit and a second bottleneck unit, where the output end of the at least one residual unit is connected to the input ends of the first bottleneck unit and the second bottleneck unit respectively. As shown in fig. 6, based on this target re-identification model, step S204 includes the following steps:
Step S601, processing the image to be identified through the at least one residual unit to obtain a target feature map.
Step S602, performing bottleneck processing on the target feature map through a second bottleneck unit to obtain the orientation information of each target to be tracked in the image to be identified; the first bottleneck unit is used for carrying out bottleneck processing on the target feature map to obtain feature information of the image to be identified.
The target re-identification model in the embodiment of the present invention includes a pedestrian re-identification model. The pedestrian re-identification model may use the ResNet50 network structure as its backbone, comprising 4 residual units, with two new branches introduced after the 4th residual unit: a first BottleNeck unit and a second BottleNeck unit. On the basis of the original branch of the ResNet50 structure, the first BottleNeck unit is supervised with a Softmax cross-entropy loss and a triplet loss, while the second BottleNeck unit uses a cross-entropy loss to supervise the probabilities of each pedestrian to be identified in the image to be identified in the four preset orientations of forward, backward, left and right.
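A minimal sketch of such a two-branch network is given below, assuming a PyTorch/torchvision implementation; the layer sizes, head layout and class name are assumptions for illustration, not the patent's literal network definition:

```python
import torch.nn as nn
import torchvision

class ReIDWithOrientation(nn.Module):
    """Two-branch re-identification sketch: a ResNet-50 backbone followed by
    an appearance branch and an orientation branch, as described above."""

    def __init__(self, num_identities, num_orientations=4, feat_dim=2048):
        super().__init__()
        resnet = torchvision.models.resnet50(weights=None)
        # Backbone: everything up to and including the 4th residual stage.
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])
        self.pool = nn.AdaptiveAvgPool2d(1)
        # First bottleneck branch: appearance feature for re-identification
        # (trained with softmax cross-entropy plus a triplet loss).
        self.feat_bottleneck = nn.BatchNorm1d(feat_dim)
        self.id_head = nn.Linear(feat_dim, num_identities)
        # Second bottleneck branch: probabilities over the preset orientations
        # (forward / backward / left / right), trained with cross-entropy.
        self.orient_bottleneck = nn.BatchNorm1d(feat_dim)
        self.orient_head = nn.Linear(feat_dim, num_orientations)

    def forward(self, x):
        fmap = self.backbone(x)               # the target feature map
        feat = self.pool(fmap).flatten(1)
        id_feat = self.feat_bottleneck(feat)  # appearance feature branch
        orient_prob = self.orient_head(self.orient_bottleneck(feat)).softmax(1)
        return id_feat, self.id_head(id_feat), orient_prob
```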
To realize the orientation estimation function, the pedestrian re-identification model in the embodiment of the present invention first predicts the probabilities of the target to be tracked in the four preset orientations, and then determines the actual orientation of the target to be tracked based on those probabilities.
In an alternative embodiment, as shown in fig. 7, step S302 includes the steps of:
step S701, calculating an angle difference between the estimated orientation angle and the orientation angle of the at least one tracked object.
After the estimated orientation angle of each target to be tracked is determined, the angle difference between each tracked target and each target to be tracked is calculated in turn. For example, if the orientation angles of 3 tracked targets are 9°, 100° and 200° respectively, the angle differences between the orientation angles of the 3 tracked targets and the estimated orientation angle of a first target to be tracked are 1°, 91° and 191° respectively, and the angle differences between the orientation angles of the 3 tracked targets and the estimated orientation angle of a second target to be tracked are 70°, 20° and 180° respectively.
In step S702, a feature similarity score between each target to be tracked and at least one tracked target is calculated.
According to the embodiment of the invention, the apparent features of each target to be tracked in the image to be identified are first extracted; the feature distance between the target to be tracked and the tracked target is then calculated from the apparent features of both; and finally the feature similarity score is determined from the feature distance, where a larger feature distance yields a lower similarity score.
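As a small illustration of that last mapping, the sketch below computes a similarity score that decreases as the feature distance grows. The Euclidean metric and the reciprocal mapping are assumptions; the patent does not fix a specific distance or mapping:

```python
import math

def feature_similarity(feat_a, feat_b):
    """Similarity score from a feature distance: larger distance, lower score."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(feat_a, feat_b)))
    return 1.0 / (1.0 + dist)  # monotonically decreasing in the distance
```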
In step S703, a tracked target to which each target to be tracked belongs is determined based on the angle difference and the feature similarity score.
A specific embodiment of step S703 includes: calculating the product of the angle difference and the feature similarity score, and determining the tracked target to which each target to be tracked belongs based on the product result.
According to the embodiment of the invention, an angle difference in a first preset range can be mapped to a value in a second preset range. For example, an angle difference in the range of 0° to 180° may be mapped to a value in the range of 0.9 to 1.1.
According to the embodiment of the invention, the mapped angle difference is used as a penalty factor and multiplied by the feature similarity score, and the tracked target to which each target to be tracked belongs can be determined from the product results. Multiplying the angle difference with the feature similarity score is equivalent to re-ranking the apparent features of the targets to be tracked: the feature distance between a target to be tracked and a tracked target whose orientations differ greatly is enlarged, so their similarity decreases; conversely, the feature distance to a tracked target whose orientation differs little shrinks, so their similarity increases. Multiplying the angle difference by the feature similarity score reduces the number of mismatches in the multi-target tracking task and improves tracking precision.
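A sketch of this re-ranking step follows. The text only specifies the two ranges, so the direction of the mapping is an assumption here: small angle differences receive the larger factor, so that the tracked target with the maximum product wins, consistent with the selection rule described below:

```python
def reorder_and_match(angle_diffs, similarity_scores):
    """Re-rank candidate tracked targets with an orientation penalty factor."""
    products = []
    for diff, score in zip(angle_diffs, similarity_scores):
        # Map the angle difference from [0, 180] degrees onto [0.9, 1.1];
        # small differences boost the similarity, large ones suppress it.
        factor = 1.1 - 0.2 * (min(abs(diff), 180.0) / 180.0)
        products.append(factor * score)
    # Index of the tracked target with the maximum product.
    return max(range(len(products)), key=lambda i: products[i])
```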
In an alternative embodiment, there may be multiple tracked targets. In that case, determining in step S703 the tracked target to which each target to be tracked belongs based on the product results includes: step 1, determining the tracked target corresponding to the maximum among the multiple product results; and step 2, determining that tracked target as the tracked target to which the target to be tracked belongs.
To alleviate the problem that existing multi-target tracking methods easily associate different but similar-looking pedestrians into the same track by mistake, the embodiment of the present invention exploits the fact that the orientation information of a target barely changes between adjacent frames of a video: it determines the estimated orientation angle of each target to be tracked, re-ranks the feature distances using the estimated orientation angle, and filters out targets to be tracked whose orientation angles differ greatly from that of the tracked target, thereby reducing mismatches to a certain extent.
In the embodiment of the invention, the target re-identification model can estimate the probabilities of the target to be tracked in the four preset orientations of forward, backward, left and right. On the basis of these probabilities, an orientation-angle estimation strategy is designed: the N target probabilities are weight-summed with the angles of the N preset orientations to obtain the estimated orientation angle of each target to be tracked. Using the estimated orientation angle, the candidate matches can be re-ranked, lowering the similarity of targets to be tracked whose orientation differs greatly from the tracked target and raising the similarity of those whose orientation differs little, which reduces mismatches and improves tracking precision.
Embodiment 3:
the embodiment of the invention also provides a target tracking device which is mainly used for executing the target tracking method provided by the embodiment of the invention, and the target tracking device provided by the embodiment of the invention is specifically introduced below.
Fig. 8 is a schematic diagram of a target tracking apparatus according to an embodiment of the present invention. As shown in fig. 8, the object tracking device mainly includes the following modules: an acquisition module 11, configured to acquire an image to be identified; the image to be identified comprises a target to be tracked; the first determining module 12 is configured to process the image to be identified, and determine orientation information of each target to be tracked in the image to be identified; wherein the orientation information indicates an orientation of the target to be tracked; the second determining module 13 is configured to determine, according to the orientation information of each target to be tracked, a tracked target to which each target to be tracked belongs.
In the embodiment of the invention, the acquisition module 11 acquires the image to be identified, which contains targets to be tracked; the first determining module 12 then processes the image to be identified and determines the orientation information of each target to be tracked in it, where the orientation information indicates the orientation of the target to be tracked; finally, the second determining module 13 determines the tracked target to which each target to be tracked belongs according to the orientation information of each target to be tracked. Determining the tracked target in this way reduces the number of mismatches and improves tracking precision.
Optionally, the second determination module 13 comprises the following sub-modules: a first determination sub-module for determining an estimated orientation angle of each target to be tracked based on the orientation information; and the second determination submodule is used for determining a tracked target to which each target to be tracked belongs based on the estimated orientation angle.
Optionally, the orientation information comprises probabilities that the actual orientation of the target to be tracked is each of a plurality of preset orientations, and the first determining submodule comprises the following units: a selection unit, used to select the N largest probabilities among the probabilities of the plurality of preset orientations to obtain N target probabilities; and a first determining unit, used to determine the estimated orientation angle of each target to be tracked based on the N preset orientation angles corresponding to the N target probabilities.
The first determining unit is used to perform a weighted summation of the N target probabilities and the angles of the N preset orientations, and to determine the weighted summation result as the estimated orientation angle of each target to be tracked.
The first determining unit is further configured to: if the N preset orientations include a pair of opposite orientations, determine the angle of the preset orientation corresponding to the maximum of the N target probabilities as the estimated orientation angle of each target to be tracked.
Optionally, the plurality of preset orientations include: forward, left, right and backward; the angle corresponding to the forward direction includes a first angle and a second angle, where the first angle is greater than the second angle and the difference between the first angle and the second angle is 360 degrees. The device is further used for the following:
if the N preset orientations corresponding to the N target probabilities include the forward direction and the left direction, determining the forward angle among the N preset orientations to be the first angle; and if the N preset orientations corresponding to the N target probabilities include the forward direction and the right direction, determining the forward angle among the N preset orientations to be the second angle.
Optionally, the first determining module 12 is configured to process the image to be identified through the target re-identification model, so as to obtain orientation information of each target to be tracked in the image to be identified.
Optionally, the target re-identification model includes: at least one residual unit, a first bottleneck unit and a second bottleneck unit, where the output end of the at least one residual unit is connected to the input ends of the first bottleneck unit and the second bottleneck unit respectively.
The first determining module 12 is further configured to: process the image to be identified through the at least one residual unit to obtain a target feature map; and perform bottleneck processing on the target feature map through the second bottleneck unit to obtain the orientation information of each target to be tracked in the image to be identified, where the first bottleneck unit is used to perform bottleneck processing on the target feature map to obtain feature information of the image to be identified.
The second determining submodule includes: a first calculation unit, used to calculate the angle difference between the estimated orientation angle and the orientation angle of the at least one tracked target; a second calculation unit, used to calculate a feature similarity score between each target to be tracked and the at least one tracked target; and a second determining unit, used to determine the tracked target to which each target to be tracked belongs based on the angle difference and the feature similarity score.
The second determining unit is used to calculate the product of the angle difference and the feature similarity score, and to determine the tracked target to which each target to be tracked belongs based on the product result.
Optionally, there are a plurality of tracked targets, and the second determining unit is further configured to: determine the tracked target corresponding to the maximum among the multiple product results; and determine the tracked target corresponding to the maximum product result as the tracked target to which each target to be tracked belongs.
The device provided by the embodiment of the present invention has the same implementation principle and technical effects as those of the foregoing method embodiment, and for the sake of brevity, reference may be made to the corresponding content in the foregoing method embodiment where the device embodiment is not mentioned.
Further, the present embodiment also provides a computer readable storage medium, on which a computer program is stored, which when being executed by a processor, performs the steps of the method provided by the foregoing method embodiments.
The computer program product of the target tracking method, apparatus and system provided by the embodiments of the present invention includes a computer readable storage medium storing program code, and the instructions included in the program code may be used to execute the method described in the foregoing method embodiments; for specific implementation, refer to the method embodiments, which will not be repeated here.
In addition, in the description of embodiments of the present invention, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
In the description of the present invention, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the above embodiments are merely specific implementations of the present invention, intended to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that anyone familiar with the art may, within the technical scope disclosed by the present invention, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions of some of the technical features; such modifications, changes, or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (14)

1. A target tracking method, comprising:
acquiring an image to be identified, wherein the image to be identified comprises a target to be tracked;
processing the image to be identified, and determining orientation information of each target to be tracked in the image to be identified, wherein the orientation information indicates an orientation of the target to be tracked and comprises probabilities that the actual orientation of the target to be tracked is each of a plurality of preset orientations; and
determining a tracked target to which each target to be tracked belongs according to the orientation information of the target to be tracked;
wherein, in the process of determining the tracked target to which each target to be tracked belongs according to the orientation information of each target to be tracked, if a target to be tracked is determined to belong to a plurality of tracked targets according to its orientation information, the tracked target of the target to be tracked is determined according to similarity scores between the target to be tracked and the plurality of tracked targets.
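(Illustrative note, not part of the claim: the disambiguation step of claim 1 can be pictured as a simple selection by maximum similarity. The following Python sketch is a hypothetical illustration only; the function name and the similarity mapping from tracked-target identifiers to scores are assumptions, not elements of the claimed method.)

    # Hypothetical sketch of the tie-break in claim 1: when a target to be
    # tracked matches several tracked targets by orientation, keep the tracked
    # target with the highest similarity score. `similarity` is an assumed
    # dict mapping tracked-target id -> similarity score.
    def resolve_ambiguity(candidate_track_ids, similarity):
        return max(candidate_track_ids, key=lambda tid: similarity[tid])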
2. The method of claim 1, wherein determining the tracked target to which each target to be tracked belongs according to the orientation information of each target to be tracked comprises:
determining an estimated orientation angle of each target to be tracked based on the orientation information;
and determining the tracked target to which each target to be tracked belongs based on the estimated orientation angle.
3. The method of claim 2, wherein determining the estimated orientation angle of each target to be tracked based on the orientation information comprises:
selecting the N largest probabilities from the probabilities of the plurality of preset orientations to obtain N target probabilities; and
determining the estimated orientation angle of each target to be tracked based on the N preset orientation angles corresponding to the N target probabilities.
4. The method of claim 3, wherein determining the estimated orientation angle of each target to be tracked based on the N preset orientation angles corresponding to the N target probabilities comprises:
performing a weighted summation over the angles of the N preset orientations, weighted by the N target probabilities, and determining the result of the weighted summation as the estimated orientation angle of each target to be tracked.
5. The method of claim 3, wherein determining the estimated orientation angle of each target to be tracked based on the N preset orientation angles corresponding to the N target probabilities further comprises:
if the N preset orientations comprise opposite orientations, determining the angle of the preset orientation corresponding to the largest of the N target probabilities as the estimated orientation angle of each target to be tracked.
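(Illustrative note: claims 3 to 5 together describe selecting the N largest orientation probabilities and turning them into a single estimated angle. The Python sketch below is one possible reading under stated assumptions: the orientation names and angles, N = 2, and the renormalisation of the weights are all assumptions not fixed by the claims.)

    # Sketch of the estimation in claims 3-5. Assumptions: four orientations,
    # N = 2, and the top-N probabilities are renormalised before weighting.
    OPPOSITE_PAIRS = [("forward", "backward"), ("left", "right")]

    def estimate_orientation_angle(probs, angles, n=2):
        # probs:  dict orientation -> probability (the orientation information)
        # angles: dict orientation -> angle in degrees
        top = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:n]
        names = [name for name, _ in top]
        # Claim 5: if the N selected orientations contain opposite
        # orientations, fall back to the angle of the most probable one.
        for a, b in OPPOSITE_PAIRS:
            if a in names and b in names:
                return angles[names[0]]
        # Claim 4: weighted summation of the N angles using the N target
        # probabilities as weights (the renormalisation is an assumption).
        total = sum(p for _, p in top)
        return sum(p * angles[name] for name, p in top) / total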
6. The method of claim 3, wherein the plurality of preset orientations comprises: a forward orientation, a left orientation, a right orientation, and a backward orientation; the angle corresponding to the forward orientation comprises a first angle and a second angle, the first angle being greater than the second angle, and the difference between the first angle and the second angle being 360 degrees; and
the method further comprises:
if the N preset orientations corresponding to the N target probabilities comprise the forward orientation and the left orientation, determining the forward angle among the N preset orientations corresponding to the N target probabilities to be the first angle; and
if the N preset orientations corresponding to the N target probabilities comprise the forward orientation and the right orientation, determining the forward angle among the N preset orientations corresponding to the N target probabilities to be the second angle.
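(Illustrative note: the point of the two forward angles in claim 6 is that the forward orientation sits on the 0°/360° wrap-around of the angle circle. Choosing the larger forward angle next to the left orientation and the smaller one next to the right orientation keeps the weighted summation of claim 4 continuous: averaging 0° with a left angle of 270° would yield roughly 135°, far from the true front-left direction, whereas averaging 360° with 270° stays in the front-left range. The sketch below assumes first angle = 360° and second angle = 0°, which the claim permits but does not mandate.)

    # Sketch of claim 6's forward-angle selection. The concrete values
    # (360 and 0 degrees) are assumptions consistent with the claim's
    # requirement that the two forward angles differ by 360 degrees.
    FIRST_FORWARD_ANGLE = 360.0
    SECOND_FORWARD_ANGLE = 0.0

    def forward_angle(selected_orientations):
        if "left" in selected_orientations:
            return FIRST_FORWARD_ANGLE    # forward together with left
        if "right" in selected_orientations:
            return SECOND_FORWARD_ANGLE   # forward together with right
        return SECOND_FORWARD_ANGLE       # otherwise either value serves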
7. The method of claim 1, wherein processing the image to be identified and determining the orientation information of each target to be tracked in the image to be identified comprises:
processing the image to be identified through a target re-identification model to obtain the orientation information of each target to be tracked in the image to be identified.
8. The method of claim 7, wherein the target re-identification model comprises: at least one residual unit, a first bottleneck unit, and a second bottleneck unit;
an output of the at least one residual unit is connected to inputs of the first bottleneck unit and the second bottleneck unit, respectively; and
processing the image to be identified through the target re-identification model to obtain the orientation information of each target to be tracked in the image to be identified comprises:
processing the image to be identified through the at least one residual unit to obtain a target feature map; and
performing bottleneck processing on the target feature map through the second bottleneck unit to obtain the orientation information of each target to be tracked in the image to be identified;
wherein the first bottleneck unit is configured to perform bottleneck processing on the target feature map to obtain feature information of the image to be identified.
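(Illustrative note: claim 8 fixes only the topology of the target re-identification model: residual units feeding two parallel bottleneck heads, one producing appearance features and one producing orientation information. The PyTorch sketch below is a minimal realisation under assumptions; the layer widths, the number of residual units, and the use of plain linear layers as the bottleneck units are choices of this sketch, not of the claim.)

    # Minimal PyTorch sketch of the two-headed re-identification model of
    # claim 8. All layer sizes are assumed values.
    import torch
    import torch.nn as nn

    class ResidualUnit(nn.Module):
        def __init__(self, channels):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.BatchNorm2d(channels),
            )

        def forward(self, x):
            return torch.relu(x + self.body(x))  # identity shortcut

    class ReIDModel(nn.Module):
        def __init__(self, channels=64, feat_dim=128, num_orientations=4):
            super().__init__()
            self.stem = nn.Conv2d(3, channels, 3, padding=1)
            self.residual = nn.Sequential(ResidualUnit(channels),
                                          ResidualUnit(channels))
            self.pool = nn.AdaptiveAvgPool2d(1)
            # First bottleneck head: feature information of the image.
            self.feature_head = nn.Linear(channels, feat_dim)
            # Second bottleneck head: orientation probabilities.
            self.orientation_head = nn.Linear(channels, num_orientations)

        def forward(self, x):
            fmap = self.residual(self.stem(x))   # the target feature map
            v = self.pool(fmap).flatten(1)
            features = self.feature_head(v)
            orientation = self.orientation_head(v).softmax(dim=1)
            return features, orientation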
9. The method of claim 2, wherein determining the tracked target to which each target to be tracked belongs based on the estimated orientation angle comprises:
calculating an angle difference between the estimated orientation angle and the orientation angle of at least one tracked target;
calculating a feature similarity score between each target to be tracked and the at least one tracked target; and
determining the tracked target to which each target to be tracked belongs based on the angle difference and the feature similarity score.
10. The method of claim 9, wherein determining the tracked target to which each target to be tracked belongs based on the angle difference and the feature similarity score comprises:
calculating the product of the angle difference and the feature similarity score, and determining the tracked target to which each target to be tracked belongs based on the calculated product.
11. The method of claim 10, wherein there are a plurality of tracked targets; and
determining the tracked target to which each target to be tracked belongs based on the calculated product comprises:
determining the tracked target corresponding to the largest of the plurality of calculated products; and
determining the tracked target corresponding to the largest calculated product as the tracked target to which each target to be tracked belongs.
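(Illustrative note: claims 9 to 11 combine an angle difference with a feature similarity score and pick the tracked target with the largest product. Since a larger raw angle difference should count against a match, the sketch below first maps the difference to an agreement score in [0, 1]; this cosine mapping is an assumption, as the claims do not specify how the angle difference enters the product.)

    # Sketch of the matching in claims 9-11. The cosine-based agreement score
    # is an assumed mapping from angle difference to a multiplicative weight.
    import math

    def angle_agreement(delta_degrees):
        d = math.radians(delta_degrees % 360.0)
        return (1.0 + math.cos(d)) / 2.0   # 1 when aligned, 0 when opposite

    def assign_track(estimated_angle, feature_scores, track_angles):
        # feature_scores / track_angles: dicts tracked-target id -> score / angle
        products = {
            tid: angle_agreement(estimated_angle - track_angles[tid]) * feature_scores[tid]
            for tid in track_angles
        }
        # Claim 11: the tracked target with the largest product is chosen.
        return max(products, key=products.get)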
12. A target tracking device, comprising:
an acquisition module configured to acquire an image to be identified, wherein the image to be identified comprises a target to be tracked;
a first determining module configured to process the image to be identified and determine orientation information of each target to be tracked in the image to be identified, wherein the orientation information indicates an orientation of the target to be tracked and comprises probabilities that the actual orientation of the target to be tracked is each of a plurality of preset orientations; and
a second determining module configured to determine a tracked target to which each target to be tracked belongs according to the orientation information of the target to be tracked, wherein, in the process of determining the tracked target to which each target to be tracked belongs according to the orientation information, if a target to be tracked is determined to belong to a plurality of tracked targets, the tracked target of the target to be tracked is determined according to similarity scores between the target to be tracked and the plurality of tracked targets.
13. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method of any one of claims 1 to 11 when executing the computer program.
14. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the method of any one of claims 1 to 11.
CN202010210615.2A 2020-03-23 2020-03-23 Target tracking method and device and electronic equipment Active CN111401285B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010210615.2A CN111401285B (en) 2020-03-23 2020-03-23 Target tracking method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010210615.2A CN111401285B (en) 2020-03-23 2020-03-23 Target tracking method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN111401285A CN111401285A (en) 2020-07-10
CN111401285B true CN111401285B (en) 2024-02-23

Family

ID=71434393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010210615.2A Active CN111401285B (en) 2020-03-23 2020-03-23 Target tracking method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111401285B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085769A (en) * 2020-09-09 2020-12-15 武汉融氢科技有限公司 Object tracking method and device and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105574894A (en) * 2015-12-21 2016-05-11 零度智控(北京)智能科技有限公司 Method and system for screening moving object feature point tracking results
CN108230284A (en) * 2016-12-14 2018-06-29 深圳先进技术研究院 A kind of movement locus determines method and device
CN108446585A (en) * 2018-01-31 2018-08-24 深圳市阿西莫夫科技有限公司 Method for tracking target, device, computer equipment and storage medium
CN108629791A (en) * 2017-03-17 2018-10-09 北京旷视科技有限公司 Pedestrian tracting method and device and across camera pedestrian tracting method and device
CN110443829A (en) * 2019-08-05 2019-11-12 北京深醒科技有限公司 It is a kind of that track algorithm is blocked based on motion feature and the anti-of similarity feature
CN110516556A (en) * 2019-07-31 2019-11-29 平安科技(深圳)有限公司 Multi-target tracking detection method, device and storage medium based on Darkflow-DeepSort

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106033615B (en) * 2016-05-16 2017-09-15 北京旷视科技有限公司 Destination object motion direction detecting method and device
US11751947B2 (en) * 2017-05-30 2023-09-12 Brainlab Ag Soft tissue tracking using physiologic volume rendering
CN108090916B (en) * 2017-12-21 2019-05-07 百度在线网络技术(北京)有限公司 Method and apparatus for tracking the targeted graphical in video

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105574894A (en) * 2015-12-21 2016-05-11 零度智控(北京)智能科技有限公司 Method and system for screening moving object feature point tracking results
CN108230284A (en) * 2016-12-14 2018-06-29 深圳先进技术研究院 A kind of movement locus determines method and device
CN108629791A (en) * 2017-03-17 2018-10-09 北京旷视科技有限公司 Pedestrian tracting method and device and across camera pedestrian tracting method and device
CN108446585A (en) * 2018-01-31 2018-08-24 深圳市阿西莫夫科技有限公司 Method for tracking target, device, computer equipment and storage medium
CN110516556A (en) * 2019-07-31 2019-11-29 平安科技(深圳)有限公司 Multi-target tracking detection method, device and storage medium based on Darkflow-DeepSort
CN110443829A (en) * 2019-08-05 2019-11-12 北京深醒科技有限公司 It is a kind of that track algorithm is blocked based on motion feature and the anti-of similarity feature

Also Published As

Publication number Publication date
CN111401285A (en) 2020-07-10

Similar Documents

Publication Publication Date Title
US20220383535A1 (en) Object Tracking Method and Device, Electronic Device, and Computer-Readable Storage Medium
WO2022156640A1 (en) Gaze correction method and apparatus for image, electronic device, computer-readable storage medium, and computer program product
CN109344899B (en) Multi-target detection method and device and electronic equipment
CN111242222B (en) Classification model training method, image processing method and device
CN111104925B (en) Image processing method, image processing apparatus, storage medium, and electronic device
JP7297989B2 (en) Face biometric detection method, device, electronic device and storage medium
CN108876758B (en) Face recognition method, device and system
CN112381071A (en) Behavior analysis method of target in video stream, terminal device and medium
WO2021208373A1 (en) Image identification method and apparatus, and electronic device and computer-readable storage medium
CN111951192A (en) Shot image processing method and shooting equipment
CN111401285B (en) Target tracking method and device and electronic equipment
JP2022519398A (en) Image processing methods, equipment and electronic devices
WO2024022301A1 (en) Visual angle path acquisition method and apparatus, and electronic device and medium
CN112418153B (en) Image processing method, device, electronic equipment and computer storage medium
CN110956131B (en) Single-target tracking method, device and system
CN110717441B (en) Video target detection method, device, equipment and medium
CN111784750A (en) Method, device and equipment for tracking moving object in video image and storage medium
CN114461078B (en) Man-machine interaction method based on artificial intelligence
CN115937950A (en) Multi-angle face data acquisition method, device, equipment and storage medium
CN112184766B (en) Object tracking method and device, computer equipment and storage medium
CN114565777A (en) Data processing method and device
CN113065523B (en) Target tracking method and device, electronic equipment and storage medium
CN111986230A (en) Method and device for tracking posture of target object in video
CN111369425A (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN111199179A (en) Target object tracking method, terminal device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant