CN113674309B - Method, device, management platform and storage medium for object tracking - Google Patents

Method, device, management platform and storage medium for object tracking

Info

Publication number
CN113674309B
CN113674309B (application CN202010407982.1A)
Authority
CN
China
Prior art keywords
tracked
feature
data
feature data
human body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010407982.1A
Other languages
Chinese (zh)
Other versions
CN113674309A (en)
Inventor
曹中胜
孟凡旗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Hangzhou Hikvision System Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision System Technology Co Ltd filed Critical Hangzhou Hikvision System Technology Co Ltd
Priority to CN202010407982.1A priority Critical patent/CN113674309B/en
Publication of CN113674309A publication Critical patent/CN113674309A/en
Application granted granted Critical
Publication of CN113674309B publication Critical patent/CN113674309B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face
    • G06T2207/30232 Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a method, a device, a management platform and a storage medium for object tracking, and belongs to the technical field of video monitoring. The method comprises the following steps: acquiring feature data of a first feature of an object to be tracked, and taking the feature data as tracking reference data of the object to be tracked; acquiring monitoring data, determining feature data of a second feature of the object to be tracked based on the monitoring data and the tracking reference data of the object to be tracked, and updating the tracking reference data of the object to be tracked to the feature data of the first feature and the feature data of the second feature; and tracking the object to be tracked based on the updated tracking reference data of the object to be tracked. By adopting the method and the device, the problem of tracking failure caused by a single feature being occluded can be avoided as far as possible.

Description

Method, device, management platform and storage medium for object tracking
Technical Field
The present invention relates to the field of monitoring technologies, and in particular, to a method, an apparatus, a management platform, and a storage medium for object tracking.
Background
With the increasing demand for security, monitoring devices are deployed throughout various areas of a city and play an important role in the city's security system. A specified object can be tracked through the monitoring data captured by monitoring devices deployed at different points.
At present, tracking an object through monitoring data captured by monitoring devices generally requires security personnel to input a face image of the object to be tracked and to designate an initial tracking location on a management platform. The management platform acquires the monitoring data captured by the monitoring device corresponding to the initial tracking location and performs face recognition on the monitoring data to obtain face images to be compared. The identified face images to be compared are then matched against the input face image of the object to be tracked, and a face image to be compared whose similarity meets a preset threshold is determined to be the face image of the object to be tracked, which indicates that the monitoring device has captured the object to be tracked. The process can then be repeated on the monitoring data of the next monitoring device along the estimated path, so as to track the object to be tracked.
In carrying out the present application, the applicant has found that the related art has at least the following problems:
when an object to be tracked is tracked through a face image, if the face of the object to be tracked is blocked, no face image to be compared may satisfy the similarity condition, which causes tracking to fail.
Disclosure of Invention
The embodiments of the present application provide a method, a device, a management platform and a storage medium for object tracking, which can solve the problem in the related art of tracking failure caused by the face being blocked. The technical solution is as follows:
in a first aspect, there is provided a method of object tracking, the method comprising:
acquiring feature data of a first feature of an object to be tracked, and taking the feature data as tracking reference data of the object to be tracked;
acquiring monitoring data, determining feature data of a second feature of the object to be tracked based on the monitoring data and tracking reference data of the object to be tracked, and updating the tracking reference data of the object to be tracked into the feature data of the first feature and the feature data of the second feature;
and tracking the object to be tracked based on the updated tracking reference data of the object to be tracked.
In a possible implementation manner, the determining feature data of the second feature of the object to be tracked based on the monitoring data and tracking reference data of the object to be tracked includes:
identifying the characteristic data of the first characteristic contained in the monitoring data as characteristic data to be compared of the first characteristic;
Determining feature data to be compared of a first feature with similarity to the tracking reference data being larger than a preset threshold value, and taking the feature data to be compared as target feature data;
and acquiring the characteristic data of the second characteristic of the object with the target characteristic data from the monitoring data as the characteristic data of the second characteristic of the object to be tracked.
In one possible implementation, the first feature is one or two of a face, a human body, and a gait, and the second feature is at least one of a face, a human body, and a gait other than the first feature.
In one possible implementation manner, the acquiring feature data of the second feature of the object having the target feature data, as feature data of the second feature of the object to be tracked, includes:
according to the monitoring data, performing behavior analysis on the object with the target characteristic data, and if the object with the target characteristic data is analyzed to have the behavior of riding the target vehicle, acquiring an image of the target vehicle;
and acquiring vehicle characteristic data in the image of the target vehicle as characteristic data of the second characteristic of the object to be tracked.
In one possible implementation manner, the tracking the object to be tracked based on the updated tracking reference data of the object to be tracked includes:
determining a track of the target vehicle, and analyzing monitoring data of monitoring devices located after a first monitoring device along the track, wherein the first monitoring device is the monitoring device to which the monitoring data, based on which it is analyzed that the object having the target feature data has the behavior of riding the target vehicle, belongs;
and if the object to be tracked is not in the target vehicle, continuing to track the object to be tracked by taking the position of a second monitoring device as an initial tracking position according to the rest characteristic data except the vehicle characteristic data in the tracking reference data of the object to be tracked, wherein the second monitoring device is the monitoring device to which the monitoring data for determining that the object to be tracked is not in the target vehicle belongs.
In a second aspect, there is provided an apparatus for object tracking, the apparatus comprising:
the acquisition module is used for acquiring the characteristic data of the first characteristic of the object to be tracked and taking the characteristic data as tracking reference data of the object to be tracked;
The updating module is used for acquiring monitoring data, determining the characteristic data of the second characteristic of the object to be tracked based on the monitoring data and the tracking reference data of the object to be tracked, and updating the tracking reference data of the object to be tracked into the characteristic data of the first characteristic and the characteristic data of the second characteristic;
and the tracking module is used for tracking the object to be tracked based on the updated tracking reference data of the object to be tracked.
In one possible implementation manner, the updating module is configured to:
identifying the characteristic data of the first characteristic contained in the monitoring data as characteristic data to be compared of the first characteristic;
determining feature data to be compared of a first feature with similarity to the tracking reference data being larger than a preset threshold value, and taking the feature data to be compared as target feature data;
and acquiring the characteristic data of the second characteristic of the object with the target characteristic data from the monitoring data as the characteristic data of the second characteristic of the object to be tracked.
In one possible implementation, the first feature is one or two of a face, a human body, and a gait, and the second feature is at least one of a face, a human body, and a gait other than the first feature.
In one possible implementation manner, the acquiring feature data of the second feature of the object having the target feature data, as feature data of the second feature of the object to be tracked, includes:
according to the monitoring data, performing behavior analysis on the object with the target characteristic data, and if the object with the target characteristic data is analyzed to have the behavior of riding the target vehicle, acquiring an image of the target vehicle;
and acquiring vehicle characteristic data in the image of the target vehicle as characteristic data of the second characteristic of the object to be tracked.
In one possible implementation manner, the updating module is configured to:
determining a track of the target vehicle, and analyzing monitoring data of monitoring devices located after a first monitoring device along the track, wherein the first monitoring device is the monitoring device to which the monitoring data, based on which it is analyzed that the object having the target feature data has the behavior of riding the target vehicle, belongs;
and if the object to be tracked is not in the target vehicle, continuing to track the object to be tracked by taking the position of a second monitoring device as an initial tracking position according to the rest characteristic data except the vehicle characteristic data in the tracking reference data of the object to be tracked, wherein the second monitoring device is the monitoring device to which the monitoring data for determining that the object to be tracked is not in the target vehicle belongs.
In a third aspect, a management platform is provided, the management platform comprising a processor and a memory, the memory having stored therein at least one instruction that is loaded and executed by the processor to implement the method of object tracking as described in the first aspect above.
In a fourth aspect, there is provided a computer readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the method of object tracking as described in the first aspect above.
The beneficial effects brought by the technical solutions provided in the embodiments of the present application include at least the following:
In the embodiments of the present application, the feature data of the second feature of the object to be tracked may be determined according to the feature data of the first feature of the object to be tracked in the monitoring data. The object to be tracked is then tracked jointly according to the feature data of the first feature and the feature data of the second feature. In this way, the problem of tracking failure due to a single feature being occluded can be avoided.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for object tracking provided in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an apparatus for object tracking according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a management platform according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The embodiment of the application provides an object tracking method which can be realized by a management platform. The management platform may be a computer device deployed on the management side. The user can input a face image, a body image, and the like of an object to be tracked, which needs to be tracked, to the management platform. The management platform can acquire the monitoring data of the monitoring equipment, analyze the monitoring data and identify the object to be tracked in the monitoring data. According to the object tracking method provided by the embodiment of the application, in the process of tracking the object to be tracked, the tracking condition can be supplemented so as to improve the tracking accuracy of the object to be tracked.
As shown in fig. 1, the process flow of the method for object tracking may include the following steps:
Step 101, feature data of a first feature of an object to be tracked is obtained and used as tracking reference data of the object to be tracked.
The object to be tracked may be a person, and the first feature of the object to be tracked may include only one feature or may include multiple features. In the case where only one feature is included, the first feature may be a face, a human body, or the like; in the case where multiple features are included, the first feature may be, for example, a face and a human body, or a human body and a gait.
In an implementation, when the security manager needs to track the object to be tracked, the relevant information of the first feature of the object to be tracked may be input to the management platform. The management platform can acquire feature data of the first feature according to the related information of the first feature. In the following, description will be made of the feature data of the first feature obtained from the related information of the first feature, with respect to the different cases of the first feature.
Case one, the first feature includes only one feature case.
In this case, the first feature may be a human face or a human body. Accordingly, the related information of the first feature may be an image of the first feature or feature data of the first feature.
When the related information of the first feature is an image of the first feature, for example, a face image or a human body image, the management device may perform feature extraction on the image of the first feature, for example, the face image or the human body image, to obtain feature data of the first feature, for example, face feature data or human body feature data. The feature data of the first feature may be a three-dimensional model of the face when the first feature is the face.
Case two, case where the first feature includes multiple features.
The first feature may be a human face and a human body, or a human body and gait. Correspondingly, when the first feature is a human face and a human body, the related information of the first feature may be a human face image and a human body image, or human face feature data and human body feature data. When the first feature is a human body and gait, the related information of the first feature may be a human body image and gait feature data, or human body feature data and gait feature data.
When the related information of the first feature is a human face image and a human body image, the management device can perform feature extraction on the human face image and the human body image to obtain human face feature data and human body feature data. When the related information of the first feature is a human body image and gait feature data, the management device can perform feature extraction on the human body image to obtain human body feature data, which together with the gait feature data constitutes the feature data of the first feature.
After the feature data of the first feature is obtained, the feature data of the first feature may be used as tracking reference data of the object to be tracked.
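For illustration only, the following Python sketch shows one way Step 101 could be organized. The feature extractors and the dictionary layout of the tracking reference data are assumptions of the example and are not prescribed by this embodiment.

```python
import numpy as np

def build_tracking_reference(first_feature_inputs, extractors):
    """Step 101 (illustrative): turn the operator-supplied information about the
    first feature (images or ready-made feature data) into tracking reference data.

    first_feature_inputs: dict such as {"face": face_image, "body": body_image}
                          or {"gait": gait_feature_vector}; values may already be
                          feature vectors, in which case they are used directly.
    extractors:           dict mapping feature name -> feature-extraction function.
    """
    tracking_reference = {}
    for name, value in first_feature_inputs.items():
        if isinstance(value, np.ndarray) and value.ndim == 1:
            # The operator supplied feature data directly (e.g. gait features).
            tracking_reference[name] = value
        else:
            # The operator supplied an image; run the corresponding extractor.
            tracking_reference[name] = extractors[name](value)
    return tracking_reference
```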
Step 102, acquiring monitoring data, determining feature data of a second feature of the object to be tracked based on the monitoring data and tracking reference data of the object to be tracked, and updating the tracking reference data of the object to be tracked into the feature data of the first feature and the feature data of the second feature.
Wherein the first feature is one or both of a face, a human body, and a gait, and the second feature is at least one of a face, a human body, and a gait in addition to the first feature. The following is an illustration of the possible cases for the first feature and the second feature:
if the first feature is a human face, the second feature may be a human body, or a human body and gait. If the first feature is a human body, the second feature may be a human face, or a gait, or both a human face and a gait. If the first feature is gait, the second feature may be a human face, or a human body, or both. If the first feature is a human body and gait, the second feature may be a human face. The second feature may be gait if the first feature is a human face and a human body. The second feature may be a human body if the first feature is a human face and gait.
In practice, the manager may also specify the tracking start point and the tracking time when tracking the object. The management platform can acquire monitoring data from the monitoring equipment corresponding to the tracking start point.
Optionally, determining feature data of the second feature of the object to be tracked based on the monitoring data and tracking reference data of the object to be tracked includes:
after the monitoring data is acquired, the feature data of the first feature contained in the monitoring data can be identified and used as the feature data to be compared of the first feature. And determining feature data to be compared of the first feature with the similarity with the tracking reference data being larger than a preset threshold value as target feature data. Then, feature data of the second feature of the object having the target feature data is acquired from the monitoring data as feature data of the second feature of the object to be tracked.
The method of determining the feature data of the second feature of the object to be tracked is described below for a plurality of different cases of the first feature and the second feature, respectively.
In case one, the first feature is a human face and the second feature is a human body, or a human body and gait.
Firstly, carrying out face recognition on the monitoring images in the acquired monitoring data, and recognizing face images in the monitoring images as face images to be compared. In addition, when face recognition is performed, besides the face image, face posture information in the face image can be determined, and the face posture information can be a deflection angle relative to the front face.
Then, for each face image to be compared, face comparison can be performed with tracking reference data of the object to be tracked. The comparison method can be as follows:
and extracting the characteristics of each face image to be compared to obtain face characteristic data to be compared. And then, acquiring a human face three-dimensional model of the object to be tracked from tracking reference data of the object to be tracked. And acquiring face characteristic data corresponding to the face posture information in a face three-dimensional model according to the face posture information corresponding to the face images to be compared, and taking the face characteristic data as reference face characteristic data. The face three-dimensional model is rotated to a deflection angle corresponding to the face posture information, and then plane projection is carried out to obtain face characteristic data corresponding to the face posture information, wherein the face characteristic data is used as reference face characteristic data. And then, carrying out similarity calculation on the face feature data to be compared and the reference face feature data, and if the similarity is larger than a face similarity threshold value, determining that the face image to be compared is the face image of the object to be tracked. Here, the similarity calculation may be calculating a euclidean distance between the face feature data to be compared and the reference face feature data, and the face similarity threshold may be 90%.
While face recognition is performed on the monitoring images in the monitoring data, human body recognition can also be performed on the monitoring images, and the human body images in each of the monitoring images are identified. After the face image to be compared is determined to be the face image of the object to be tracked, a target human body image which belongs to the same object as the face image to be compared can be determined in the monitoring image where the face image to be compared is located, and the target human body image may be determined as the human body image of the object to be tracked.
Finally, feature extraction can be performed on the target human body image to obtain human body feature data corresponding to the target human body image, wherein the human body feature data is used as feature data of a second feature of the object to be tracked.
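The description does not fix how a face detection and a body detection are judged to belong to the same object; a common heuristic, shown here only as an assumption, is to require that most of the face bounding box lie inside the body bounding box.

```python
def face_belongs_to_body(face_box, body_box, min_overlap=0.8):
    """Illustrative same-object test: treat a detected face and a detected body as
    belonging to the same object when most of the face box lies inside the body box.
    Boxes are (x1, y1, x2, y2); the heuristic and threshold are assumptions.
    """
    fx1, fy1, fx2, fy2 = face_box
    bx1, by1, bx2, by2 = body_box
    ix1, iy1 = max(fx1, bx1), max(fy1, by1)
    ix2, iy2 = min(fx2, bx2), min(fy2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    face_area = max(1e-6, (fx2 - fx1) * (fy2 - fy1))
    return inter / face_area >= min_overlap
```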
In addition, after the monitoring data are acquired, gait modeling can be performed on the monitoring data, and gait characteristic data of each object in the monitoring data are determined. After the target human body image is determined, whether the object corresponding to the target human body image corresponds to gait feature data or not can be judged, and if so, the gait feature data and the determined human body feature data can be used as feature data of the second feature together.
In one possible implementation manner, when the face image in each monitoring image is identified, a score of the face image may also be obtained. Correspondingly, after the face image to be compared is determined to be the face image of the object to be tracked, if the score of the face image to be compared is higher than a preset score threshold, the face feature data of the face image to be compared and the reference face feature data determined in the face three-dimensional model can be averaged, and the average is used for replacing the reference face feature data determined in the face three-dimensional model.
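A minimal sketch of this refresh step follows; the score scale and the threshold value are assumptions of the example, not values fixed by the embodiment.

```python
import numpy as np

def refresh_reference_face(reference_feat, matched_feat, face_quality_score,
                           score_threshold=0.8):
    """Illustrative refresh of the reference face feature data: when the matched face
    image scores highly enough, average it with the reference features obtained from
    the 3-D model and use the average in place of those reference features."""
    if face_quality_score <= score_threshold:
        return reference_feat
    return 0.5 * (np.asarray(reference_feat, float) + np.asarray(matched_feat, float))
```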
In the second case, the first feature is a human body, and the second feature is a human face, or gait, or both a human face and gait.
First, human body recognition is performed on the monitoring images in the acquired monitoring data, and human body images in the monitoring images are recognized and used as human body images to be compared.
Then, for each human body image to be compared, human body comparison can be performed with tracking reference data of the object to be tracked. The comparison method can be as follows:
and extracting the characteristics of each human body image to be compared to obtain the human body characteristic data to be compared. Then, human body characteristic data of the object to be tracked is obtained from tracking reference data of the object to be tracked and is used as the reference human body characteristic data. And then, carrying out similarity calculation on the human body characteristic data to be compared and the reference human body characteristic data, and if the similarity is larger than a first human body similarity threshold value, determining that the human body image to be compared is the human body image of the object to be tracked. Here, the similarity calculation may be calculating a euclidean distance or the like of the human body characteristic data to be compared with the reference human body characteristic data, and the first human body similarity threshold may be 90%.
While human body recognition is performed on the monitoring images in the monitoring data, face recognition can also be performed on the monitoring images, and the face images in each of the monitoring images are identified. After the human body image to be compared is determined to be the human body image of the object to be tracked, a target human face image which belongs to the same object as the human body image to be compared can be determined in the monitoring image where the human body image to be compared is located, and the target face image may be determined as the face image of the object to be tracked.
Finally, feature extraction can be performed on the target face image to obtain face feature data corresponding to the target face image, wherein the face feature data is used as feature data of a second feature of the object to be tracked. In addition, after the monitoring data are acquired, gait modeling can be performed on the monitoring data, and gait characteristic data of each object in the monitoring data are determined. After the human body image or the target face image of the object to be tracked is determined, whether the human body image or the target face image of the object to be tracked corresponds to gait feature data or not can be judged, if so, the gait feature data can be independently used as the feature data of the second feature, or the gait feature data and the determined face feature data are used as the feature data of the second feature together.
In the third case, the first feature is a human face and a human body, and the second feature is gait.
First, a face image of an object to be tracked may be determined in the monitored image according to the method of the first case, and a human body image of the object to be tracked may be determined in the monitored image according to the method of the second case. Meanwhile, gait modeling can be performed on the monitoring data, and gait characteristic data of each object in the monitoring data can be determined.
If the determined face image of the object to be tracked is the same as the object corresponding to the human body image of the object to be tracked, the determined face image of the object to be tracked or gait feature data corresponding to the object to which the human body image of the object to be tracked belongs can be used as the feature data of the second feature of the object to be tracked.
If the determined face image of the object to be tracked is different from the object corresponding to the human body image of the object to be tracked, taking into consideration that the accuracy of the face to the object recognition is higher, gait feature data corresponding to the object to which the determined face image of the object to be tracked belongs can be used as feature data of the second feature of the object to be tracked.
If only the face image of the object to be tracked or the human body image of the object to be tracked is determined, the determined face image of the object to be tracked or gait feature data corresponding to the object to which the human body image of the object to be tracked belongs can be used as the feature data of the second feature of the object to be tracked.
In the fourth case, the first feature is a human body and gait, and the second feature is a human face.
First, a human body image of an object to be tracked may be determined in the monitored image according to the method of the second case. And acquiring gait feature data in the tracking reference data as reference gait feature data. And determining target gait feature data with the similarity meeting the gait similarity threshold value with the reference gait feature data from the gait feature data corresponding to each object.
If the determined human body image of the object to be tracked is the same as the object to which the target gait feature data belongs, a target human face image corresponding to the determined human body image of the object to be tracked or the object to which the target gait feature data belongs can be obtained, and feature extraction is performed on the target human face image to obtain human face feature data corresponding to the target human face image, wherein the human face feature data is used as feature data of the second feature of the object to be tracked.
If the determined human body image of the object to be tracked is different from the object to which the target gait feature data belongs, the human face image corresponding to the object to which the target gait feature data belongs can be used as a target human face image, and the human face feature data corresponding to the target human face image can be used as feature data of a second feature of the object to be tracked in consideration of higher accuracy of gait to object identification compared with human body identification.
After obtaining the feature data of the second feature of the object to be tracked, the tracking reference data of the object to be tracked can be updated into the feature data of the first feature and the feature data of the second feature.
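For illustration only, this update can be represented as a simple merge of feature dictionaries; the dictionary layout follows the earlier sketches and is an assumption of the example, not part of the claimed method.

```python
def update_tracking_reference(tracking_reference, second_feature_data):
    """Illustrative update: the tracking reference data becomes the union of the
    first-feature data and the newly determined second-feature data."""
    updated = dict(tracking_reference)
    updated.update(second_feature_data)  # e.g. {"body": vec} or {"body": vec, "gait": vec}
    return updated
```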
In the fifth case, the first feature is gait, and the second feature is a human body, or a human face, or a human face and a human body.
First, gait modeling is performed on the monitoring data, and gait feature data corresponding to each object in the monitoring data are obtained. Gait feature data in the tracking reference data are acquired as reference gait feature data. And determining target gait feature data with the similarity meeting the gait similarity threshold value with the reference gait feature data from the gait feature data corresponding to each object.
Then, human body identification can be performed on the monitoring image in the monitoring data, a target human body image corresponding to the object to which the target gait feature data belongs in the monitoring data is obtained, and feature extraction is performed on the target human body image, so that human body feature data corresponding to the target human body image is obtained and used as feature data of the second feature of the object to be tracked.
Alternatively, while human body recognition is performed on the monitoring images in the monitoring data, face recognition can also be performed on the monitoring images; the target face image corresponding to the object to which the target gait feature data belongs is obtained in the monitoring data, and feature extraction is performed on the target face image to obtain the face feature data corresponding to the target face image as the feature data of the second feature of the object to be tracked.
And when the monitoring image in the monitoring data is subjected to human body recognition, the monitoring image can be subjected to human face recognition, and if a target human face image belonging to the same object as the target human body image exists in the recognized human face image, the target human face image can be subjected to feature extraction to obtain the human face feature data corresponding to the target human face image. Then, the human body feature data corresponding to the target human body image and the human face feature data corresponding to the target human face image can be used as feature data of the second feature of the object to be tracked.
In case six, the first feature is a human face and gait, and the second feature may be a human body.
Firstly, a face image of an object to be tracked can be determined in a monitoring image according to the method of the first case, and target gait feature data can be obtained according to the method of the fifth case.
Then, human body recognition can be performed on the monitoring images in the monitoring data, and human body images in the respective monitoring images can be recognized. If the determined face image of the object to be tracked is the same as the object to which the target gait feature data belongs, a target human body image which belongs to the same object as the determined face image of the object to be tracked or the target gait feature data can be obtained. And extracting the characteristics of the target human body image to obtain human body characteristic data corresponding to the target human body image, wherein the human body characteristic data is used as the characteristic data of the second characteristic of the object to be tracked.
If the determined face image of the object to be tracked is different from the object to which the target gait feature data belongs, the target human body image which belongs to the same object as the determined face image of the object to be tracked can be obtained in consideration of higher accuracy of the face on object identification. And extracting the characteristics of the target human body image to obtain human body characteristic data corresponding to the target human body image, wherein the human body characteristic data is used as the characteristic data of the second characteristic of the object to be tracked.
In one possible implementation, the object to be tracked may be in a vehicle; in that case, the vehicle in which the object to be tracked is located may also be used as the second feature of the object to be tracked during the tracking process.
Optionally, determining feature data of the second feature of the object to be tracked based on the monitoring data and tracking reference data of the object to be tracked includes:
and carrying out behavior analysis on the object with the target characteristic data according to the monitoring data, and acquiring an image of the target vehicle if the object with the target characteristic data is analyzed to have the behavior of riding the target vehicle. Vehicle feature data in an image of the target vehicle is acquired as feature data of a second feature of the object to be tracked.
The vehicle may be a bus, a taxi, a private car, or the like. The vehicle feature data may include a license plate number, a vehicle model, a vehicle body color, and the like.
The object to which the target feature data belongs is subjected to behavior analysis, where the behavior analysis may employ a trained machine learning model. If the behavior of the object to which the target characteristic data belongs in riding the target vehicle is analyzed, an image of the target vehicle is acquired in the monitoring data. And extracting vehicle feature data of the image of the target vehicle as feature data of a second feature of the object to be tracked.
The behavior of the riding target vehicle of the object to which the analyzed target feature data belongs may be as follows:
in the first case, the object to which the target feature data belongs appears in the target vehicle.
And in the second case, after the object to which the target feature data belongs approaches the target vehicle, the object to which the target feature data belongs disappears from the monitoring data.
In one possible implementation, since the accuracy of behavior analysis on the object is relatively low, in order to track the object to be tracked more accurately, a boarding prompt for the object to be tracked can be issued when it is analyzed that the object having the target feature data has the behavior of riding the target vehicle. For example, when this behavior is analyzed, the management platform may prompt, by text or voice, that the object to be tracked may have boarded a vehicle. After receiving the prompt, the security manager can manually preview the monitoring images to judge whether the object to be tracked has really boarded the vehicle. If so, a vehicle tracking instruction may be entered. After receiving the vehicle tracking instruction, the management platform can add the vehicle feature data to the tracking reference data of the object to be tracked.
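The operator-confirmation flow just described could look like the sketch below. All helper callables (`extract_vehicle_features`, `notify_operator`, `await_confirmation`) are assumptions introduced only for the example.

```python
def maybe_add_vehicle_feature(tracking_reference, riding_detected, vehicle_image,
                              extract_vehicle_features, notify_operator,
                              await_confirmation):
    """Illustrative handling of a suspected boarding event: because behavior analysis
    is comparatively unreliable, the platform first prompts the operator and only adds
    the vehicle feature data (e.g. plate number, model, body color) to the tracking
    reference data after a vehicle-tracking instruction is received.
    """
    if not riding_detected:
        return tracking_reference
    notify_operator("The object to be tracked may have boarded a vehicle.")
    if not await_confirmation():          # operator previews the footage and decides
        return tracking_reference
    updated = dict(tracking_reference)
    updated["vehicle"] = extract_vehicle_features(vehicle_image)
    return updated
```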
In one possible implementation, if the tracking reference data includes face feature data and human body feature data, and the similarity corresponding to a face image to be compared is greater than the face similarity threshold while the similarity corresponding to the associated human body image to be compared is less than a second human body similarity threshold (for example, because the object to be tracked has changed clothes), the human body feature data in the tracking reference data can be replaced with the human body feature data of that human body image to be compared.
And 103, tracking the object to be tracked based on the updated tracking reference data of the object to be tracked.
In one possible implementation, monitoring data whose shooting time is later than that of the current monitoring data can be acquired from the monitoring devices within a preset range of the monitoring device to which the current monitoring data belongs, and the object to be tracked is tracked in the newly acquired monitoring data based on the updated tracking reference data of the object to be tracked.
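A sketch of selecting the next batches of monitoring data follows; the camera record layout is an assumption, and the 100-metre radius mentioned later in this description is reused here as a default purely for illustration.

```python
import math

def next_monitoring_batches(cameras, current_camera_id, current_time, radius_m=100.0):
    """Illustrative selection of the monitoring data to analyze next: cameras within
    a preset range of the device that produced the current data, restricted to
    footage shot after the current footage.
    """
    cx, cy = cameras[current_camera_id]["position"]
    nearby = []
    for cam_id, cam in cameras.items():
        x, y = cam["position"]
        if math.hypot(x - cx, y - cy) <= radius_m:
            later = [clip for clip in cam["clips"] if clip["shot_at"] > current_time]
            if later:
                nearby.append((cam_id, later))
    return nearby
```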
The updated tracking reference data of the object to be tracked comprises face feature data and human body feature data:
when the object to be tracked is tracked, the reference face characteristic data and the reference human body characteristic data can be respectively acquired from the updated tracking reference data of the object to be tracked to track the object to be tracked.
If only target face feature data meeting the face similarity threshold with the reference face feature data, or only target human body feature data meeting the first human body similarity threshold with the reference human body feature data, is identified in the newly acquired monitoring data, the object to which the target face feature data or the target human body feature data belongs can be used as the object to be tracked.
If both target face feature data meeting the face similarity threshold with the reference face feature data and target human body feature data meeting the first human body similarity threshold with the reference human body feature data are identified in the newly acquired monitoring data, and the target face feature data and the target human body feature data belong to the same object, that object can be determined to be the object to be tracked. If the target face feature data and the target human body feature data belong to different objects, the object to which the target face feature data belongs can be determined as the object to be tracked, considering that the face provides higher accuracy for object recognition.
The updated tracking reference data of the object to be tracked comprises face characteristic data and gait characteristic data:
When the object to be tracked is tracked, the reference face characteristic data and the reference gait characteristic data can be respectively acquired from the updated tracking reference data of the object to be tracked to track the object to be tracked.
If only target face feature data meeting the face similarity threshold with the reference face feature data, or only target gait feature data meeting the gait similarity threshold with the reference gait feature data, is identified in the newly acquired monitoring data, the object to which the target face feature data or the target gait feature data belongs can be used as the object to be tracked.
If both target face feature data meeting the face similarity threshold with the reference face feature data and target gait feature data meeting the gait similarity threshold with the reference gait feature data are identified in the newly acquired monitoring data, and the target face feature data and the target gait feature data belong to the same object, that object can be determined as the object to be tracked. If the target face feature data and the target gait feature data belong to different objects, the object to which the target face feature data belongs can be determined as the object to be tracked, considering that the face provides higher accuracy for object recognition.
The updated tracking reference data of the object to be tracked comprises human body characteristic data and gait characteristic data:
when the object to be tracked is tracked, the reference human body characteristic data and the reference gait characteristic data can be respectively acquired from the updated tracking reference data of the object to be tracked to track the object to be tracked.
If only target human body feature data satisfying the first human body similarity threshold with the reference human body feature data, or only target gait feature data satisfying the gait similarity threshold with the reference gait feature data, is identified in the newly acquired monitoring data, the object to which the target human body feature data or the target gait feature data belongs may be regarded as the object to be tracked.
If the target human body characteristic data satisfying the first human body similarity threshold with the reference human body characteristic data and the target gait characteristic data satisfying the gait similarity threshold with the reference gait characteristic data are identified in the newly acquired monitoring data, and the target human body characteristic data and the target gait characteristic data belong to the same object, the object to which the target human body characteristic data and the target gait characteristic data belong can be determined as the object to be tracked. If the target human body characteristic data and the target gait characteristic data belong to different objects, the object to which the target gait characteristic data belong can be determined as the object to be tracked in view of the higher accuracy of the gait for object identification.
The updated tracking reference data of the object to be tracked comprises face feature data, human body feature data and gait feature data:
when the object to be tracked is tracked, the reference face characteristic data, the reference human body characteristic data and the reference gait characteristic data can be respectively acquired from the updated tracking reference data of the object to be tracked to track the object to be tracked.
If only target human body feature data satisfying the first human body similarity threshold with the reference human body feature data, or only target gait feature data satisfying the gait similarity threshold with the reference gait feature data, or only target face feature data satisfying the face similarity threshold with the reference face feature data is identified in the newly acquired monitoring data, the object to which that target feature data belongs may be determined as the object to be tracked.
If exactly two of the three, namely target human body feature data meeting the first human body similarity threshold with the reference human body feature data, target gait feature data meeting the gait similarity threshold with the reference gait feature data, and target face feature data meeting the face similarity threshold with the reference face feature data, are identified in the newly acquired monitoring data, the following situations can exist when determining the object to be tracked:
In case one, the objects to which both belong are the same. In this case, the object to which both belong may be determined as the object to be tracked.
In the second case, the two identified target feature data belong to different objects. In this case, if the target face feature data is one of the two, the object to which the target face feature data belongs may be determined as the object to be tracked. If the target face feature data is not one of the two, the object to which the target gait feature data belongs can be determined as the object to be tracked, considering that gait provides higher accuracy for object recognition than the human body.
If target face feature data meeting the face similarity threshold with the reference face feature data, target human body feature data meeting the first human body similarity threshold with the reference human body feature data, and target gait feature data meeting the gait similarity threshold with the reference gait feature data are all identified in the newly acquired monitoring data, and the object to which the target face feature data belongs is different from at least one of the objects to which the target human body feature data and the target gait feature data belong, the object to which the target face feature data belongs can be taken as the object to be tracked, considering that the face provides higher accuracy for object recognition.
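The priority rules enumerated above (face trusted over gait, gait over the human body) can be summarized in a short reconciliation routine; the per-feature matching itself is assumed to have been done elsewhere, and the data layout is an assumption of the example.

```python
def resolve_tracked_object(matches):
    """Illustrative reconciliation of per-feature matches in step 103.

    matches: dict mapping feature name ("face", "gait", "body") to the identifier of
             the object whose candidate feature data satisfied that feature's
             similarity threshold; a feature is absent when nothing matched.

    Returns the identifier taken as the object to be tracked, following the priority
    described above: face is trusted over gait, and gait over the human body.
    """
    if not matches:
        return None
    ids = set(matches.values())
    if len(ids) == 1:
        return ids.pop()                      # all matched features agree
    for feature in ("face", "gait", "body"):  # disagreement: fall back by priority
        if feature in matches:
            return matches[feature]
    return None
```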
Optionally, tracking the object to be tracked based on the updated tracking reference data of the object to be tracked, including:
determining a track of the target vehicle, and analyzing monitoring data of monitoring devices positioned behind a first monitoring device in the track, wherein the first monitoring device is the monitoring device to which the monitoring data for analyzing that the object with the target characteristic data has the behavior of riding the target vehicle belongs. If the object to be tracked is determined not to be in the target vehicle, continuing to track the object to be tracked by taking the position of the second monitoring device as the initial tracking position according to the rest characteristic data except the vehicle characteristic data in the tracking reference data of the object to be tracked, wherein the second monitoring device is the monitoring device to which the monitoring data for determining that the object to be tracked is not in the target vehicle belongs.
For the case where the updated tracking reference data of the object to be tracked includes the vehicle feature data of the target vehicle, after it is analyzed that the object having the target feature data has the behavior of riding the target vehicle, the track of the target vehicle within a preset time period may be recorded, and the preset time period may be 2 hours. The monitoring data of the monitoring devices located after the first monitoring device along the track is then analyzed. Here, the first monitoring device is the monitoring device to which the monitoring data, based on which the riding behavior of the object having the target feature data was first analyzed, belongs, and the monitoring data of the monitoring devices after the first monitoring device refers to monitoring data whose shooting time is later than the shooting time of that first-analyzed monitoring data.
It may be determined that the object to be tracked is not within the target vehicle when, for example, while analyzing the monitoring data of the monitoring devices along the track, the object to be tracked is analyzed to be in the target vehicle in the monitoring data of one monitoring device but is not found in the monitoring data of the next monitoring device; in that case the object to be tracked may be considered to have got off halfway. Then, according to the remaining feature data in the tracking reference data of the object to be tracked other than the vehicle feature data (that is, with the vehicle feature data removed from the tracking reference data), the object to be tracked continues to be tracked by taking the position of the second monitoring device as the initial tracking position, where the second monitoring device is the monitoring device to which the monitoring data, based on which it is determined that the object to be tracked is not in the target vehicle, belongs. In addition, when it is determined that the object to be tracked is no longer in the target vehicle, the security manager can be prompted in text or voice that the object to be tracked is suspected to have got off the vehicle.
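The get-off detection along the vehicle's track might be sketched as follows; `object_in_vehicle` and `resume_person_tracking` are assumed helpers standing in for the analysis and tracking steps described above.

```python
def follow_vehicle_track(track_devices, object_in_vehicle, person_reference,
                         resume_person_tracking):
    """Illustrative analysis of the target vehicle's track after the first monitoring
    device: when the object to be tracked was seen in the vehicle at one device but is
    no longer found at the next one, it is treated as having got off, and person
    tracking resumes from that second device using the non-vehicle feature data.
    """
    previously_seen = True
    for device in track_devices:                    # devices after the first one, in order
        seen_now = object_in_vehicle(device)
        if previously_seen and not seen_now:
            person_only = {k: v for k, v in person_reference.items() if k != "vehicle"}
            resume_person_tracking(start_device=device, tracking_reference=person_only)
            return device                           # device at which the object got off
        previously_seen = seen_now
    return None
```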
In addition, when analyzing the monitoring data, information such as the mobile phone serial number and the mobile phone MAC (Media Access Control) address of the object to be tracked can also be used to assist in analyzing whether the object to be tracked is in the target vehicle.
In one possible implementation, when analyzing the monitoring data of any monitoring device along the track, the monitoring data of the monitoring devices within a preset range of that monitoring device can also be acquired, and the object to be tracked is tracked in that monitoring data according to the remaining feature data in the tracking reference data other than the vehicle feature data. Here, the preset range may be a radius of 100 meters.
In one possible implementation, the security manager may add or subtract tracking reference data through the management platform depending on the actual situation. Accordingly, after the security manager adds or deletes the tracking reference data, the object to be tracked can be tracked according to the added or deleted tracking reference data.
In the embodiments of the present application, the feature data of the second feature of the object to be tracked can be determined in the monitoring data according to the feature data of the first feature of the object to be tracked. The object to be tracked is then tracked jointly according to the feature data of the first feature and the feature data of the second feature. In this way, the problem of tracking failure due to a single feature being occluded can be avoided.
Based on the same technical concept, the embodiment of the application further provides an object tracking device, as shown in fig. 2, where the device includes: an acquisition module 210, an update module 220, and a tracking module 230.
An obtaining module 210, configured to obtain feature data of a first feature of an object to be tracked, as tracking reference data of the object to be tracked;
an updating module 220, configured to obtain monitoring data, determine feature data of a second feature of the object to be tracked based on the monitoring data and tracking reference data of the object to be tracked, and update the tracking reference data of the object to be tracked to feature data of the first feature and feature data of the second feature;
the tracking module 230 is configured to track the object to be tracked based on the updated tracking reference data of the object to be tracked.
In one possible implementation manner, the updating module is configured to:
identifying the characteristic data of the first characteristic contained in the monitoring data as characteristic data to be compared of the first characteristic;
determining feature data to be compared of a first feature with similarity to the tracking reference data being larger than a preset threshold value, and taking the feature data to be compared as target feature data;
And acquiring the characteristic data of the second characteristic of the object with the target characteristic data from the monitoring data as the characteristic data of the second characteristic of the object to be tracked.
In one possible implementation, the first feature is one or two of a face, a human body, and a gait, and the second feature is at least one of a face, a human body, and a gait other than the first feature.
In one possible implementation manner, the acquiring feature data of the second feature of the object having the target feature data, as feature data of the second feature of the object to be tracked, includes:
according to the monitoring data, performing behavior analysis on the object with the target characteristic data, and if the object with the target characteristic data is analyzed to have the behavior of riding the target vehicle, acquiring an image of the target vehicle;
and acquiring vehicle characteristic data in the image of the target vehicle as characteristic data of the second characteristic of the object to be tracked.
In one possible implementation manner, the updating module is configured to:
determining a track of the target vehicle, and analyzing monitoring data of monitoring devices located after a first monitoring device along the track, wherein the first monitoring device is the monitoring device to which the monitoring data, based on which it is analyzed that the object having the target feature data has the behavior of riding the target vehicle, belongs;
And if the object to be tracked is not in the target vehicle, continuing to track the object to be tracked by taking the position of a second monitoring device as an initial tracking position according to the rest characteristic data except the vehicle characteristic data in the tracking reference data of the object to be tracked, wherein the second monitoring device is the monitoring device to which the monitoring data for determining that the object to be tracked is not in the target vehicle belongs.
It should be noted that, in the apparatus for object tracking provided in the above embodiment, the division into the above functional modules is merely illustrative. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for object tracking provided in the above embodiment and the method embodiment of object tracking belong to the same concept, and the specific implementation process of the apparatus is detailed in the method embodiment and is not repeated here.
Fig. 3 is a schematic structural diagram of a management platform according to an embodiment of the present application. The management platform 300 may vary considerably in configuration or performance, and may include one or more processors (central processing units, CPUs) 301 and one or more memories 302, where at least one instruction is stored in the memories 302 and is loaded and executed by the processors 301 to implement the above-mentioned method of object tracking.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the method of object tracking in the above embodiments. For example, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
The foregoing description is merely of preferred embodiments of the present application and is not intended to limit the present application; any modification, equivalent substitution, or improvement made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (4)

1. A method of object tracking, the method comprising:
acquiring feature data of a first feature of an object to be tracked as tracking reference data of the object to be tracked, wherein the first feature is a human face and a human body, a human face and a gait, a human body and a gait, a human body, or a gait;
acquiring monitoring data, wherein: in the case that the first feature is a human face and a human body, a face image and a human body image of the object to be tracked are determined in the monitoring data; if the determined face image and human body image belong to the same object, the gait feature data corresponding to the object to which the face image or the human body image belongs is used as feature data of a second feature of the object to be tracked, and if they belong to different objects, the gait feature data corresponding to the object to which the face image belongs is used as the feature data of the second feature of the object to be tracked; in the case that the first feature is a human face and a gait, a face image and gait feature data of the object to be tracked are determined in the monitoring data; if the determined face image and gait feature data belong to the same object, the human body feature data corresponding to that object is used as the feature data of the second feature of the object to be tracked, and if they belong to different objects, the human body feature data corresponding to the object to which the face image belongs is used as the feature data of the second feature of the object to be tracked; in the case that the first feature is a human body and a gait, a human body image and gait feature data of the object to be tracked are determined in the monitoring data; if the determined human body image and gait feature data belong to the same object, the face feature data corresponding to that object is used as the feature data of the second feature of the object to be tracked, and if they belong to different objects, the face feature data corresponding to the object to which the human body image belongs is used as the feature data of the second feature of the object to be tracked; in the case that the first feature is a human body, a human body image of the object to be tracked is determined in the monitoring data, a target face image belonging to the same object as the human body image is determined in the monitoring image in which the human body image is located, and it is determined whether the object to which the human body image belongs has corresponding gait feature data; if so, the face feature data of the target face image and that gait feature data are used together as the feature data of the second feature of the object to be tracked, and if not, the face feature data of the target face image is used as the feature data of the second feature of the object to be tracked; in the case that the first feature is a gait, a target human body image and a target face image corresponding to the object to which the gait feature data of the object to be tracked belongs are acquired in the monitoring data, and the human body feature data corresponding to the target human body image and the face feature data corresponding to the target face image are used together as the feature data of the second feature of the object to be tracked; and updating the tracking reference data of the object to be tracked to the feature data of the first feature and the feature data of the second feature; and
tracking the object to be tracked based on the updated tracking reference data of the object to be tracked.
2. An apparatus for object tracking, the apparatus comprising:
the acquisition module is used for acquiring feature data of a first feature of an object to be tracked as tracking reference data of the object to be tracked, wherein the first feature is a human face and a human body, a human face and a gait, a human body and a gait, a human body, or a gait;
the updating module is used for: in the case that the first feature is a human face and a human body, determining a face image and a human body image of the object to be tracked in monitoring data; if the determined face image and human body image belong to the same object, using the gait feature data corresponding to the object to which the face image or the human body image belongs as feature data of a second feature of the object to be tracked, and if they belong to different objects, using the gait feature data corresponding to the object to which the face image belongs as the feature data of the second feature of the object to be tracked; in the case that the first feature is a human face and a gait, determining a face image and gait feature data of the object to be tracked in the monitoring data; if the determined face image and gait feature data belong to the same object, using the human body feature data corresponding to that object as the feature data of the second feature of the object to be tracked, and if they belong to different objects, using the human body feature data corresponding to the object to which the face image belongs as the feature data of the second feature of the object to be tracked; in the case that the first feature is a human body and a gait, determining a human body image and gait feature data of the object to be tracked in the monitoring data; if the determined human body image and gait feature data belong to the same object, using the face feature data corresponding to that object as the feature data of the second feature of the object to be tracked, and if they belong to different objects, using the face feature data corresponding to the object to which the human body image belongs as the feature data of the second feature of the object to be tracked; in the case that the first feature is a human body, determining a human body image of the object to be tracked in the monitoring data, determining, in the monitoring image in which the human body image is located, a target face image belonging to the same object as the human body image, and determining whether the object to which the human body image belongs has corresponding gait feature data; if so, using the face feature data of the target face image and that gait feature data together as the feature data of the second feature of the object to be tracked, and if not, using the face feature data of the target face image as the feature data of the second feature of the object to be tracked; in the case that the first feature is a gait, acquiring, in the monitoring data, a target human body image and a target face image corresponding to the object to which the gait feature data of the object to be tracked belongs, and using the human body feature data corresponding to the target human body image and the face feature data corresponding to the target face image together as the feature data of the second feature of the object to be tracked; and updating the tracking reference data of the object to be tracked to the feature data of the first feature and the feature data of the second feature;
and the tracking module is used for tracking the object to be tracked based on the updated tracking reference data of the object to be tracked.
3. A management platform comprising a processor and a memory having at least one instruction stored therein, the instruction being loaded and executed by the processor to perform the operations performed by the method of object tracking of claim 1.
4. A computer-readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the operations performed by the method of object tracking of claim 1.
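For readability, the case analysis recited in claims 1 and 2 can be illustrated with the sketch below; det maps each modality detected for the object to be tracked to an (object id, feature data) pair, and face_of, body_of, and gait_of are hypothetical lookups returning another modality's feature data for a given object id (None if unavailable). This is an editorial illustration under those assumptions, not part of the claims.

```python
from typing import Any, Callable, Dict, Optional, Tuple

Detection = Tuple[str, Any]   # (object id, feature data) found in the monitoring data


def select_second_feature(first: frozenset,
                          det: Dict[str, Detection],
                          face_of: Callable[[str], Optional[Any]],
                          body_of: Callable[[str], Optional[Any]],
                          gait_of: Callable[[str], Optional[Any]]) -> Any:
    """Return the second-feature data for each of the five first-feature cases."""
    if first == frozenset({"face", "body"}):
        face_id = det["face"][0]
        # Same object: that object's gait data; conflicting detections are
        # resolved in favour of the object the face image belongs to.
        return gait_of(face_id)

    if first == frozenset({"face", "gait"}):
        face_id = det["face"][0]
        # Same object, or conflict resolved in favour of the face image.
        return body_of(face_id)

    if first == frozenset({"body", "gait"}):
        body_id = det["body"][0]
        # Same object, or conflict resolved in favour of the human body image.
        return face_of(body_id)

    if first == frozenset({"body"}):
        body_id = det["body"][0]
        face_feat, gait_feat = face_of(body_id), gait_of(body_id)
        # Gait data may be unavailable for this object.
        return (face_feat, gait_feat) if gait_feat is not None else face_feat

    if first == frozenset({"gait"}):
        gait_id = det["gait"][0]
        return (body_of(gait_id), face_of(gait_id))

    raise ValueError("unsupported first-feature combination")
```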
CN202010407982.1A 2020-05-14 2020-05-14 Method, device, management platform and storage medium for object tracking Active CN113674309B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010407982.1A CN113674309B (en) 2020-05-14 2020-05-14 Method, device, management platform and storage medium for object tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010407982.1A CN113674309B (en) 2020-05-14 2020-05-14 Method, device, management platform and storage medium for object tracking

Publications (2)

Publication Number Publication Date
CN113674309A CN113674309A (en) 2021-11-19
CN113674309B true CN113674309B (en) 2024-02-20

Family

ID=78537289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010407982.1A Active CN113674309B (en) 2020-05-14 2020-05-14 Method, device, management platform and storage medium for object tracking

Country Status (1)

Country Link
CN (1) CN113674309B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103544483A (en) * 2013-10-25 2014-01-29 合肥工业大学 United target tracking method based on local sparse representation and system thereof
JP2016126624A (en) * 2015-01-06 2016-07-11 Kddi株式会社 Device, program and method for tracking body using dedicated discrimination device on occlusion occurrence
CN107255468A (en) * 2017-05-24 2017-10-17 纳恩博(北京)科技有限公司 Method for tracking target, target following equipment and computer-readable storage medium
CN110175587A (en) * 2019-05-30 2019-08-27 黄岩 A kind of video frequency tracking method based on recognition of face and Algorithm for gait recognition
CN110992397A (en) * 2019-10-21 2020-04-10 浙江大华技术股份有限公司 Personnel entrance and exit trajectory tracking method and system, computer equipment and storage medium

Also Published As

Publication number Publication date
CN113674309A (en) 2021-11-19

Similar Documents

Publication Publication Date Title
CN108596277B (en) Vehicle identity recognition method and device and storage medium
US11887064B2 (en) Deep learning-based system and method for automatically determining degree of damage to each area of vehicle
Fernandez-Sanjurjo et al. Real-time visual detection and tracking system for traffic monitoring
CN110390262A (en) Video analysis method, apparatus, server and storage medium
CN109325429B (en) Method, device, storage medium and terminal for associating feature data
CN110309735A (en) Exception detecting method, device, server and storage medium
CN109446936A (en) A kind of personal identification method and device for monitoring scene
CN107862072B (en) Method for analyzing vehicle urban-entering fake plate crime based on big data technology
CN112132041A (en) Community patrol analysis method and system based on computer vision
CN112434566A (en) Passenger flow statistical method and device, electronic equipment and storage medium
CN112597850B (en) Identity recognition method and device
US11120308B2 (en) Vehicle damage detection method based on image analysis, electronic device and storage medium
CN110175553B (en) Method and device for establishing feature library based on gait recognition and face recognition
CN111814690A (en) Target re-identification method and device and computer readable storage medium
CN114155488A (en) Method and device for acquiring passenger flow data, electronic equipment and storage medium
CN112651398A (en) Vehicle snapshot control method and device and computer readable storage medium
EP3043292A1 (en) Object linking method, object linking apparatus, and object linking program
CN114943750A (en) Target tracking method and device and electronic equipment
CN113674309B (en) Method, device, management platform and storage medium for object tracking
CN111383248A (en) Method and device for judging red light running of pedestrian and electronic equipment
CN116434161B (en) Method and system for judging whether parking behavior based on high-order video is credible
CN110619256A (en) Road monitoring detection method and device
CN112926364B (en) Head gesture recognition method and system, automobile data recorder and intelligent cabin
CN113657169B (en) Gait recognition method, device and system and computer readable storage medium
CN115762172A (en) Method, device, equipment and medium for identifying vehicles entering and exiting parking places

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant