CN114581491A - Pedestrian trajectory tracking method, system and related device - Google Patents
- Publication number
- CN114581491A CN114581491A CN202210469020.8A CN202210469020A CN114581491A CN 114581491 A CN114581491 A CN 114581491A CN 202210469020 A CN202210469020 A CN 202210469020A CN 114581491 A CN114581491 A CN 114581491A
- Authority
- CN
- China
- Prior art keywords
- pedestrian
- frame
- target
- candidate
- track
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Abstract
The application provides a pedestrian trajectory tracking method in the field of image processing, comprising the following steps: acquiring image data; extracting features from the image data and constructing a candidate frame relation mask; extracting the historical frame feature set of a pedestrian trajectory library and performing feature calculations against the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix; calculating the feature distance between each target pedestrian and each candidate frame, and assigning a target candidate frame that satisfies the mutual-minimum feature-distance condition to the track of the target pedestrian, until no detection frame in the current frame satisfies the condition; and updating the pedestrian trajectory library, and outputting the pedestrian index set and the corresponding position track of each target pedestrian. The method effectively addresses insufficient feature richness in the pedestrian tracking process and improves pedestrian tracking and detection accuracy. The application also provides a pedestrian trajectory tracking system, a computer-readable storage medium and an electronic device, which have the same beneficial effects.
Description
Technical Field
The present application relates to the field of image processing, and in particular, to a method, a system, and a related device for tracking a pedestrian trajectory.
Background
Pedestrian target tracking is one of the most active research directions in computer vision, valued by researchers for its high practical and deployment value.
At present, researchers in the field typically combine target detection with metric learning to realize target tracking: a target detection algorithm localizes pedestrians, metric learning extracts features from the localized pedestrians, and a feature-matching strategy then associates detections into per-pedestrian tracks. However, when the number of tracked targets is large, the existing strategy produces frequent missed detections (false negatives) and identity switches (ID-switch).
Therefore, how to improve the pedestrian tracking accuracy is a technical problem that needs to be solved urgently by those skilled in the art.
Disclosure of Invention
An object of the present application is to provide a pedestrian trajectory tracking method, a pedestrian trajectory tracking system, a computer-readable storage medium, and an electronic device, which can improve the tracking accuracy of a pedestrian trajectory.
In order to solve the technical problem, the application provides a pedestrian trajectory tracking method, which has the following specific technical scheme:
acquiring image data;
extracting the features of the image data, and constructing a candidate frame relation mask according to the extracted features; the values in the candidate frame relation mask indicate whether a detection frame of the current frame and a target pedestrian can form a reasonable track relation;
extracting the historical frame feature set of a pedestrian trajectory library, and performing feature calculations between the historical frame feature set and the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix;
calculating the feature distance between the target pedestrian and each candidate frame according to the person-to-frame feature distance matrix and the frame-to-person feature distance matrix; if the feature distance between a target candidate frame and the target pedestrian satisfies the mutual-minimum-distance condition, assigning the target candidate frame to the track of the target pedestrian, until no detection frame in the current frame satisfies the condition;
and updating the pedestrian track library, and outputting a pedestrian index set and a corresponding position track of the target pedestrian.
Optionally, the method further includes:
constructing the pedestrian track library; the pedestrian trajectory library contains the historical positions of all pedestrians and characteristic information of the pedestrians at all the historical positions.
Optionally, the extracting the features of the image data, and constructing a candidate relationship mask according to the extracted features includes:
performing target prediction on an image frame in the image data by using a first network model to obtain a first detection result; the first detection result comprises coordinate frame position information of each pedestrian and the number of the pedestrians;
extracting features in the coordinate frame by using a second network model to obtain a feature set;
calculating the track prediction coordinate of each pedestrian at the current moment according to a track prediction formula;
determining a space feasible range of the pedestrian at the current moment according to the track prediction coordinates and the coordinate frame position information of each pedestrian;
and generating a candidate frame relation mask corresponding to each pedestrian according to the space feasible range.
Optionally, before the target prediction is performed on the image frame in the image data by using the first network model to obtain the first detection result, the method further includes:
and inputting the pedestrian frame labels and the pictures in the training data set into a pedestrian detection network, and training the pedestrian detection network by using a double-stage detector or a single-stage detector to obtain the first network model.
Optionally, before the extracting the features in the coordinate frame by using the second network model to obtain the feature set, the method further includes:
and performing model training based on the pedestrian re-recognition mode, and cutting a pedestrian frame in the training data set to obtain the second network model.
Optionally, the candidate box relation mask is an M × N matrix including 0 and 1; wherein, M is the number of the pedestrian detection frames, and N is the number of the pedestrians.
The present application further provides a pedestrian trajectory tracking system, comprising:
the image acquisition module is used for acquiring image data;
the characteristic extraction module is used for carrying out spatial characteristic extraction and appearance characteristic extraction on the image data and constructing a candidate relation mask according to the extracted characteristics;
the feature calculation module is used for extracting the historical frame feature set of a pedestrian trajectory library and performing feature calculations between the historical frame feature set and the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix;
the detection module is used for calculating the feature distance between the target pedestrian and each candidate frame according to the person-to-frame feature distance matrix and the frame-to-person feature distance matrix; if the feature distance between a target candidate frame and the target pedestrian satisfies the mutual-minimum-distance condition, the target candidate frame is assigned to the track of the target pedestrian, until no detection frame in the current frame satisfies the condition;
and the track updating module is used for updating the pedestrian track library and outputting the pedestrian index set and the corresponding position track of the target pedestrian.
Optionally, the method further includes:
the track library construction module is used for constructing a pedestrian track library; the pedestrian trajectory library contains the historical positions of all pedestrians and characteristic information of the pedestrians at all the historical positions.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method set forth above.
The present application further provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the method when calling the computer program in the memory.
The application provides a pedestrian trajectory tracking method, comprising the following steps: acquiring image data; extracting the features of the image data, and constructing a candidate frame relation mask according to the extracted features, where the values in the candidate frame relation mask indicate whether a detection frame of the current frame and a target pedestrian can form a reasonable track relation; extracting the historical frame feature set of a pedestrian trajectory library, and performing feature calculations between the historical frame feature set and the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix; calculating the feature distance between the target pedestrian and each candidate frame according to the two matrices, and, if the feature distance between a target candidate frame and the target pedestrian satisfies the mutual-minimum-distance condition, assigning the target candidate frame to the track of the target pedestrian, until no detection frame in the current frame satisfies the condition; and updating the pedestrian trajectory library, and outputting the pedestrian index set and the corresponding position track of the target pedestrian.
In the method, feature extraction is performed on the image data and a candidate frame relation mask is constructed to judge whether a detection frame identified in the image data can reasonably belong to a pedestrian's track; feature distances are then computed against the historical frame feature sets in the pedestrian trajectory library to match target pedestrians with target candidate frames, thereby determining and tracking the pedestrian tracks in the image data. The method effectively avoids pedestrian-index swaps caused by spatial-distance weighting in very large scenes, and reduces the over-reliance on the metric learning model caused by insufficient feature richness during pedestrian tracking, thereby improving pedestrian tracking and detection accuracy.
The application also provides a pedestrian trajectory tracking system, a computer-readable storage medium and an electronic device, which have the above beneficial effects and are not described again here.
Drawings
To illustrate the embodiments of the present application or the prior art more clearly, the drawings needed in their description are briefly introduced below. The drawings described below show only embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a pedestrian trajectory tracking method according to an embodiment of the present disclosure;
FIG. 2 is a diagram illustrating an example of a mask visualization of a candidate box relationship provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a pedestrian trajectory tracking system according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a flowchart of a method for tracking a pedestrian trajectory according to an embodiment of the present application, the method including:
s101: acquiring image data;
This step acquires the image data. How the image data is acquired is not limited here; video captured by a roadside camera may typically serve as the source. If the source data is video, frames may be extracted from it to obtain the image data required in this step.
S102: extracting the features of the image data, and constructing a candidate relation mask according to the extracted features;
This step performs feature extraction on the image data to construct a candidate frame relation mask. The values in the candidate frame relation mask indicate whether a detection frame of the current frame and a target pedestrian can form a reasonable track relation. The step operates on the image data acquired in step S101, which can be processed frame by frame.
One possible implementation may be as follows:
s1021: performing target prediction on an image frame in the image data by using a first network model to obtain a first detection result; the first detection result comprises coordinate frame position information of each pedestrian and the number of the pedestrians;
s1022: extracting features in the coordinate frame by using a second network model to obtain a feature set;
s1023: calculating the track prediction coordinate of each pedestrian at the current moment according to a track prediction formula;
s1024: determining a space feasible range of the pedestrian at the current moment according to the track prediction coordinates and the coordinate frame position information of each pedestrian;
s1025: generating a candidate frame relation mask corresponding to each pedestrian according to the space feasible range; the values in the candidate frame relationship mask indicate whether the detection frame of the current frame and the pedestrian can form a reasonable track relationship.
The first network model mainly performs pedestrian detection. To obtain it, the pedestrian frame labels and pictures of the training data set are input into a pedestrian detection network, which is trained as either a two-stage or a single-stage detector, adjusting the network parameters accordingly.
For the second network model, model training can be performed based on pedestrian re-identification (Person Re-Identification), with the pedestrian frames cropped from the training data set. The second network model is essentially a metric-learning network.
Specifically, the following describes the above process with the related formula:
Target detection is performed on the i-th frame image: the trained first network model predicts the detection result D_i = {p_1, p_2, …, p_m}, where each p_j is the coordinate frame [x1, y1, x2, y2] of a pedestrian (the upper-left and lower-right corner coordinates of the target frame) and m is the number of pedestrians detected in the current frame.
Thereafter, pedestrian target feature extraction is performed: features are extracted from the m predicted pedestrian detection frames using the trained second network model, yielding F_i = {f_1, f_2, …, f_m}.
Spatial condition constraints are then calculated. The trajectories of the r pedestrians in the track library T are predicted, where x and y denote a target pedestrian's coordinates on the two-dimensional image and k indexes the target pedestrian in T; a least-squares formulation is used to fit each trajectory curve, and the predicted trajectory coordinate of each pedestrian at the current moment is calculated from the fit.
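The least-squares fit is not reproduced in the text above; as a minimal sketch, a per-axis linear fit of the recent positions can be extrapolated to the current time step. The function name and the choice of a straight-line model are illustrative assumptions, not the patent's exact formulation.

```python
def predict_position(history):
    """Least-squares fit a pedestrian's past (x, y) positions over
    consecutive frame indices, then extrapolate one step ahead."""
    ts = list(range(len(history)))
    t_next = len(history)

    def fit_and_eval(vals):
        # ordinary least squares for a straight line v = a*t + b
        n = len(ts)
        mean_t = sum(ts) / n
        mean_v = sum(vals) / n
        denom = sum((t - mean_t) ** 2 for t in ts) or 1.0
        a = sum((t - mean_t) * (v - mean_v) for t, v in zip(ts, vals)) / denom
        b = mean_v - a * mean_t
        return a * t_next + b

    xs = [p[0] for p in history]
    ys = [p[1] for p in history]
    return fit_and_eval(xs), fit_and_eval(ys)
```

A higher-degree polynomial fit could be substituted where pedestrian motion is visibly non-linear; the linear form keeps the sketch self-contained.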
Then, for each pedestrian k, its spatial feasible range at the current time is calculated from its predicted trajectory coordinate and the size of its target frame at the most recent time: the target frame is enlarged by an expansion coefficient, and thresh is a settable threshold used to limit the feasible range, requiring that the degree of coincidence between a detection frame and the enlarged target frame be larger than this value. S is the set of all detection frames s that satisfy this condition, i.e. the feasible range.
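The expansion-and-overlap test can be sketched as follows, assuming that "degree of coincidence" means intersection-over-union between a detection frame and the enlarged predicted frame — the text suggests this but does not spell it out. The names `iou` and `feasible` and the default values of `alpha` and `thresh` are illustrative.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def feasible(det_box, pred_cx, pred_cy, last_w, last_h, alpha=1.5, thresh=0.3):
    """A detection frame is in the feasible range when it overlaps the
    predicted frame (side lengths scaled by expansion coefficient alpha)
    by more than thresh."""
    w, h = last_w * alpha, last_h * alpha
    pred_box = [pred_cx - w / 2, pred_cy - h / 2,
                pred_cx + w / 2, pred_cy + h / 2]
    return iou(det_box, pred_box) > thresh
```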
A candidate frame relation mask can then be generated: combining the detection result of the current frame with the spatial condition constraints of the r pedestrians in T, the candidate frame relation mask of each pedestrian in T is calculated. The representation of the mask is not limited; one possible representation is an M × N matrix of 0s and 1s, where M is the number of pedestrian detection frames and N is the number of pedestrians.
The feasible ranges of all pedestrians in the current frame (assuming N pedestrians, hence N sets S) are obtained, together with all pedestrian detection frames detected in the current frame (assuming M frames b). Testing whether each b belongs to each S yields an M × N matrix in which the element in row i, column j indicates whether the i-th detection frame is a possible candidate frame for the j-th pedestrian in the pedestrian library. Referring to fig. 2, fig. 2 is an example visualization of a candidate frame relation mask provided in an embodiment of the present application, where 1 indicates true, 0 indicates false, and a 1 means a potential candidate relation exists between the detection frame and the pedestrian. Of course, other encodings of the candidate frame relation mask may be used, for example swapping the convention so that 0 indicates true and 1 indicates false; this is not limited here.
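Under the 1-for-true convention just described, the M × N mask can be built by testing every detection frame against every pedestrian's feasible range. Representing each feasible range as a membership predicate (`feasible_sets`) is an assumption made for the sketch.

```python
def build_candidate_mask(det_boxes, feasible_sets):
    """M x N matrix of 0/1: entry [i][j] is 1 when detection frame i
    falls inside the feasible range of pedestrian j."""
    return [[1 if in_range(box) else 0 for in_range in feasible_sets]
            for box in det_boxes]


# toy example: two detections, two pedestrians whose feasible ranges
# split the image at x = 50
mask = build_candidate_mask(
    [[0, 0, 10, 10], [60, 0, 70, 10]],
    [lambda b: b[0] < 50,     # pedestrian 0: feasible left of x=50
     lambda b: b[0] >= 50])   # pedestrian 1: feasible right of x=50
```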
S103: extracting a historical frame feature set of a pedestrian trajectory library, and performing feature calculations between the historical frame feature set and the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix;
This step performs feature extraction and feature-distance calculation. Specifically, the historical frame feature set of each pedestrian in the pedestrian trajectory library is extracted, and cosine distances are computed between it and the candidate frames of the current frame to obtain the person-to-frame feature distance matrix Disttd; likewise, feature distances between each current-frame candidate frame and the pedestrians with which it has a candidate relation give the frame-to-person distance matrix Distdt. Disttd is the distance matrix from the t pedestrians to the d candidate frames of the current frame, and Distdt is its counterpart from the candidate frames to the pedestrians. During matching, for pedestrian i and frame j, candidate frame j is assigned to pedestrian i if and only if Disttd[i, j] == min(Disttd[i, :]) and Distdt[j, i] == min(Distdt[j, :]), where min(array) denotes the minimum of an array and Matrix[j, :] denotes the j-th row of a matrix.
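The mutual-minimum recall condition can be sketched directly from the Disttd/Distdt definitions above, excluding masked-out pairs by treating their distance as infinite. This is an illustrative implementation, not the patent's code; `mask[j][i]` follows the M × N (detection × pedestrian) layout described earlier.

```python
def mutual_min_matches(dist_td, dist_dt, mask):
    """dist_td[i][j]: distance from pedestrian i to detection j;
    dist_dt[j][i]: distance from detection j to pedestrian i.
    Detection j is assigned to pedestrian i iff each is the other's
    nearest neighbour among the masked candidate pairs."""
    INF = float("inf")
    t, d = len(dist_td), len(dist_dt)
    matches = []
    for i in range(t):
        # pedestrian i's nearest candidate detection
        row = [dist_td[i][j] if mask[j][i] else INF for j in range(d)]
        j = min(range(d), key=row.__getitem__)
        if row[j] == INF:
            continue  # no candidate at all for this pedestrian
        # is pedestrian i also detection j's nearest pedestrian?
        col = [dist_dt[j][k] if mask[j][k] else INF for k in range(t)]
        if min(range(t), key=col.__getitem__) == i:
            matches.append((i, j))
    return matches
```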
S104: if the feature distance between the target pedestrian and a target candidate frame satisfies the mutual-minimum-distance condition, assigning the target candidate frame to the track of the target pedestrian, until no detection frame in the current frame satisfies the condition;
If the feature distance between pedestrian k and the current-frame candidate frame p satisfies the recall condition that each is the other's minimum-distance match, candidate frame p is assigned to the track of pedestrian k. The operation is repeated until no detection frame in the current frame's detection result D_i satisfies the condition; the remaining frames are registered as new pedestrians and stored in the pedestrian trajectory library.
S105: and updating the pedestrian track library, and outputting a pedestrian index set and a corresponding position track of the target pedestrian.
The matched pedestrian frames, together with their positions and features, are stored with the corresponding pedestrians, whose position information and feature queues are updated; finally, the pedestrian index set and the corresponding position track of the target pedestrian are output.
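A minimal sketch of this update step: matched detections extend the corresponding pedestrian's position and feature queues, and unmatched detections open new library entries. The dictionary layout and the bounded feature queue (`max_feats`) are assumptions for illustration.

```python
def update_track_library(library, matches, det_boxes, det_feats, max_feats=30):
    """library: {pedestrian_index: {"positions": [...], "features": [...]}}.
    Append matched positions/features, then register unmatched
    detections as new pedestrians."""
    matched_dets = {j for _, j in matches}
    for i, j in matches:
        library[i]["positions"].append(det_boxes[j])
        library[i]["features"].append(det_feats[j])
        # keep the feature queue bounded
        del library[i]["features"][:-max_feats]
    for j in range(len(det_boxes)):
        if j not in matched_dets:
            new_id = max(library) + 1 if library else 0
            library[new_id] = {"positions": [det_boxes[j]],
                               "features": [det_feats[j]]}
    return library
```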
The time at which the pedestrian trajectory library is constructed is not particularly limited; the corresponding database or data queue merely needs to exist when the method is applied. A pedestrian trajectory library containing the historical positions of the pedestrians and the feature information at those positions may be constructed before this embodiment is executed.
According to the embodiment of the application, feature extraction is performed on the image data and a candidate frame relation mask is constructed to judge whether a detection frame identified in the image data can reasonably belong to a pedestrian's track; feature distances are then computed against the historical frame feature sets in the pedestrian trajectory library to match target pedestrians with target candidate frames, thereby determining and tracking the pedestrian tracks in the image data. The method effectively avoids pedestrian-index swaps caused by spatial-distance weighting in very large scenes, and reduces the over-reliance on the metric learning model caused by insufficient feature richness during pedestrian tracking, thereby improving pedestrian tracking and detection accuracy.
The following describes a pedestrian trajectory tracking system provided in an embodiment of the present application, and the pedestrian trajectory tracking system described below and the pedestrian trajectory tracking method described above may be referred to correspondingly.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a pedestrian trajectory tracking system according to an embodiment of the present application, and the present application further provides a pedestrian trajectory tracking system, including:
the image acquisition module is used for acquiring image data;
the characteristic extraction module is used for carrying out spatial characteristic extraction and appearance characteristic extraction on the image data and constructing a candidate relation mask according to the extracted characteristics;
the feature calculation module is used for extracting the historical frame feature set of a pedestrian trajectory library and performing feature calculations between the historical frame feature set and the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix;
the detection module is used for calculating the feature distance between the target pedestrian and each candidate frame according to the person-to-frame feature distance matrix and the frame-to-person feature distance matrix; if the feature distance between a target candidate frame and the target pedestrian satisfies the mutual-minimum-distance condition, the target candidate frame is assigned to the track of the target pedestrian, until no detection frame in the current frame satisfies the condition;
and the track updating module is used for updating the pedestrian track library and outputting the pedestrian index set and the corresponding position track of the target pedestrian.
Based on the above embodiment, as a preferred embodiment, the method further includes:
constructing the pedestrian track library; the pedestrian trajectory library contains the historical positions of all pedestrians and characteristic information of the pedestrians at all the historical positions.
Based on the above embodiment, as a preferred embodiment, the feature extraction module includes:
the first feature extraction unit is used for performing target prediction on an image frame in the image data by using a first network model to obtain a first detection result; the first detection result comprises coordinate frame position information of each pedestrian and the number of the pedestrians;
the second feature extraction unit is used for extracting features in the coordinate frame by using a second network model to obtain a feature set;
the track calculation unit is used for calculating track prediction coordinates of the pedestrians at the current moment according to a track prediction formula;
the spatial prediction unit is used for determining the spatial feasible range of the pedestrian at the current moment according to the track prediction coordinates and the coordinate frame position information of each pedestrian;
a candidate frame relation mask generating unit, configured to generate a candidate frame relation mask corresponding to each pedestrian according to the space feasible range; the values in the candidate frame relationship mask indicate whether the detection frame of the current frame and the pedestrian can form a reasonable track relationship.
Based on the above embodiment, as a preferred embodiment, the method further includes:
and the first network model generation module is used for inputting the pedestrian frame labels and pictures in the training data set into a pedestrian detection network, and training the pedestrian detection network by using a double-stage detector or a single-stage detector to obtain the first network model.
Based on the above embodiment, as a preferred embodiment, the method further includes:
and the second network model generation module is used for carrying out model training based on the pedestrian re-recognition mode, and cutting pedestrian frames in the training data set to obtain a second network model.
The present application further provides a computer-readable storage medium, on which a computer program is stored, which, when executed, can implement the steps provided by the above-mentioned embodiments. The storage medium may include: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The application further provides an electronic device, which may include a memory and a processor, where the memory stores a computer program, and the processor may implement the steps provided by the foregoing embodiments when calling the computer program in the memory. Of course, the electronic device may also include various network interfaces, power supplies, and the like.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system provided by the embodiment, the description is relatively simple because the system corresponds to the method provided by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.
It is further noted that, in this specification, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Claims (10)
1. A method of pedestrian trajectory tracking, comprising:
acquiring image data;
extracting features of the image data, and constructing a candidate frame relation mask according to the extracted features; the values in the candidate frame relation mask indicate whether a detection frame of the current frame and the target pedestrian can form a reasonable track relationship;
extracting a historical frame feature set from a pedestrian track library, and performing feature calculation on the historical frame feature set and the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix;
calculating the feature distance between the target pedestrian and each candidate frame according to the person-to-frame feature distance matrix and the frame-to-person feature distance matrix, and, if a target candidate frame and the target pedestrian are each other's minimum-distance match, assigning the target candidate frame to the track of the target pedestrian, until no detection frame of the current frame satisfies this condition;
and updating the pedestrian track library, and outputting a pedestrian index set and a corresponding position track of the target pedestrian.
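The mutual-minimum association in the claim above can be realized greedily; the global-minimum selection order and the use of `inf` to encode pairs excluded by the candidate frame relation mask are assumptions of this sketch, not details fixed by the claim:

```python
import numpy as np

def mutual_min_match(dist):
    """Greedy association: repeatedly pick the pair (box i, pedestrian j)
    with the smallest remaining feature distance, until no admissible
    detection box remains. `dist` is an M x N feature-distance matrix with
    np.inf for pairs excluded by the candidate frame relation mask."""
    dist = np.array(dist, dtype=float)
    matches = []
    while np.isfinite(dist).any():
        i, j = np.unravel_index(np.argmin(dist), dist.shape)
        # The global minimum is necessarily a mutual minimum: box i and
        # pedestrian j are each other's nearest remaining candidate.
        matches.append((int(i), int(j)))
        dist[i, :] = np.inf   # box i is consumed
        dist[:, j] = np.inf   # pedestrian j's track has been extended
    return matches
```

Picking the global minimum at each step guarantees the mutual-minimum property required by the claim, since no closer partner remains for either side of the selected pair.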
2. The pedestrian trajectory tracking method according to claim 1, further comprising:
constructing the pedestrian track library; the pedestrian track library contains the historical position of each pedestrian and the feature information of the pedestrian at each historical position.
3. The pedestrian trajectory tracking method of claim 1, wherein performing feature extraction on the image data and constructing a candidate relationship mask from the extracted features comprises:
performing target prediction on an image frame in the image data by using a first network model to obtain a first detection result; the first detection result comprises the coordinate frame position information of each pedestrian and the number of pedestrians;
extracting features in the coordinate frame by using a second network model to obtain a feature set;
calculating the track prediction coordinate of each pedestrian at the current moment according to a track prediction formula;
determining a spatial feasible range of each pedestrian at the current moment according to the track prediction coordinate and the coordinate frame position information of the pedestrian;
and generating a candidate frame relation mask corresponding to each pedestrian according to the spatial feasible range.
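The claim refers to "a track prediction formula" without stating it; a constant-velocity model over the last two recorded track positions is a common choice and is assumed here purely for illustration:

```python
def predict_position(history, dt=1.0):
    """Predict a pedestrian's center position at the current moment from
    its track history, a list of (x, y) centers ordered oldest to newest.
    A constant-velocity model over the last two positions is assumed."""
    if len(history) == 1:
        return history[-1]  # no velocity estimate yet; reuse last position
    (x0, y0), (x1, y1) = history[-2], history[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * dt, y1 + vy * dt)
```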
4. The method according to claim 3, wherein before performing target prediction on an image frame in the image data by using a first network model to obtain a first detection result, the method further comprises:
and inputting the pedestrian frame labels and pictures in the training data set into a pedestrian detection network, and training the pedestrian detection network with a two-stage or single-stage detector to obtain the first network model.
5. The method according to claim 3, wherein before extracting the features in the coordinate frame by using the second network model to obtain a feature set, the method further comprises:
and performing model training based on pedestrian re-identification, with pedestrian frames cropped from the training data set, to obtain the second network model.
6. The pedestrian trajectory tracking method of claim 5, wherein the candidate frame relation mask is an M × N matrix whose entries are 0 or 1, where M is the number of pedestrian detection frames and N is the number of pedestrians.
7. A pedestrian trajectory tracking system, comprising:
the image acquisition module is used for acquiring image data;
the feature extraction module is used for performing spatial feature extraction and appearance feature extraction on the image data and constructing a candidate frame relation mask according to the extracted features; the values in the candidate frame relation mask indicate whether a detection frame of the current frame and the target pedestrian can form a reasonable track relationship;
the feature calculation module is used for extracting a historical frame feature set from a pedestrian track library and performing feature calculation on the historical frame feature set and the candidate frames in the candidate frame relation mask to obtain a person-to-frame feature distance matrix and a frame-to-person feature distance matrix;
the detection module is used for calculating the feature distance between the target pedestrian and each candidate frame according to the person-to-frame feature distance matrix and the frame-to-person feature distance matrix and, if a target candidate frame and the target pedestrian are each other's minimum-distance match, assigning the target candidate frame to the track of the target pedestrian, until no detection frame of the current frame satisfies this condition;
and the track updating module is used for updating the pedestrian track library and outputting the pedestrian index set and the corresponding position track of the target pedestrian.
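The feature-distance calculation in the module above might be implemented as a pairwise cosine distance between appearance features; the metric and the per-row normalization are assumptions of this sketch, since the disclosure does not specify them here:

```python
import numpy as np

def feature_distance_matrix(box_feats, track_feats):
    """Pairwise cosine feature distance between current-frame box features
    (M x D) and per-pedestrian track features (N x D). Cosine distance is
    assumed; the disclosure does not name the metric."""
    a = box_feats / np.linalg.norm(box_feats, axis=1, keepdims=True)
    b = track_feats / np.linalg.norm(track_feats, axis=1, keepdims=True)
    return 1.0 - a @ b.T   # (M, N); smaller means more similar
```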
8. The pedestrian trajectory tracking system of claim 7, further comprising:
the track library construction module is used for constructing a pedestrian track library; the pedestrian trajectory library contains the historical positions of all pedestrians and characteristic information of the pedestrians at all the historical positions.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the pedestrian trajectory tracking method according to any one of claims 1 to 6.
10. An electronic device, comprising a memory having a computer program stored therein and a processor that implements the steps of the pedestrian trajectory tracking method according to any one of claims 1 to 6 when the processor calls the computer program in the memory.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210469020.8A CN114581491B (en) | 2022-04-30 | 2022-04-30 | Pedestrian trajectory tracking method, system and related device |
PCT/CN2022/117148 WO2023206904A1 (en) | 2022-04-30 | 2022-09-06 | Pedestrian trajectory tracking method and system, and related apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210469020.8A CN114581491B (en) | 2022-04-30 | 2022-04-30 | Pedestrian trajectory tracking method, system and related device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114581491A true CN114581491A (en) | 2022-06-03 |
CN114581491B CN114581491B (en) | 2022-07-22 |
Family
ID=81785306
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210469020.8A Active CN114581491B (en) | 2022-04-30 | 2022-04-30 | Pedestrian trajectory tracking method, system and related device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114581491B (en) |
WO (1) | WO2023206904A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115183763A (en) * | 2022-09-13 | 2022-10-14 | 南京北新智能科技有限公司 | Personnel map positioning method based on face recognition and grid method |
CN115546192A (en) * | 2022-11-03 | 2022-12-30 | 中国平安财产保险股份有限公司 | Livestock quantity identification method, device, equipment and storage medium |
CN115965657A (en) * | 2023-02-28 | 2023-04-14 | 安徽蔚来智驾科技有限公司 | Target tracking method, electronic device, storage medium, and vehicle |
WO2023206904A1 (en) * | 2022-04-30 | 2023-11-02 | 苏州元脑智能科技有限公司 | Pedestrian trajectory tracking method and system, and related apparatus |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10921130B1 (en) * | 2019-09-18 | 2021-02-16 | Here Global B.V. | Method and apparatus for providing an indoor pedestrian origin-destination matrix and flow analytics |
CN112734809A (en) * | 2021-01-21 | 2021-04-30 | 高新兴科技集团股份有限公司 | Online multi-pedestrian tracking method and device based on Deep-Sort tracking framework |
CN113761987A (en) * | 2020-06-05 | 2021-12-07 | 苏宁云计算有限公司 | Pedestrian re-identification method and device, computer equipment and storage medium |
CN114332168A (en) * | 2022-03-14 | 2022-04-12 | 苏州浪潮智能科技有限公司 | Pedestrian tracking method, pedestrian tracking system, electronic device and storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160132728A1 (en) * | 2014-11-12 | 2016-05-12 | Nec Laboratories America, Inc. | Near Online Multi-Target Tracking with Aggregated Local Flow Descriptor (ALFD) |
CN109087335B (en) * | 2018-07-16 | 2022-02-22 | 腾讯科技(深圳)有限公司 | Face tracking method, device and storage medium |
CN110517292A (en) * | 2019-08-29 | 2019-11-29 | 京东方科技集团股份有限公司 | Method for tracking target, device, system and computer readable storage medium |
CN114581491B (en) * | 2022-04-30 | 2022-07-22 | 苏州浪潮智能科技有限公司 | Pedestrian trajectory tracking method, system and related device |
Non-Patent Citations (1)
Title |
---|
YIFU ZHANG et al.: "ByteTrack: Multi-Object Tracking by Associating Every Detection Box", arXiv * |
Also Published As
Publication number | Publication date |
---|---|
WO2023206904A1 (en) | 2023-11-02 |
CN114581491B (en) | 2022-07-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114581491B (en) | Pedestrian trajectory tracking method, system and related device | |
CN110070141B (en) | Network intrusion detection method | |
Salimi et al. | Visual-based trash detection and classification system for smart trash bin robot | |
CA3066029A1 (en) | Image feature acquisition | |
CN107516102B (en) | Method, device and system for classifying image data and establishing classification model | |
CN112419202B (en) | Automatic wild animal image recognition system based on big data and deep learning | |
JP6756406B2 (en) | Image processing equipment, image processing method and image processing program | |
CN110188780B (en) | Method and device for constructing deep learning model for positioning multi-target feature points | |
US20170053172A1 (en) | Image processing apparatus, and image processing method | |
CN114202123A (en) | Service data prediction method and device, electronic equipment and storage medium | |
JP2001043368A5 (en) | ||
CN114399729A (en) | Monitoring object movement identification method, system, terminal and storage medium | |
CN110929731B (en) | Medical image processing method and device based on pathfinder intelligent search algorithm | |
CN111985616B (en) | Image feature extraction method, image retrieval method, device and equipment | |
CN113688810B (en) | Target capturing method and system of edge device and related device | |
CN111950507A (en) | Data processing and model training method, device, equipment and medium | |
EP3955205A1 (en) | Information processing device, information processing method, and recording medium | |
CN114359796A (en) | Target identification method and device and electronic equipment | |
CN111382628B (en) | Method and device for judging peer | |
CN113723209A (en) | Target identification method, target identification device, electronic equipment and computer-readable storage medium | |
CN110263196B (en) | Image retrieval method, image retrieval device, electronic equipment and storage medium | |
CN113486879A (en) | Image area suggestion frame detection method, device, equipment and storage medium | |
CN109165097B (en) | Data processing method and data processing device | |
CN111275183A (en) | Visual task processing method and device and electronic system | |
JP7369247B2 (en) | Information processing device, information processing method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||