CN114937306A - Target tracking method and system based on face clustering - Google Patents
Target tracking method and system based on face clustering
- Publication number
- CN114937306A (application CN202210706747.3A)
- Authority
- CN
- China
- Prior art keywords
- face
- personnel
- clustering
- target tracking
- snapshot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/762—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
Abstract
The invention discloses a target tracking method and system based on face clustering, belonging to the technical field of face recognition. First, a personnel base library is created and personnel information is entered. Feature values are then extracted from the face pictures captured by the front-end face snapshot machine, and the differences between face feature values are calculated with a class-center feature-value distance metric; each captured face picture is compared with the pictures in the personnel library and processed accordingly. Finally, a feature value is extracted from a face picture uploaded to the system and compared with the cover pictures of the personnel files; the passage records of persons whose similarity reaches a set threshold are retrieved, and the target's motion trajectory is drawn dynamically. Aimed at the complex structure, dynamic change, massive volume, and diversity of city-scale face data, the method improves the parallelism of MapReduce task processing by designing a clustering-difference storage strategy, achieving fast data retrieval and improving both data-processing efficiency and the real-time display of results.
Description
Technical Field
The invention relates to the technical field of face recognition, and in particular to a target tracking method and system based on face clustering.
Background
In recent years, with the rapid informatization of many industries, data has been growing at an unprecedented rate and the big-data era has arrived. Characteristics of big data such as its massive volume and complexity require that its indexing mechanisms support diverse queries, efficient retrieval, and easy maintenance.
With the rapid development of video-surveillance software and hardware and of face snapshot technology, face data is growing rapidly, which poses challenges for its application. Face recognition extracts discriminative features from a face image according to a chosen strategy, that is, it maps data from the face space into a feature space. Because comparison time grows linearly with the amount of face data, recognition over a large-scale face library is difficult to make practical, and it is hard to efficiently build and link face files or to quickly retrieve person trajectories from such a library. How to perform cluster analysis on large-scale face data and extract valuable knowledge from it is a problem that urgently needs to be solved. The invention therefore provides a target tracking method and system based on face clustering.
Disclosure of Invention
The technical problem to be solved by the invention is how to perform cluster analysis on faces and achieve fast retrieval of person trajectories from a large-scale face library.
The invention solves this technical problem through the following technical scheme, which comprises the following steps:
s1: inputting face pictures and related personnel information, and establishing a personnel library, namely a base library;
s2: extracting feature values from the entered face pictures;
s3: establishing a face-picture snapshot library, and extracting feature values from the face pictures captured by the face snapshot machine;
s4: calculating the differences between face feature values with a class-center feature-value distance metric, and constructing an index tree by designing a clustering-difference storage strategy;
s5: filing face pictures whose feature-value similarity reaches the threshold, using the ID-card number as the file ID;
s6: filing face pictures whose feature-value similarity does not reach the threshold, with an ID created automatically by the system;
s7: establishing a personnel motion-trajectory analysis model, and analyzing personnel movement records within a selected time and area;
s8: comparing and analyzing the classified personnel files, and retrieving the results;
s9: dynamically drawing, on a map, the motion trajectory of the retrieved person's snapshot records in time order.
Further, in step S1, the personnel information includes a face picture, ID-card number, name, gender, and address.
Further, in steps S2 and S3, feature-value extraction means constructing a uniform feature-vector expression rule by analyzing the face-image description corresponding to each feature in the geometric feature vector provided by the face snapshot machine, and extracting the feature values of the face data with a convolutional neural network.
Further, in step S4, the class-center feature-value distance metric jointly considers the distinguishability and independence of the face feature vectors. The difference between face feature values is calculated by comparing the extracted face features with the existing class-center features: a search with short features is performed first to filter out dissimilar classes and narrow the search range, and a search with long features then determines the class to which the face belongs. After the face features are added to the class, the class-center features are recalculated. One class represents one person.
Further, in step S4, clustering is performed according to similarity. Within each cluster, based on the idea that points closer to the centroid are queried with higher probability, the cluster is layered into concentric circles centered on the centroid; each layer is expressed with a binary code of a different length, and an index tree built from these codes enables fast data retrieval.
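The concentric-circle layering can be sketched as follows. The ring width, the number of layers, and the exact coding scheme are illustrative assumptions; the patent only states that each layer receives a binary code of a different length.

```python
import numpy as np

def layer_code(feature, centroid, ring_width=0.5, max_layers=4):
    """Concentric-circle layering around a cluster centroid.

    Points closer to the centroid get shorter binary codes, matching the idea
    that they are queried with higher probability. ring_width and max_layers
    are illustrative assumptions, not values given in the patent.
    """
    dist = float(np.linalg.norm(np.asarray(feature) - np.asarray(centroid)))
    layer = min(int(dist // ring_width), max_layers - 1)
    # Variable-length binary code: layer k gets k + 1 bits ('1', '10', '100', ...).
    return "1" + "0" * layer

def build_index(features, centroid):
    """Index sketch: bucket the feature vectors of one cluster by layer code."""
    index = {}
    for i, f in enumerate(features):
        index.setdefault(layer_code(f, centroid), []).append(i)
    return index
```

A query can then probe the inner (shorter-code) buckets first and stop as soon as a match is found, which is the retrieval-order idea the layering encodes.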
Further, in step S7, the motion-trajectory analysis model is built with MapReduce on a Hadoop platform. After the clustering task completes, the classes and their class features are extracted, and each class feature is used as input for a 1:N comparison to obtain the top base-library person whose similarity reaches the threshold. The class is then associated with that person's identity ID, the person's snapshot records are retrieved, and the trajectory is drawn dynamically with the Canvas technique and displayed to the user.
The invention also discloses a target tracking system based on face clustering, which uses the above tracking method to implement target tracking based on face clustering and comprises:
a personnel library management module: used to import a single face image or a batch of face images from outside the system, with support for editing face-image information;
a snapshot library management module: used to extract feature values from imported images, with support for retrieval by time and area;
a face clustering module: used to dynamically cluster the face feature values with the class-center feature-value distance metric and to establish personnel files;
a personnel motion-trajectory analysis module: used to retrieve a person's snapshot records according to the selected time and area and an uploaded face picture, and to draw the motion trajectory on a map;
a control processing module: used to send instructions to the other modules to complete the related actions;
the personnel library management module, snapshot library management module, face clustering module, and personnel motion-trajectory analysis module are all connected to the control processing module.
Compared with the prior art, the invention has the following advantages: aimed at the complex structure, dynamic nature, massive volume, diversity, and unsortability of city-scale face data, the target tracking method based on face clustering improves the parallelism of MapReduce task processing by designing a clustering-difference storage strategy, achieves fast data retrieval, and greatly improves data-processing efficiency and the real-time display of results, so it is worth popularizing and using.
Drawings
FIG. 1 is a schematic flow chart of a target tracking method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of face feature value clustering according to an embodiment of the present invention;
FIG. 3 is a flowchart of the personnel motion-trajectory analysis model established in the embodiment of the present invention.
Detailed Description
The following examples give a detailed implementation and specific operating process of the present invention, but the protection scope of the present invention is not limited to these examples.
This embodiment provides a technical scheme: a target tracking method based on face clustering. First, a personnel library is created and relevant personnel information is entered, such as a face picture, ID-card number, name, gender, and address. Second, feature values are extracted from the face data by analyzing the face-image description corresponding to each feature in the geometric feature vector provided by the face snapshot machine. Next, the differences between face feature values are calculated with the class-center feature-value distance metric, and each face picture captured by the front-end snapshot machine is compared with the pictures in the personnel library: if the similarity reaches the set threshold, a personnel file with the ID-card number as its ID is established and the captured picture is filed into it; if not, the system automatically generates an ID and establishes a personnel file. Finally, a feature value is extracted from the face picture uploaded to the system and compared with the cover pictures of the personnel files; the passage records of persons whose similarity reaches the set threshold are retrieved, and the target's motion trajectory is drawn dynamically on a GIS map.
As shown in fig. 1, the target tracking method based on face clustering comprises:
1) inputting face pictures and related personnel information, and establishing a personnel library;
2) extracting feature values from the entered face pictures;
3) establishing a face-picture snapshot library, and extracting feature values from the face pictures captured by the face snapshot machine;
4) clustering the faces: calculating the differences between face feature values with the class-center feature-value distance metric, and constructing an index tree by designing a clustering-difference storage strategy to achieve fast data retrieval;
5) filing face pictures whose feature-value similarity reaches the threshold, using the ID-card number as the file ID;
6) filing face pictures whose feature-value similarity does not reach the threshold, with an ID created automatically by the system;
7) establishing a personnel motion-trajectory analysis model, and analyzing personnel movement records within a selected time and area;
8) comparing and analyzing the classified personnel files, and retrieving the results;
9) dynamically drawing, on a map, the retrieved personnel snapshot records in time order.
In this embodiment, feature-value extraction refers to constructing a uniform feature-vector expression rule by analyzing the face-image description corresponding to each feature in the geometric feature vector provided by the face snapshot machine, and extracting 128-dimensional feature values of the face data with a convolutional neural network.
In this embodiment, the class-center feature-value distance metric jointly considers the distinguishability and independence of face feature vectors. Specifically, the difference between face feature values is calculated by comparing the extracted face features with the existing class-center features: a coarse search with short features first filters out a large number of dissimilar classes and narrows the search range, and a fine search with long features then determines the class to which the features belong. After the features are added to the class, the class-center features are recalculated so that the class information is described more accurately. This effectively avoids interference with the clustering result from the descriptions among features and achieves efficient dynamic clustering of the face feature values.
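As a concrete illustration of this coarse-to-fine matching, the sketch below assigns an extracted face feature to a class by first filtering class centers with a short feature prefix and then comparing full-length features, updating the class center afterwards. The prefix length and the two similarity thresholds are illustrative assumptions; the patent does not specify concrete values.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity with a small epsilon to avoid division by zero.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def assign_to_class(feature, class_centers, short_dims=32,
                    coarse_thresh=0.3, match_thresh=0.8):
    """Coarse-to-fine class-center matching sketch.

    short_dims, coarse_thresh, and match_thresh are illustrative assumptions.
    Returns the index of the matched class, or -1 when no class is similar
    enough (in which case a new class would be created for the face).
    """
    # Coarse pass: compare only a short prefix of the feature vector to
    # filter out dissimilar classes and narrow the search range.
    candidates = [i for i, c in enumerate(class_centers)
                  if cosine(feature[:short_dims], c[:short_dims]) >= coarse_thresh]
    # Fine pass: full-length comparison over the surviving candidates.
    best, best_sim = -1, match_thresh
    for i in candidates:
        sim = cosine(feature, class_centers[i])
        if sim >= best_sim:
            best, best_sim = i, sim
    return best

def update_center(center, count, feature):
    """Recalculate the class-center feature after a face joins the class."""
    return (center * count + feature) / (count + 1)
```

Since one class represents one person, `assign_to_class` returning -1 corresponds to opening a new personnel file, and `update_center` keeps the running mean of each class after every absorbed face.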
In this embodiment, fast data retrieval is achieved by designing a clustering-difference storage strategy in which one class represents one person, that is, faces are clustered automatically according to similarity. Completing the clustering of the massive data greatly reduces the number of comparisons during retrieval and improves the parallelism of MapReduce task processing.
In this embodiment, the personnel motion-trajectory analysis model is built with MapReduce on a Hadoop platform. After the clustering task completes, the classes and their class features are extracted and used as input for a 1:N comparison to obtain the TOP-1 base-library person whose similarity reaches the threshold. The class is then associated with that person's unique identifier (the identity ID), the person's snapshot records are retrieved, and the trajectory is drawn dynamically with the Canvas technique and displayed to the user.
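The 1:N comparison can be sketched as a map/reduce pair in plain Python (the patent runs this as MapReduce jobs on Hadoop; the cosine similarity and the threshold value here are illustrative assumptions):

```python
import numpy as np

def map_compare(class_feature, base_library, threshold=0.8):
    """Map-step sketch: 1:N comparison of one class feature against the base
    library.

    base_library maps person_id -> feature vector. The threshold value is an
    illustrative assumption. Emits (person_id, similarity) pairs whose
    similarity reaches the threshold.
    """
    pairs = []
    for pid, feat in base_library.items():
        sim = float(class_feature @ feat /
                    (np.linalg.norm(class_feature) * np.linalg.norm(feat) + 1e-12))
        if sim >= threshold:
            pairs.append((pid, sim))
    return pairs

def reduce_top1(pairs):
    """Reduce-step sketch: keep the TOP-1 base-library person for the class."""
    return max(pairs, key=lambda p: p[1])[0] if pairs else None
```

The resulting person ID is what the class gets associated with before the person's snapshot records are retrieved and drawn as a trajectory.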
As shown in fig. 2, the class-center feature-value distance metric is used to calculate the differences between face feature values and achieve their efficient dynamic clustering. The specific steps are:
1) extracting the feature values of the face base library;
2) extracting feature values from the face pictures provided by the connected face snapshot machine;
3) calculating, with the class-center feature-value distance metric, the feature-value difference between a face picture provided by the face snapshot machine and the face base-library pictures;
4) filing the face picture according to the comparison result: if the set threshold is reached, a personnel file is established with the person's ID-card number as the ID; if not, a personnel file with an automatically generated ID is created.
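The threshold decision in step 4 above can be sketched as follows; the threshold value and the format of the auto-generated ID are illustrative assumptions, not details from the patent:

```python
import uuid

def archive_face(similarity, matched_id_card, threshold=0.8):
    """Archiving sketch: when the comparison reaches the set threshold, the
    face is filed under the matched person's ID-card number; otherwise a
    person ID is generated automatically (here as an 'auto-' prefixed UUID,
    an illustrative choice)."""
    if matched_id_card is not None and similarity >= threshold:
        return matched_id_card
    return "auto-" + uuid.uuid4().hex
```

The returned value is the file ID under which the snapshot is stored, so later snapshots of the same person accumulate in one personnel file.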
As shown in fig. 3, the invention further provides a target tracking system based on face clustering for running the above target tracking method, comprising:
(1) a personnel library management module: imports a single face image or a batch of face images from outside the system, and supports operations such as editing face-image information;
(2) a snapshot library management module: extracts feature values from imported images, and supports operations such as retrieval by conditions including time and area;
(3) a face clustering module: uses the class-center feature-value distance metric to achieve efficient dynamic clustering of face feature values and to establish personnel files;
(4) a personnel motion-trajectory analysis module: retrieves a person's snapshot records according to the selected time and area and an uploaded face picture, and draws the motion trajectory on a map.
To sum up, aimed at the complex structure, dynamic nature, massive volume, diversity, and unsortability of city-scale face data, the target tracking method based on face clustering of this embodiment improves the parallelism of MapReduce task processing and achieves fast data retrieval by designing a clustering-difference storage strategy, greatly improving data-processing efficiency and the real-time display of results, so it is worth popularizing and applying.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (7)
1. A target tracking method based on face clustering is characterized by comprising the following steps:
s1: inputting face pictures and related personnel information, and establishing a personnel library, namely a base library;
s2: extracting feature values from the entered face pictures;
s3: establishing a face-picture snapshot library, and extracting feature values from the face pictures captured by the face snapshot machine;
s4: calculating the differences between face feature values with a class-center feature-value distance metric, and constructing an index tree by designing a clustering-difference storage strategy;
s5: filing face pictures whose feature-value similarity reaches the threshold, using the ID-card number as the file ID;
s6: filing face pictures whose feature-value similarity does not reach the threshold, with an ID created automatically by the system;
s7: establishing a personnel motion-trajectory analysis model, and analyzing personnel movement records within a selected time and area;
s8: comparing and analyzing the classified personnel files, and retrieving the results;
s9: dynamically drawing, on a map, the motion trajectory of the retrieved person's snapshot records in time order.
2. The target tracking method based on face clustering according to claim 1, characterized in that: in step S1, the personnel information includes a face picture, ID-card number, name, gender, and address.
3. The target tracking method based on face clustering according to claim 2, characterized in that: in steps S2 and S3, feature-value extraction refers to constructing a uniform feature-vector expression rule by analyzing the face-image description corresponding to each feature in the geometric feature vector provided by the face snapshot machine, and extracting the feature values of the face data with a convolutional neural network.
4. The target tracking method based on face clustering according to claim 3, characterized in that: in step S4, the class-center feature-value distance metric jointly considers the distinguishability and independence of the face feature vectors; the difference between face feature values is calculated by comparing the extracted face features with the existing class-center features: a search with short features is performed first to filter out dissimilar classes and narrow the search range, a search with long features then determines the class to which the face features belong, and the class-center features are recalculated after the face features are added to the class, wherein one class represents one person.
5. The target tracking method based on face clustering according to claim 4, characterized in that: in step S4, clustering is performed according to similarity; within each cluster, based on the idea that points closer to the centroid are queried with higher probability, the cluster is layered into concentric circles centered on the centroid, each layer is expressed with a binary code of a different length, and an index tree is constructed from these codes to achieve data retrieval.
6. The target tracking method based on face clustering according to claim 5, characterized in that: in step S7, the motion-trajectory analysis model is created with MapReduce on a Hadoop platform; after the clustering task completes, the classes and their class features are extracted and used as input for a 1:N comparison to obtain the top base-library person whose similarity reaches the threshold, the class is associated with that person's identity ID, the person's snapshot records are retrieved, and the trajectory is drawn dynamically with the Canvas technique and displayed to the user.
7. A target tracking system based on face clustering, characterized in that the tracking method according to any one of claims 1 to 6 is used to implement target tracking based on face clustering, the system comprising:
a personnel library management module: used to import a single face image or a batch of face images from outside the system, with support for editing face-image information;
a snapshot library management module: used to extract feature values from imported images, with support for retrieval by time and area;
a face clustering module: used to dynamically cluster the face feature values with the class-center feature-value distance metric and to establish personnel files;
a personnel motion-trajectory analysis module: used to retrieve a person's snapshot records according to the selected time and area and an uploaded face picture, and to draw the motion trajectory on a map;
a control processing module: used to send instructions to the other modules to complete the related actions;
wherein the personnel library management module, the snapshot library management module, the face clustering module, and the personnel motion-trajectory analysis module are all connected to the control processing module.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210706747.3A (published as CN114937306A) | 2022-06-21 | 2022-06-21 | Target tracking method and system based on face clustering |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210706747.3A (published as CN114937306A) | 2022-06-21 | 2022-06-21 | Target tracking method and system based on face clustering |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN114937306A (en) | 2022-08-23 |
Family
ID=82868030
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202210706747.3A (CN114937306A, pending) | Target tracking method and system based on face clustering | 2022-06-21 | 2022-06-21 |
Country Status (1)
| Country | Link |
|---|---|
| CN | CN114937306A (en) |
- 2022-06-21: application CN202210706747.3A filed in China; published as CN114937306A, status pending
Cited By (7)
| Publication number | Priority date | Publication date | Title |
|---|---|---|---|
| CN115880745A (*) | 2022-09-07 | 2023-03-31 | Data processing system for acquiring human face image characteristics |
| CN115966313A (*) | 2023-03-09 | 2023-04-14 | Integrated management platform based on face recognition |
| CN115966313B (*) | 2023-03-09 | 2023-06-09 | Integrated management platform based on face recognition |
| CN117453942A (*) | 2023-12-21 | 2024-01-26 | File aggregation method, device, computer equipment and medium for driving path |
| CN117453942B (*) | 2023-12-21 | 2024-03-19 | File aggregation method, device, computer equipment and medium for driving path |
| CN117909440A (*) | 2024-03-12 | 2024-04-19 | Intelligent archive index and retrieval system |
| CN117909440B (*) | 2024-03-12 | 2024-06-04 | Intelligent archive index and retrieval system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |