CN110825808A - Distributed human face database system based on edge calculation and generation method thereof - Google Patents


Info

Publication number
CN110825808A
CN110825808A
Authority
CN
China
Prior art keywords
face
face image
database
index
distributed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910901100.4A
Other languages
Chinese (zh)
Inventor
余恒兵
李杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Terminus Technology Co Ltd
Original Assignee
Chongqing Terminus Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Terminus Technology Co Ltd filed Critical Chongqing Terminus Technology Co Ltd
Priority to CN201910901100.4A
Publication of CN110825808A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval of still image data
    • G06F 16/51 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval of structured data, e.g. relational data
    • G06F 16/27 Replication, distribution or synchronisation of data between databases or within a distributed database system; Distributed database system architectures therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/172 Classification, e.g. identification

Abstract

An embodiment of the application provides a distributed face database system based on edge computing, comprising terminal devices, a distributed face database connected to the terminal devices, a database management server connected to the distributed face database, and a data interface server connected to both the distributed face database and the database management server. The system establishes a more comprehensive face database and improves the accuracy of face recognition algorithms. By using the terminal devices as edge computing nodes, a large share of the analysis computation is placed at the edge, which speeds up extraction of data from face images; and by classifying face image features into index tags, users can more conveniently retrieve and query face images by tag.

Description

Distributed face database system based on edge computing and generation method thereof
Technical Field
The application relates to the technical field of face recognition, and in particular to a distributed face database system based on edge computing and a method for generating it.
Background
A face database contains a large number of face images; the same person's identity is often represented by multiple images in the database, which vary in ambient lighting, expression, pose, ethnicity, skin tone, accessories (such as whether glasses are worn), and so on. The database associates an index tag with each face image, recording index information such as the image's identity, sex, age, expression, pose, ethnicity, skin tone, and accessories. Face images can then be queried from the database by index information: for example, the multiple images corresponding to one person's identity, or images of different people sharing the same pose type (such as frontal or profile) or expression (such as smiling or angry). A face database can be used to test the accuracy of a face recognition algorithm, and neural-network-based face recognition algorithms can be trained on it.
At present, most face databases widely used in the industry recruit a small number of volunteers (roughly tens to hundreds of people) and capture their faces under various lighting conditions, expressions, poses, and accessories. The number of face images in such a database is therefore limited, the volunteers' faces are not representative enough, and a face recognition algorithm tested or trained on such a database may fall short of the expected performance in practice, or even make obvious errors.
How to build a more comprehensive face database and reduce the error rate of face recognition algorithms is therefore an urgent problem for those skilled in the art.
Disclosure of Invention
In view of this, an object of the present application is to provide a distributed face database system based on edge computing and a method for generating it, so as to solve the prior-art problem that, because a face database holds a limited set of face images, face recognition algorithms tested or trained on it make obvious errors and cannot recognize faces quickly and accurately.
In view of the above, a first aspect of the present application provides a distributed face database system based on edge computing, comprising:
terminal devices, a distributed face database, a database management server, and a data interface server;
the terminal devices are used to collect face images and extract index tags from them;
the distributed face database is connected to the terminal devices and stores the face images and index tags they upload;
the database management server is connected to the distributed face database and aggregates its index tags into a total index table;
and the data interface server is connected to the distributed face database and the database management server, and, as scheduled by the database management server, packs the face images and index tags in the distributed face database that satisfy a query condition into a face image packet.
In some embodiments, the terminal device includes:
an acquisition guidance device, a camera, an analysis module, and a tag classification module;
the acquisition guidance device is connected to the camera and prompts the user to make various expressions and poses;
the camera captures the user performing the prompted expressions and poses and generates a face image;
the analysis module extracts features from the face image and feeds the extracted features into the tag classification module;
and the tag classification module generates the corresponding index tags from the features of the face image.
In some embodiments, the analysis module comprises:
an ambient-light feature analysis submodule, an expression feature analysis submodule, a pose feature analysis submodule, an ethnicity/skin-tone feature analysis submodule, and an accessory feature analysis submodule;
the ambient-light feature analysis submodule extracts the ambient light intensity under which the face image was collected;
the expression feature analysis submodule extracts morphological features of key facial organs in the face region;
the pose feature analysis submodule extracts frontal and profile pose features of the face region;
and the ethnicity/skin-tone feature analysis submodule extracts an average chroma value of the face region.
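The two numeric cues named above (ambient light intensity from pixel brightness, and average chroma of the face region) can be sketched minimally as follows. This is an illustrative assumption, not the patent's implementation: the function names, the tiny sample region, and the Rec. 601 luma weights are the editor's choices.

```python
# Minimal sketch of two analysis-submodule features (illustrative, not from the patent).

def mean_luminance(pixels):
    """Average luma of a list of (R, G, B) tuples, Rec. 601 weights, range 0-255."""
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels)
    return total / len(pixels)

def mean_chroma(pixels):
    """Average chroma (max channel minus min channel) of a list of (R, G, B) tuples."""
    total = sum(max(p) - min(p) for p in pixels)
    return total / len(pixels)

# A hypothetical 3-pixel "face region" stands in for a real cropped image.
face_region = [(200, 160, 140), (190, 150, 130), (210, 170, 150)]
print(round(mean_luminance(face_region), 2))  # 169.68
print(round(mean_chroma(face_region), 1))     # 60.0
```

The mean luminance would feed the ambient-light tag and the mean chroma the skin-tone tag; a real system would of course compute these over a detected face crop rather than hand-written tuples.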
In some embodiments, the acquisition guidance device is a display screen used to display prompt symbols for the expressions and poses.
In some embodiments, the terminal device is further configured to accept the user's identity information and transmit it to the distributed face database as an index tag.
In some embodiments, the distributed face database consists of a plurality of sub-databases; each sub-database is associated with several terminal devices, stores the face images and index tags they upload, and uploads the index tags to the database management server.
In some embodiments, each sub-database is deployed on an independent server, with a one-to-one correspondence between sub-databases and servers.
In view of the foregoing, a second aspect of the present application provides a method for generating a distributed face database system based on edge computing, the method comprising:
a terminal device collects a face image, analyzes it to extract features, and generates the corresponding index tags;
the index tags and the face image are uploaded to a distributed face database;
the distributed face database stores the face image and the index tags, and provides the index tags to a database management server;
the database management server aggregates the index tags into a total index table;
a data interface server extracts face images from the distributed face database as scheduled by the database management server, and packs the face images with their index tags into a face image packet;
and a face recognition algorithm is tested and trained with the face image packet.
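The data flow through these steps can be sketched minimally as follows, under stated assumptions: every name (`extract_tags`, `SubDatabase`, `ManagementServer`) is hypothetical, and "images" are plain dictionaries rather than camera frames.

```python
# Hypothetical sketch of the generation method: edge devices tag and upload
# face images to sub-databases, and a management server aggregates the tags.

def extract_tags(image):
    # Stand-in for the edge-side analysis + tag classification modules.
    return {"expression": image["expression"], "pose": image["pose"]}

class SubDatabase:
    def __init__(self):
        self.records = []                     # (image, tags) pairs stored locally
    def store(self, image, tags):
        self.records.append((image, tags))

class ManagementServer:
    def __init__(self, sub_dbs):
        self.sub_dbs = sub_dbs
    def total_index(self):
        # Aggregate every sub-database's index tags into one total index table,
        # recording where each record lives.
        table = []
        for db_id, db in enumerate(self.sub_dbs):
            for rec_id, (_, tags) in enumerate(db.records):
                table.append({"db": db_id, "rec": rec_id, **tags})
        return table

# Two edge devices collect, tag, and upload to their sub-databases.
dbs = [SubDatabase(), SubDatabase()]
img_a = {"expression": "smiling face", "pose": "front"}
img_b = {"expression": "crying face", "pose": "side"}
dbs[0].store(img_a, extract_tags(img_a))
dbs[1].store(img_b, extract_tags(img_b))

index = ManagementServer(dbs).total_index()
print(len(index))  # 2 entries in the total index table
```

The total index table here keeps only tags plus a record locator, mirroring the patent's split between bulky image storage (in the sub-databases) and lightweight index aggregation (on the management server).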
In some embodiments, the terminal device collecting a face image and analyzing it to extract features includes:
the user performs the corresponding actions according to the expression and pose prompts shown by the acquisition guidance device;
the camera captures the user's actions and collects a face image;
the face image is analyzed by the ambient-light, expression, pose, ethnicity/skin-tone, and accessory feature analysis submodules, which respectively extract the ambient-light, expression, pose, ethnicity, skin-tone, and accessory features of the face image;
and the extracted features of the face image are fed into a tag classification module to generate the corresponding index tags.
In some embodiments, feeding the extracted features of the face image into the tag classification module to generate the corresponding index tags includes:
the ambient-light feature analysis submodule feeds the features it extracts from the face image into the tag classification module, which generates an ambient-light-intensity index tag;
the expression feature analysis submodule feeds its extracted features into the tag classification module, which generates a "smiling face" or "crying face" index tag;
the pose feature analysis submodule feeds its extracted features into the tag classification module, which generates a frontal or profile index tag;
and the ethnicity/skin-tone feature analysis submodule feeds its extracted features into the tag classification module, which generates an index tag for black, yellow, or white skin tone.
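A sketch of how the tag classification module might map submodule outputs to index tags; the thresholds and feature names below are illustrative assumptions, not values given by the patent.

```python
# Hypothetical tag classification: numeric submodule outputs -> index tags.
# All thresholds are editor-chosen assumptions for illustration.

def classify(features):
    tags = {}
    lum = features["mean_luminance"]          # from the ambient-light submodule
    if lum < 60:
        tags["light"] = "weak light"
    elif lum > 180:
        tags["light"] = "strong light"
    else:
        tags["light"] = "natural light"
    # Mouth-corner curvature as a crude expression cue (assumed feature).
    tags["expression"] = ("smiling face"
                          if features["mouth_curvature"] > 0 else "crying face")
    # Head yaw in degrees as the pose cue (assumed feature).
    tags["pose"] = "front" if abs(features["yaw_degrees"]) < 20 else "side"
    return tags

print(classify({"mean_luminance": 120, "mouth_curvature": 0.4, "yaw_degrees": 5}))
```

A production classifier would likely be learned rather than rule-based; the point of the sketch is only the shape of the mapping from per-submodule features to discrete index tags.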
An embodiment of the application provides a distributed face database system based on edge computing. The system establishes a more comprehensive face database and improves the accuracy of face recognition algorithms. Using the terminal devices as edge computing nodes places a large share of the analysis computation at the edge, which speeds up extraction of data from face images. Classifying face image features into index tags makes it convenient for users to retrieve and query face images by tag; a client that needs face images can query interactively through the data interface server to obtain a face image packet matching its query, which facilitates testing or training face recognition algorithms.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is a schematic structural diagram of a distributed face database system based on edge computing according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for generating a distributed face database system based on edge computing according to an embodiment of the present invention;
fig. 3 is a flowchart of step S201 according to an embodiment of the present invention.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Specifically, as shown in fig. 1, the distributed face database system based on edge computing of this embodiment includes:
a terminal device 1, a distributed face database 2, a database management server 3, and a data interface server 4.
The terminal device 1 collects face images and extracts their index tags. For each face image, the terminal device 1 can extract index tags along multiple dimensions: tags reflecting ambient light, such as "natural light", "strong light", and "weak light"; tags reflecting expression, such as "smiling face", "angry", and "crying face"; tags reflecting pose, such as "frontal" and "profile"; tags reflecting ethnicity or skin tone, such as "white", "black", and "yellow"; and tags reflecting accessories, such as "wearing glasses". The system adopts an edge computing architecture with the terminal device 1 as an edge node, which bears the computational load of collecting face images and analyzing them to extract index tags. The terminal device 1 may be any hardware with image acquisition and processing capability, including but not limited to cameras, smartphones, and tablets. Terminal devices 1 are deployed at locations such as community property offices, public security departments that issue certificates, office building entrances, and elevators; the collected face images serve on the one hand as entries in the face database, and on the other hand for community property services, access control, real-name authentication for certificate handling, and similar purposes.
The terminal device 1 is further configured to accept the user's identity information and transmit it to the distributed face database 2 as an index tag.
The distributed face database 2 is connected to the terminal device 1 and stores the face images and index tags uploaded by the terminal device 1. The distributed face database 2 may be set up across a number of different storage servers.
The database management server 3 is connected to the distributed face database 2 and aggregates the index tags in the distributed face database 2 into a total index table, which records, for each face image, its storage server and its index tags of every type.
The data interface server 4 is connected to the distributed face database 2 and the database management server 3, and, as scheduled by the database management server 3, packs the face images and index tags in the distributed face database 2 that satisfy a query condition into a face image packet, which is then output externally through the data interface server 4. The generated face image packet can serve as sample data for testing or training face recognition algorithms. For example, when a user needs a certain number of face images meeting given requirements to test or train a face recognition algorithm, the user sends a call request to the data interface server 4; the call request contains the query conditions for the face images and may also specify, for example, how many images are required. The data interface server 4 forwards the query conditions to the database management server 3, which retrieves the index tags recorded in the total index table to identify the face images in the distributed face database 2 that satisfy the conditions, together with the storage servers holding them. The database management server 3 then issues scheduling instructions to those storage servers, each of which supplies its matching face images and index tags to the data interface server 4. Finally, the data interface server 4 packs the face images and index tags into a face image packet and delivers it to the user.
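The query path just described (call request, total index table lookup, scheduling, packing) can be sketched as follows; the table layout and all names are illustrative assumptions, with the total index table modeled as a list of dictionaries.

```python
# Hypothetical server-side query path: conditions are matched against the
# total index table, then matching records are "fetched" and packed.

total_index = [
    {"server": "store-a", "image_id": 1, "expression": "smiling face", "accessory": "glasses"},
    {"server": "store-a", "image_id": 2, "expression": "crying face",  "accessory": "none"},
    {"server": "store-b", "image_id": 3, "expression": "smiling face", "accessory": "glasses"},
]

def query(conditions, limit=None):
    """Rows of the total index table matching every (tag, value) condition."""
    hits = [row for row in total_index
            if all(row.get(k) == v for k, v in conditions.items())]
    return hits[:limit] if limit is not None else hits

def pack(hits):
    # Stand-in for scheduling each storage server to supply its images and
    # bundling them with their index tags into a "face image packet".
    return {"count": len(hits),
            "items": [(h["server"], h["image_id"]) for h in hits]}

packet = pack(query({"expression": "smiling face", "accessory": "glasses"}))
print(packet["count"])  # 2 matches, drawn from two different storage servers
```

Note that only the index table is consulted to decide *where* each image lives; the images themselves would be pulled from the named storage servers at packing time.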
This embodiment establishes a comprehensive face database and improves the accuracy of face recognition algorithms. Using the terminal devices as edge computing nodes places a large share of the analysis computation at the edge, which speeds up extraction of data from face images. Classifying face image features into index tags makes it convenient for users to retrieve and query face images by tag, and a client that needs face images can query interactively through the data interface server to obtain a face image packet matching its query, which facilitates testing or training face recognition algorithms.
In one embodiment, the terminal device 1 includes: an acquisition guidance device 5, a camera 6, an analysis module 7, and a tag classification module 8.
The acquisition guidance device 5 is connected to the camera 6 and prompts the user to make various expressions and poses.
Specifically, the acquisition guidance device 5 is a display screen that shows prompt symbols for the expressions and poses, prompting the user to make expressions and actions such as smiling or frowning, and poses such as facing forward or turning to the side.
The camera 6 captures the user performing the prompted expressions and poses and generates a face image.
The analysis module 7 extracts features from the face image and feeds the extracted features into the tag classification module 8.
The tag classification module 8 generates the corresponding index tags from the features of the face image.
In one embodiment, the analysis module 7 comprises:
an ambient-light feature analysis submodule 9, an expression feature analysis submodule 10, a pose feature analysis submodule 11, an ethnicity/skin-tone feature analysis submodule 12, and an accessory feature analysis submodule 13.
The ambient-light feature analysis submodule 9 extracts the ambient light intensity under which the face image was collected; specifically, it may compute the average brightness of the face image's pixels and use that as the basis for judging the ambient light intensity.
The expression feature analysis submodule 10 extracts morphological features of key facial organs in the face region.
The pose feature analysis submodule 11 extracts frontal and profile pose features of the face region.
The ethnicity/skin-tone feature analysis submodule 12 extracts an average chroma value of the face region.
Further, the ambient-light feature analysis submodule 9 feeds the features it extracts from the face image into the tag classification module 8, which generates an ambient-light-intensity index tag;
the expression feature analysis submodule 10 feeds its extracted features into the tag classification module 8, which generates a "smiling face" or "crying face" index tag;
the pose feature analysis submodule 11 feeds its extracted features into the tag classification module 8, which generates a frontal or profile index tag;
and the ethnicity/skin-tone feature analysis submodule 12 feeds its extracted features into the tag classification module 8, which generates an index tag for black, yellow, or white skin tone.
In one embodiment, the distributed face database 2 consists of a plurality of sub-databases 14.
Each sub-database 14 is associated with several terminal devices 1, stores the face images and index tags they upload, and uploads the index tags to the database management server 3.
Specifically, the sub-databases 14 are deployed on independent servers, with a one-to-one correspondence between sub-databases 14 and servers.
For example, a public security department may deploy one sub-database on its server at site A and another on its server at site B, the servers at different sites being independent of each other.
As shown in fig. 2, the method for generating a distributed face database system based on edge computing in this embodiment may include the following steps:
S201, a terminal device collects a face image, analyzes it to extract features, and generates the corresponding index tags.
The collected face image serves on the one hand as an entry in the face database, and on the other hand for community property services, access control, real-name authentication for certificate handling, and similar purposes.
The terminal device can extract index tags along multiple dimensions for each face image. The system adopts an edge computing architecture with the terminal device as an edge node, which bears the computational load of collecting face images and analyzing them to extract index tags.
S202, the index tags and the face image are uploaded to a distributed face database.
S203, the distributed face database stores the face image and the index tags, and provides the index tags to the database management server.
S204, the database management server aggregates the index tags into a total index table, which records, for each face image, its storage server and its index tags of every type.
S205, the data interface server extracts face images from the distributed face database as scheduled by the database management server, and packs the face images with their index tags into a face image packet. Specifically, when a user needs a certain number of face images meeting given requirements to test or train a face recognition algorithm, the user sends a call request to the data interface server; the call request contains the query conditions for the face images and may also specify how many images are required. The data interface server forwards the query conditions to the database management server, which retrieves the index tags recorded in the total index table to identify the face images in the distributed face database that satisfy the conditions, together with the storage servers holding them. The database management server then issues scheduling instructions to those storage servers, each of which supplies its matching face images and index tags to the data interface server, and the data interface server packs them into a face image packet.
For example, to extract 10000 images of smiling faces wearing glasses for testing or training, the "smiling face" index tag is looked up in the total index table on the database management server; the data interface server extracts the corresponding smiling-face images from the distributed face database (for example, from 20 sub-databases) and packs the face images and index tags into a face image packet.
S206, a face recognition algorithm is tested and trained with the face image packet.
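One possible way (an assumption, not specified by the patent) to materialize the face image packet of step S205 is a ZIP archive pairing each face image with a JSON file of its index tags:

```python
# Hypothetical "face image packet" format: one ZIP entry per image plus a
# sidecar JSON of index tags. The layout is an editor's assumption.
import io
import json
import zipfile

def make_packet(records):
    """records: list of (name, image_bytes, tags_dict). Returns ZIP bytes."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for name, image_bytes, tags in records:
            zf.writestr(f"{name}.jpg", image_bytes)        # the face image
            zf.writestr(f"{name}.json", json.dumps(tags))  # its index tags
    return buf.getvalue()

packet = make_packet([
    ("face_0001", b"\xff\xd8fake-jpeg-bytes",
     {"expression": "smiling face", "accessory": "glasses"}),
])
print(len(packet) > 0)  # True: a non-empty packet was produced
```

A consumer testing or training a recognition algorithm can then iterate over the archive, reading each image together with its tags.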
In one embodiment, as shown in fig. 3, step S201 includes:
S2011, the user performs the corresponding actions according to the expression and pose prompts shown by the acquisition guidance device;
S2012, the camera captures the user's actions and collects a face image;
S2013, the face image is analyzed by the ambient-light, expression, pose, ethnicity/skin-tone, and accessory feature analysis submodules, which respectively extract the ambient-light, expression, pose, ethnicity, skin-tone, and accessory features of the face image;
S2014, the extracted features of the face image are fed into the tag classification module, which generates the corresponding index tags;
for example, the expression feature analysis submodule feeds the features it extracts from the face image into the tag classification module, which generates a "smiling face" or "crying face" index tag.
In one embodiment, step S2014 includes:
the ambient-light feature analysis submodule feeds the features it extracts from the face image into the tag classification module, which generates an ambient-light-intensity index tag;
the expression feature analysis submodule feeds its extracted features into the tag classification module, which generates a "smiling face" or "crying face" index tag;
the pose feature analysis submodule feeds its extracted features into the tag classification module, which generates a frontal or profile index tag;
and the ethnicity/skin-tone feature analysis submodule feeds its extracted features into the tag classification module, which generates an index tag for black, yellow, or white skin tone.
This method of using the edge-computing-based distributed face database system improves the accuracy of face recognition algorithms, extracts face data quickly, reduces the error rate that an incomplete face database would otherwise cause in a face recognition algorithm, and makes retrieval and querying of face images faster and more convenient.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. A distributed face database system based on edge computing, comprising: terminal devices, a distributed face database, a database management server, and a data interface server;
the terminal devices are used to collect face images and extract index tags from the face images;
the distributed face database is connected to the terminal devices and stores the face images and index tags uploaded by the terminal devices;
the database management server is connected to the distributed face database and aggregates the index tags in the distributed face database into a total index table;
and the data interface server is connected to the distributed face database and the database management server, and, as scheduled by the database management server, packs the face images and index tags in the distributed face database that satisfy a query condition into a face image packet.
2. The system of claim 1, wherein the terminal device comprises: a collection guide device, a camera, an analysis module, and a tag classification module;
the collection guide device is connected to the camera and configured to prompt a user to make various expressions and poses;
the camera captures the expressions and poses the user makes as prompted by the collection guide device, and generates a face image;
the analysis module extracts features of the face image and inputs the extracted features into the tag classification module; and
the tag classification module generates a corresponding index tag according to the features of the face image.
3. The system of claim 2, wherein the analysis module comprises: an ambient illumination feature analysis submodule, an expression feature analysis submodule, a pose feature analysis submodule, a race and skin color feature analysis submodule, and an accessory feature analysis submodule;
the ambient illumination feature analysis submodule is configured to extract the ambient illumination intensity under which the face image was captured;
the expression feature analysis submodule is configured to extract morphological features of key facial organs in the face region;
the pose feature analysis submodule is configured to extract frontal and profile pose features of the face region; and
the race and skin color feature analysis submodule is configured to extract an average chroma value of the face region.
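As an illustration only, two of the claimed submodules could be computed from raw pixel data along the following lines. Both functions are assumptions of this sketch: the patent does not specify how illumination intensity or average chroma are calculated.

```python
def ambient_illumination_intensity(gray_pixels):
    """Mean gray level over the image, a simple proxy for the ambient
    illumination intensity under which the face image was captured."""
    return sum(gray_pixels) / len(gray_pixels)


def average_chroma(rgb_pixels):
    """Per-channel mean over the face region, a crude stand-in for the
    'average chroma value' used by the race/skin-color submodule."""
    n = len(rgb_pixels)
    return tuple(sum(p[c] for p in rgb_pixels) / n for c in range(3))
```

A production system would more likely compute these over a detected face bounding box in a perceptual color space, but the claim language constrains only the *outputs* (an illumination intensity and an average chroma value), not the method.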
4. The system of claim 2, wherein the collection guide device is a display screen, and the display screen is configured to display prompt symbols for the expressions and poses.
5. The system of claim 1, wherein the terminal device is further configured to: receive input of the user's identity information, and transmit the identity information to the distributed face database as an index tag.
6. The system of claim 1, wherein the distributed face database consists of a plurality of sub-databases;
each sub-database is associated with a plurality of terminal devices, and is configured to store the face images and index tags uploaded by those terminal devices and to upload the index tags to the database management server.
7. The system of claim 6, wherein the sub-databases are deployed on independent servers, with a one-to-one correspondence between sub-databases and servers.
8. A method for generating a distributed face database system based on edge computing, comprising the following steps:
a terminal device collects a face image, analyzes and extracts features of the face image, and generates a corresponding index tag;
the terminal device uploads the index tag and the face image to a distributed face database;
the distributed face database stores the face image and the index tag, and provides the index tag to a database management server;
the database management server aggregates the index tags to generate a total index table;
a data interface server extracts face images from the distributed face database as scheduled by the database management server, and packs the face images and their index tags to generate a face image packet; and
the face image packet is used to test and train a face recognition algorithm.
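The claimed generation method reads as a five-step pipeline: collect, tag, store, index, pack. A minimal end-to-end sketch follows; the function signature and the dict-based representations of terminals and servers are hypothetical conveniences, not structures from the patent.

```python
def generate_face_database(terminals, face_db, mgmt_server, interface_server, query):
    """End-to-end sketch of claim 8: collect -> tag -> store -> index -> pack."""
    # Steps 1-2: each terminal collects an image, tags it, and uploads both.
    for terminal in terminals:
        image = terminal["collect"]()        # capture a face image
        tags = terminal["classify"](image)   # extract features -> index tags
        face_db.append({"image": image, "tags": tags})
    # Steps 3-4: the management server aggregates tags into a total index table.
    total_index = {}
    for i, rec in enumerate(face_db):
        for kv in rec["tags"].items():
            total_index.setdefault(kv, []).append(i)
    mgmt_server["total_index"] = total_index
    # Step 5: the interface server packs matching records into a face image packet.
    packet = [rec for rec in face_db
              if all(rec["tags"].get(k) == v for k, v in query.items())]
    interface_server["last_packet"] = packet
    return packet
```

The returned packet is what the final step of the claim would hand to a face recognition algorithm for testing and training.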
9. The method of claim 8, wherein the terminal device collecting a face image, analyzing and extracting features of the face image, and generating a corresponding index tag comprises:
the user performs the corresponding actions according to the expression and pose prompts displayed by the collection guide device;
the camera captures the user's actions and collects a face image;
the face image is analyzed by an ambient illumination feature analysis submodule, an expression feature analysis submodule, a pose feature analysis submodule, a race and skin color feature analysis submodule, and an accessory feature analysis submodule, which respectively extract the ambient illumination, expression, pose, race and skin color, and accessory features of the face image; and
the corresponding features of the face image are input into a tag classification module to generate a corresponding index tag.
10. The method of claim 9, wherein inputting the corresponding features of the face image into a tag classification module to generate corresponding index tags comprises:
the ambient illumination feature analysis submodule inputs the features it extracted from the face image into the tag classification module, which generates an ambient illumination intensity index tag;
the expression feature analysis submodule inputs the features it extracted from the face image into the tag classification module, which generates a smiling-face or crying-face index tag;
the pose feature analysis submodule inputs the features it extracted from the face image into the tag classification module, which generates a frontal or profile index tag; and
the race and skin color feature analysis submodule inputs the features it extracted from the face image into the tag classification module, which generates a skin tone index tag.
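The tag classification module of claim 10 maps each submodule's continuous features to a discrete index tag. A rule-based sketch is below; every threshold, feature name, and category boundary is an assumption of this illustration (the claim itself names the tag families but no decision rules), and the skin tone labels here are neutral placeholders for whatever categories a deployment would use.

```python
def classify_index_tags(features):
    """Map extracted features to the four tag families named in claim 10.
    All thresholds are illustrative, not taken from the patent."""
    tags = {}
    # Ambient illumination intensity, bucketed into coarse levels.
    lux = features["illumination"]
    tags["illumination"] = "bright" if lux >= 150 else ("normal" if lux >= 60 else "dim")
    # Expression: positive mouth curvature read as a smiling face.
    tags["expression"] = "smile" if features["mouth_curvature"] > 0 else "cry"
    # Pose: small yaw angle read as frontal, otherwise profile.
    tags["pose"] = "front" if abs(features["yaw_degrees"]) < 20 else "side"
    # Skin tone from an average brightness of the face region (crude placeholder rule).
    gray = features["avg_gray"]
    tags["skin_tone"] = "light" if gray >= 170 else ("medium" if gray >= 85 else "dark")
    return tags
```

The point of the sketch is the shape of the mapping (one discrete tag per submodule, combined into one index record per image); a real classifier would more plausibly be a trained model than fixed thresholds.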
CN201910901100.4A 2019-09-23 2019-09-23 Distributed human face database system based on edge calculation and generation method thereof Pending CN110825808A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910901100.4A CN110825808A (en) 2019-09-23 2019-09-23 Distributed human face database system based on edge calculation and generation method thereof


Publications (1)

Publication Number Publication Date
CN110825808A true CN110825808A (en) 2020-02-21

Family

ID=69548216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910901100.4A Pending CN110825808A (en) 2019-09-23 2019-09-23 Distributed human face database system based on edge calculation and generation method thereof

Country Status (1)

Country Link
CN (1) CN110825808A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112308031A (en) * 2020-11-25 2021-02-02 浙江大华系统工程有限公司 Universal face recognition and face feature information base generation method, device and equipment
CN112446294A (en) * 2020-10-30 2021-03-05 四川天翼网络服务有限公司 Distributed face data scheduling method, system, terminal and storage medium
CN113221728A (en) * 2021-05-10 2021-08-06 湖南科技大学 Learning environment and state monitoring method and device based on machine vision

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101706872A (en) * 2009-11-26 2010-05-12 上海交通大学 Universal open type face identification system
CN103793697A (en) * 2014-02-17 2014-05-14 北京旷视科技有限公司 Identity labeling method of face images and face identity recognition method of face images
CN105138977A (en) * 2015-08-18 2015-12-09 成都鼎智汇科技有限公司 Face identification method under big data environment
CN105427421A (en) * 2015-11-16 2016-03-23 苏州市公安局虎丘分局 Entrance guard control method based on face recognition
CN107679546A (en) * 2017-08-17 2018-02-09 平安科技(深圳)有限公司 Face image data acquisition method, device, terminal device and storage medium
CN108319938A (en) * 2017-12-31 2018-07-24 奥瞳系统科技有限公司 High quality training data preparation system for high-performance face identification system
CN109670393A (en) * 2018-09-26 2019-04-23 平安科技(深圳)有限公司 Human face data acquisition method, unit and computer readable storage medium
CN110188226A (en) * 2019-04-29 2019-08-30 苏宁易购集团股份有限公司 A kind of customer portrait generation method and device based on recognition of face



Similar Documents

Publication Publication Date Title
US20190005359A1 (en) Method and system for predicting personality traits, capabilities and suggested interactions from images of a person
Wong et al. Patch-based probabilistic image quality assessment for face selection and improved video-based face recognition
CN109740446A (en) Classroom students ' behavior analysis method and device
CN110532970B (en) Age and gender attribute analysis method, system, equipment and medium for 2D images of human faces
CN109948447B (en) Character network relation discovery and evolution presentation method based on video image recognition
CN110825808A (en) Distributed human face database system based on edge calculation and generation method thereof
CN110464366A (en) A kind of Emotion identification method, system and storage medium
CN105913507B (en) A kind of Work attendance method and system
CN110490238A (en) A kind of image processing method, device and storage medium
KR20140058409A (en) Systems and methods for image-to-text and text-to-image association
CN109829072A (en) Construct atlas calculation and relevant apparatus
CN107392151A (en) Face image various dimensions emotion judgement system and method based on neutral net
Chandran et al. Missing child identification system using deep learning and multiclass SVM
CN109558792B (en) Method and system for detecting internet logo content based on samples and features
CN111797756A (en) Video analysis method, device and medium based on artificial intelligence
CN110427881A (en) The micro- expression recognition method of integration across database and device based on the study of face local features
CN108171208A (en) Information acquisition method and device
CN111738199A (en) Image information verification method, image information verification device, image information verification computing device and medium
Galiyawala et al. Person retrieval in surveillance using textual query: a review
CN109345427B (en) Classroom video frequency point arrival method combining face recognition technology and pedestrian recognition technology
CN106503747A (en) A kind of image recognition statistical analysis system
CN113538720A (en) Embedded face recognition attendance checking method based on Haisi intelligent AI chip
CN110443122A (en) Information processing method and Related product
CN114399827B (en) College graduate career character testing method and system based on facial micro-expression
CN111191620B (en) Method for constructing human-object interaction detection data set

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200221