CN113409507B - Control method based on face recognition - Google Patents

Control method based on face recognition

Info

Publication number
CN113409507B
CN113409507B (application CN202110662630.5A)
Authority
CN
China
Prior art keywords
module
emotion
person
unit
face
Prior art date
Legal status
Active
Application number
CN202110662630.5A
Other languages
Chinese (zh)
Other versions
CN113409507A (en)
Inventor
姚成国
Current Assignee
Shenzhen Newabel Electronic Co ltd
Original Assignee
Shenzhen Newabel Electronic Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Newabel Electronic Co ltd
Priority to CN202110662630.5A
Publication of CN113409507A
Application granted
Publication of CN113409507B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 - Individual registration on entry or exit
    • G07C9/30 - Individual registration on entry or exit not involving the use of a pass
    • G07C9/32 - Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/37 - Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 - Individual registration on entry or exit
    • G07C9/30 - Individual registration on entry or exit not involving the use of a pass
    • G07C9/38 - Individual registration on entry or exit not involving the use of a pass with central registration
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 - Speech recognition
    • G10L15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/414 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 - Monitoring of end-user related data
    • H04N21/44218 - Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program

Abstract

The invention discloses a control method based on face recognition, which comprises the following steps: step one, face acquisition; step two, safety early warning; step three, network message collection; step four, interactive comfort. The invention relates to the technical field of entrance guard recognition. In this control method, the facial expressions of residents are recognized, and when a negative emotion is detected, the atmosphere adjusting unit consoles the person by playing short videos and reciting short texts while monitoring the person's emotional change in real time. If the emotion worsens, the consolation mode is changed promptly, so that the person's emotion is continuously and tentatively relieved and different consolation modes are generated flexibly for different people. This effectively raises the probability that the person's mood improves and helps ensure that residents enter the community in a positive state of mind; at the same time, the consolation type that works for each person is saved at the community gate, providing convenient conditions for building a harmonious community.

Description

Control method based on face recognition
Technical Field
The invention relates to the technical field of entrance guard recognition, and in particular to a control method based on face recognition.
Background
Face recognition is a biometric technique that identifies a person based on facial feature information. It uses a camera or video camera to collect images or video streams containing faces, automatically detects and tracks the faces in the images, and then performs a series of related application operations on the detected face images. The technology covers image acquisition, feature positioning, identity confirmation, searching and the like.
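For illustration only, the detection step described above can be sketched with a generic open-source face detector; the sketch below assumes OpenCV and its bundled Haar cascade, neither of which is specified by the patent.

```python
# Illustrative face-detection sketch (assumed OpenCV Haar cascade; not the patented method).
import cv2

# Load the frontal-face Haar cascade that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

camera = cv2.VideoCapture(0)   # camera index 0 is an assumption
ok, frame = camera.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Each detection is an (x, y, w, h) bounding box around a face.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"detected {len(faces)} face(s)")
camera.release()
```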
Face-recognition entrance guards are widely used, but a conventional community entrance guard can only distinguish registered personnel from other personnel, so its function is single. With the development of smart communities, face recognition has become mature prior art and is commonly applied in different scenarios, and an emotion-recognition element can be added to the smart community. However, when a person is in a negative mood, even consolation often fails to achieve an effective effect and can easily provoke the person's negative emotions further, because people's emotions are complex and changeable while the system is relatively rigid and only mechanically plays instructions prepared in advance. The entrance guard therefore needs to continuously detect changes in the person's expression, so as to obtain different consolation approaches suited to different people and, after recognizing the emotion, console the target in a way that guarantees the consoling effect.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a control method based on face recognition, which solves the problems that the existing entrance guard is single in function and rigid in its responses.
(II) technical scheme
In order to achieve the purpose, the invention is realized by the following technical scheme: a control method based on face recognition specifically comprises the following steps:
step one, face acquisition: the acquisition unit collects information on all personnel in the community. The address determination module authenticates each person's official documentation, including the property ownership certificate or rental contract, and the identity authentication module verifies the person's identity card and household register. The face acquisition module in the emotion recording unit then collects the person's facial information, the emotion division module divides facial emotion into positive emotions (such as happiness and excitement) and negative emotions (such as sadness, hatred, misery and anger), and the identification acquisition module acquires the person's facial expressions under positive emotions and under negative emotions. Finally, the facial information and facial expressions of the person are stored in the personal storage module;
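One way to picture the per-resident record produced in step one is the sketch below; the class and field names are illustrative assumptions rather than structures defined by the patent.

```python
# Illustrative per-resident record for step one (all names are assumptions).
from dataclasses import dataclass, field
from enum import Enum

class EmotionLabel(Enum):
    POSITIVE = "positive"   # e.g. happiness, excitement
    NEGATIVE = "negative"   # e.g. sadness, hatred, misery, anger

@dataclass
class ResidentRecord:
    person_id: str                 # assumed identifier, e.g. the identity-card number
    address_proof: str             # property ownership certificate or rental contract
    face_template: bytes           # face features captured by the face acquisition module
    expressions: dict = field(default_factory=dict)   # EmotionLabel -> list of samples

    def add_expression(self, label: EmotionLabel, sample: bytes) -> None:
        """Store a reference expression captured by the identification acquisition module."""
        self.expressions.setdefault(label, []).append(sample)

# The personal storage module modelled as an in-memory dict keyed by person_id.
personal_storage: dict[str, ResidentRecord] = {}
```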
step two, safety early warning: during face recognition, when the entrance guard recognition system detects that a visitor is an unregistered person, the visitor is passed directly to the visitor screening module in the safety early warning unit, which exchanges information with wanted-person records on the network through the networking synchronization module and records and compares the visitor's face information through the face comparison module; if a match occurs, the alarm module is started directly and an alarm is reported over the network through the networking synchronization module; if no match occurs, the visitor registers identity information, contact details and the intended destination with the community security office;
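A minimal sketch of the step-two screening decision follows; the similarity function and the threshold are assumptions, not values given by the patent.

```python
# Illustrative screening decision for step two (threshold and helpers are assumptions).
from typing import Callable

MATCH_THRESHOLD = 0.6   # assumed similarity cut-off for treating two faces as the same person

def screen_visitor(visitor_face: bytes,
                   wanted_faces: list[bytes],
                   similarity: Callable[[bytes, bytes], float],
                   raise_alarm: Callable[[], None]) -> str:
    """Compare an unregistered visitor against wanted-person faces synchronised from the network."""
    for wanted in wanted_faces:
        if similarity(visitor_face, wanted) >= MATCH_THRESHOLD:
            raise_alarm()   # alarm module plus networked report
            return "alarm_raised"
    # No match: the visitor registers identity, contact details and destination
    # with the community security office before entering.
    return "register_at_security_office"
```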
step three, network message collection: short videos or short sentences with an emotion-relaxing function are collected online directly through the network collection module in the consolation display unit; the information collected from the network is then screened through the manual screening module and stored by media type (video or text) through the classified storage module;
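The classified storage of step three can be pictured as a simple store keyed by media type; the structure below is an illustrative assumption only.

```python
# Illustrative classified storage for step three (structure is an assumption).
from dataclasses import dataclass

@dataclass
class ComfortItem:
    media_type: str   # "video" or "text"
    category: str     # assumed content label, e.g. "humour" or "scenery"
    payload: str      # file path of a short video, or the short sentence itself

classified_storage: dict[str, list[ComfortItem]] = {"video": [], "text": []}

def store_screened_item(item: ComfortItem, approved: bool) -> None:
    """Keep only items that passed the manual screening module."""
    if approved:
        classified_storage[item.media_type].append(item)
```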
step four, interactive comfort: during face recognition, after the entrance guard recognition system compares the collected face with the faces acquired in step one and determines that the person belongs to the community, the recognition and determination module in the voice interaction unit recognizes the person's current facial expression. The voice interaction unit comprises the recognition and determination module, a voice broadcast module and a time-delay camera module; the output end of the recognition and determination module is connected with the input end of the voice broadcast module, and the output end of the voice broadcast module is connected with the input end of the time-delay camera module. The recognized expression is compared with the facial expression information under different emotions stored in the personal storage module in step one; if the person is recognized to be in a negative emotional state, the short videos collected in step three are played directly, the audio in the video is broadcast through the voice broadcast module, and the facial expression of the person watching the video is captured through the time-delay camera module. The atmosphere adjusting unit is started through the emotion change monitoring module in the character evaluation unit: the initial expression recording module records the person's facial expression before watching the video, the data analysis and integration module comprehensively analyzes the person's emotional state from the watching duration, body movements and emotional changes, and the change recording module records the changes in the person's facial expression during watching. When the person's emotion fluctuates and improves, the type of video being played in the short video playing module at the time of the improvement is recorded; the data analysis and integration module then analyzes the person's mood, records the video type and short text type that improved it, and associates them with the person's ID in the database. When the person's emotion fluctuates again, the corresponding video type or short text type associated with the person's ID is played.
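The monitoring loop of step four, reduced to its core decision, might look like the sketch below; the helper callables, polling interval and scoring convention are assumptions, not part of the patented method.

```python
# Illustrative monitoring loop for step four (helpers, interval and scoring are assumptions).
import time
from typing import Callable, Dict

def console_person(person_id: str,
                   play_item: Callable[[], None],      # plays a short video or recites a short text
                   read_emotion: Callable[[], float],  # higher score = better mood (assumed convention)
                   effective_types: Dict[str, str],    # person_id -> item type that worked
                   item_type: str,
                   duration_s: float = 30.0,
                   poll_s: float = 2.0) -> bool:
    """Play one comfort item and watch whether the person's mood improves while it plays."""
    baseline = read_emotion()      # initial expression recording module
    play_item()
    elapsed = 0.0
    while elapsed < duration_s:
        time.sleep(poll_s)
        elapsed += poll_s
        current = read_emotion()   # change recording module
        if current > baseline:
            # Mood improved: remember which item type worked for this person.
            effective_types[person_id] = item_type
            return True
    return False
```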
Preferably, the entrance guard identification system is in bidirectional connection with the central processing module, the central processing module is in bidirectional connection with the acquisition unit, the character evaluation unit, the emotion recording unit, the safety early warning unit, the voice interaction unit and the comfort display unit, and the output end of the emotion recording unit is connected with the input ends of the character evaluation unit, the safety early warning unit and the voice interaction unit.
Preferably, the acquisition unit comprises an identity authentication module and an address determination module, and an output end of the address determination module is connected with an input end of the identity authentication module.
Preferably, the character evaluation unit comprises an emotion change monitoring module, a short video playing module and a short sentence broadcasting module, and the output end of the emotion change monitoring module is connected with the input ends of the short video playing module and the short sentence broadcasting module respectively.
Preferably, the emotion recording unit comprises a face acquisition module, an emotion dividing module, an identification acquisition module and a personal storage module, wherein the output end of the face acquisition module is connected with the input end of the emotion dividing module, the output end of the emotion dividing module is connected with the input end of the identification acquisition module, and the output end of the identification acquisition module is connected with the input end of the personal storage module.
Preferably, the safety early warning unit comprises a visitor screening module, a networking synchronization module, a face comparison module and an alarm module; the output end of the visitor screening module is connected with the input end of the networking synchronization module, the output end of the networking synchronization module is connected with the input end of the face comparison module, the output end of the face comparison module is connected with the input end of the alarm module, and the output end of the alarm module is connected with the input end of the networking synchronization module.
Preferably, the comfort display unit comprises a network collection module, a manual screening module and a classification storage module, wherein the output end of the network collection module is connected with the input end of the manual screening module, and the output end of the manual screening module is connected with the input end of the classification storage module.
Preferably, the atmosphere adjusting unit comprises an initial expression recording module, an information recording module, a change recording module and a data analysis and integration module, wherein the output end of the initial expression recording module is connected with the input end of the information recording module, the output end of the information recording module is connected with the input end of the change recording module, and the output end of the change recording module is connected with the input end of the data analysis and integration module.
(III) advantageous effects
The invention provides a control method based on face recognition. The method has the following beneficial effects:
(1) In this control method based on face recognition, the facial expressions of community residents under different emotions are recorded when their face information is entered, and their facial expressions are recognized when they pass through the entrance guard. When a negative emotion is detected, the atmosphere adjusting unit consoles the person by playing short videos and reciting short texts and monitors the person's emotional change in real time during the consolation; if the emotion worsens, the consolation mode can be changed promptly, so that the person's emotion is continuously and tentatively relieved and different consolation modes are generated flexibly for different people. This effectively raises the probability that the person's mood improves and helps ensure that residents enter the community in a positive state of mind. At the same time, the consolation type that works for each person is saved at the community gate, which can provide a communication environment for people in the same community and provides convenient conditions for building a harmonious community.
(2) In this control method based on face recognition, the arrangement of the safety early warning unit allows unregistered persons to be checked against networked records on the basis of face recognition, which can ensure the safety of the environment inside the community, assist the police in speeding up the pursuit of criminals, and improve the functionality of the entrance guard.
(3) In this control method based on face recognition, the arrangement of the character evaluation unit allows a person's emotion to be judged through cooperation between the system and manual screening, which prevents the entrance guard from being damaged because a person grows impatient while the entrance guard recognition system is consoling them, and guarantees long-term stable operation of the system.
Drawings
FIG. 1 is a schematic block diagram of the system of the present invention;
FIG. 2 is a system schematic block diagram of an acquisition unit of the present invention;
FIG. 3 is a system schematic block diagram of the character evaluation unit of the present invention;
FIG. 4 is a system schematic block diagram of the emotion recording unit of the present invention;
FIG. 5 is a system schematic block diagram of the safety early warning unit of the present invention;
FIG. 6 is a system schematic block diagram of a voice interaction unit of the present invention;
FIG. 7 is a schematic block diagram of a system of a comfort display unit of the present invention;
FIG. 8 is a system schematic block diagram of an ambience adjusting unit of the present invention.
In the figure, 1, an entrance guard identification system; 2. a central processing module; 3. a collection unit; 4. a character evaluation unit; 5. an emotion recording unit; 6. a safety early warning unit; 7. a voice interaction unit; 8. a comfort display unit; 9. an identity authentication module; 10. an address determination module; 11. an emotion change monitoring module; 12. a short video playing module; 13. a short sentence broadcasting module; 14. a face acquisition module; 15. an emotion classification module; 16. an identification acquisition module; 17. a personal storage module; 18. a visitor screening module; 19. a networking synchronization module; 20. a face comparison module; 21. an alarm module; 22. a recognition and determination module; 23. a voice broadcasting module; 24. a time-delay camera module; 25. a network collection module; 26. a manual screening module; 27. a classified storage module; 28. an atmosphere adjusting unit; 29. an initial expression recording module; 30. an information recording module; 31. a change recording module; 32. a data analysis and integration module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1 to 8, an embodiment of the present invention provides a technical solution: a control method based on face recognition specifically comprises the following steps:
step one, face acquisition: information on all personnel in the community is acquired through the acquisition unit 3. The address determination module 10 authenticates each person's official documentation, including the property ownership certificate or rental contract, and the identity authentication module 9 verifies the person's identity card and household register. After authentication is completed, the face acquisition module 14 in the emotion recording unit 5 collects the person's facial information, the emotion division module 15 divides facial emotion into positive emotions and negative emotions, and the identification acquisition module 16 acquires the person's facial expressions under positive emotions and under negative emotions; the facial information and facial expressions of the person are then stored in the personal storage module 17;
step two, safety early warning: during face recognition, when the entrance guard recognition system 1 detects that a visitor is an unregistered person, the visitor is passed directly to the visitor screening module 18 in the safety early warning unit 6, which exchanges information with wanted-person records on the network through the networking synchronization module 19 and records and compares the visitor's face information through the face comparison module 20; if a match occurs, the alarm module 21 is started directly and an alarm is reported over the network through the networking synchronization module 19; if no match occurs, the visitor registers identity information, contact details and the intended destination with the community security office;
step three, network message collection: short videos or short sentences with an emotion-relaxing function are collected online directly through the network collection module 25 in the consolation display unit 8; the information collected from the network is then screened through the manual screening module 26 and stored by media type (video or text) through the classified storage module 27;
step four, interactive comfort: during face recognition, after the entrance guard recognition system 1 compares the collected face with the faces acquired in step one and determines that the person belongs to the community, the recognition and determination module 22 in the voice interaction unit 7 recognizes the person's current facial expression. The voice interaction unit 7 comprises the recognition and determination module 22, the voice broadcast module 23 and the time-delay camera module 24; the output end of the recognition and determination module 22 is connected with the input end of the voice broadcast module 23, and the output end of the voice broadcast module 23 is connected with the input end of the time-delay camera module 24. The recognized expression is compared with the facial expression information under different emotions stored in the personal storage module 17 in step one; if the person is recognized to be in a negative emotional state, the short videos collected in step three are played directly, the audio in the video is broadcast through the voice broadcast module 23, and the facial expression of the person watching the video is captured through the time-delay camera module 24. The atmosphere adjusting unit 28 is started through the emotion change monitoring module 11 in the character evaluation unit 4: the initial expression recording module 29 records the person's facial expression before watching the video, the data analysis and integration module 32 comprehensively analyzes the person's emotional state from the watching duration, body movements and emotional changes, and the change recording module 31 records the changes in the person's facial expression during watching. When the person's emotion fluctuates and improves, the type of video being played in the short video playing module 12 at the time of the improvement is recorded; the data analysis and integration module 32 then analyzes the person's mood, records the video type and short text type that improved it, and associates them with the person's ID in the database. When the person's emotion fluctuates again, the corresponding video type or short text type associated with the person's ID is played.
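The comprehensive analysis performed by the data analysis and integration module 32 can be pictured as a weighted fusion of the three monitored signals; the weights and value ranges below are illustrative assumptions only.

```python
# Illustrative fusion of the three monitored signals (weights and ranges are assumptions).
def combined_emotion_score(watch_ratio: float,       # fraction of the clip actually watched, 0..1
                           calm_body: float,         # 1.0 = still and relaxed, 0.0 = agitated
                           expression_delta: float   # change versus the initial expression, -1..1
                           ) -> float:
    """Combine watching duration, body movement and expression change into one score."""
    w_watch, w_body, w_expr = 0.3, 0.2, 0.5   # assumed weights
    return w_watch * watch_ratio + w_body * calm_body + w_expr * expression_delta
```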
It should be noted that, when analyzing the person's mood, the system judges whether the negative emotion is worsening. If it worsens, the video type previously recorded as having improved the emotion is played and the short sentence broadcasting module 13 is started at the same time to broadcast short texts; the atmosphere adjusting unit 28 then records again, noting the type of short text played by the short sentence broadcasting module 13 when the person's emotion improves. When that recorded short-text type is played, the short video playing module 12 is switched off and only short texts are broadcast through the short sentence broadcasting module 13. If the emotion keeps worsening and no stable improvement appears, no further videos or short texts are played.
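The escalation described in the preceding paragraph (video, then video plus short text, then short text only, then stop) can be sketched as a small state transition; the state names are assumptions.

```python
# Illustrative fallback policy for the behaviour described above (state names are assumptions).
def next_comfort_mode(current_mode: str,
                      mood_worsened: bool,
                      stable_improvement: bool) -> str:
    """Step through the escalation order: video -> video+text -> text -> stop."""
    if stable_improvement or not mood_worsened:
        return current_mode                      # keep whatever is currently working
    order = ["video", "video+text", "text", "stop"]
    idx = order.index(current_mode) if current_mode in order else 0
    return order[min(idx + 1, len(order) - 1)]
```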
As a preferred scheme, the entrance guard identification system 1 is bidirectionally connected with the central processing module 2, the central processing module 2 is bidirectionally connected with the acquisition unit 3, the character evaluation unit 4, the emotion recording unit 5, the safety early warning unit 6, the voice interaction unit 7 and the comfort display unit 8 respectively, and the output end of the emotion recording unit 5 is connected with the input ends of the character evaluation unit 4, the safety early warning unit 6 and the voice interaction unit 7 respectively. In this way, the facial expressions of residents under different emotions are recorded when their information is entered, and their facial expressions are recognized when they pass through the entrance guard. When a negative emotion is detected, the atmosphere adjusting unit 28 consoles the person by playing short videos and reciting short texts and monitors the person's emotional change in real time during the consolation; if the emotion worsens, the consolation mode can be changed promptly, so that the person's emotion is continuously and tentatively relieved and different consolation modes are generated flexibly for different people. This effectively raises the probability that the person's mood improves and helps ensure that residents enter the community in a positive state of mind; at the same time, the consolation type that works for each person is saved at the community gate, which provides a communication environment for people in the same community and provides convenient conditions for building a harmonious community.
Preferably, the acquisition unit 3 includes an identity authentication module 9 and an address determination module 10, and an output end of the address determination module 10 is connected to an input end of the identity authentication module 9.
As a preferred scheme, the character evaluation unit 4 includes an emotion change monitoring module 11, a short video playing module 12, and a short sentence broadcasting module 13, and an output end of the emotion change monitoring module 11 is connected to input ends of the short video playing module 12 and the short sentence broadcasting module 13, respectively.
As a preferred scheme, the emotion recording unit 5 includes a face acquisition module 14, an emotion classification module 15, an identification acquisition module 16 and a personal storage module 17, an output end of the face acquisition module 14 is connected with an input end of the emotion classification module 15, an output end of the emotion classification module 15 is connected with an input end of the identification acquisition module 16, and an output end of the identification acquisition module 16 is connected with an input end of the personal storage module 17.
As the preferred scheme, the safety early warning unit 6 comprises a visitor screening module 18, a networking synchronization module 19, a face comparison module 20 and an alarm module 21; the output end of the visitor screening module 18 is connected with the input end of the networking synchronization module 19, the output end of the networking synchronization module 19 is connected with the input end of the face comparison module 20, the output end of the face comparison module 20 is connected with the input end of the alarm module 21, and the output end of the alarm module 21 is connected with the input end of the networking synchronization module 19.
Preferably, the comfort display unit 8 comprises a network collection module 25, a manual screening module 26 and a classification storage module 27, wherein an output end of the network collection module 25 is connected with an input end of the manual screening module 26, and an output end of the manual screening module 26 is connected with an input end of the classification storage module 27, so that online collection can effectively draw on network resources to improve the quality of the short videos and short texts used to soothe negative emotions.
Preferably, the atmosphere adjusting unit 28 includes an initial expression recording module 29, an information recording module 30, a change recording module 31 and a data analysis and integration module 32, wherein an output end of the initial expression recording module 29 is connected with an input end of the information recording module 30, an output end of the information recording module 30 is connected with an input end of the change recording module 31, and an output end of the change recording module 31 is connected with an input end of the data analysis and integration module 32.
It is noted that, herein, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. A control method based on face recognition is characterized in that: the method specifically comprises the following steps:
step one, face acquisition: the acquisition unit (3) collects information on all personnel in the community; the address determination module (10) authenticates each person's official documentation, including the property ownership certificate or rental contract, and the identity authentication module (9) verifies the person's identity card and household register; the face acquisition module (14) in the emotion recording unit (5) collects the person's facial information, the emotion division module (15) divides facial emotion into positive emotions and negative emotions, and the identification acquisition module (16) acquires the person's facial expressions under positive emotions and under negative emotions; the person's facial information and facial expressions are then stored in the personal storage module (17);
step two, safety early warning: during face recognition, when the entrance guard recognition system (1) detects that a visitor is an unregistered person, the visitor is passed directly to the visitor screening module (18) in the safety early warning unit (6), which exchanges information with wanted-person records on the network through the networking synchronization module (19) and records and compares the visitor's face information through the face comparison module (20); if a match occurs, the alarm module (21) is started directly and an alarm is reported over the network through the networking synchronization module (19); if no match occurs, the visitor registers identity information, contact details and the intended destination with the community security office;
step three, network message collection: short videos or short sentences with an emotion-relaxing function are collected online directly through the network collection module (25) in the consolation display unit (8); the information collected from the network is then screened through the manual screening module (26) and stored by media type (video or text) through the classified storage module (27);
step four, interactive comfort: during face recognition, after the entrance guard recognition system (1) compares the collected face with the faces acquired in step one and determines that the person belongs to the community, the recognition and determination module (22) in the voice interaction unit (7) recognizes the person's current facial expression, wherein the voice interaction unit (7) comprises the recognition and determination module (22), a voice broadcast module (23) and a time-delay camera module (24), the output end of the recognition and determination module (22) is connected with the input end of the voice broadcast module (23), and the output end of the voice broadcast module (23) is connected with the input end of the time-delay camera module (24); the recognized expression is compared with the facial expression information under different emotions stored in the personal storage module (17) in step one, and if the person is recognized to be in a negative emotional state, the short videos or short sentences collected in step three are played directly, the audio in the video is broadcast through the voice broadcast module (23), and the facial expression of the person watching the video is captured through the time-delay camera module (24); the atmosphere adjusting unit (28) is started through the emotion change monitoring module (11) in the character evaluation unit (4), that is, the initial expression recording module (29) records the person's facial expression before watching the video, the data analysis and integration module (32) comprehensively analyzes the person's emotional state from the watching duration, body movements and emotional changes, and the change recording module (31) records the changes in the person's facial expression during watching; when the person's emotion fluctuates and improves, the type of video played in the short video playing module (12), or the type of short sentence broadcast by the short sentence broadcasting module (13), at the time of the improvement is recorded and associated with said person in the database, and when the person again experiences mood swings, the video type or short text type associated with said person is played.
2. The control method based on the face recognition as claimed in claim 1, wherein: the entrance guard recognition system (1) is bidirectionally connected with the central processing module (2); the central processing module (2) is bidirectionally connected with the acquisition unit (3), the character evaluation unit (4), the emotion recording unit (5), the safety early warning unit (6), the voice interaction unit (7) and the consolation display unit (8) respectively; the character evaluation unit (4) is bidirectionally connected with the atmosphere adjusting unit (28); and the output end of the emotion recording unit (5) is connected with the input ends of the character evaluation unit (4), the safety early warning unit (6) and the voice interaction unit (7) respectively.
3. The control method based on the face recognition as claimed in claim 1, wherein: the acquisition unit (3) comprises an identity authentication module (9) and an address determination module (10), wherein the output end of the address determination module (10) is connected with the input end of the identity authentication module (9).
4. The control method based on the face recognition as claimed in claim 1, wherein: the character evaluation unit (4) comprises an emotion change monitoring module (11), a short video playing module (12) and a short sentence broadcasting module (13), and the output end of the emotion change monitoring module (11) is connected with the input ends of the short video playing module (12) and the short sentence broadcasting module (13) respectively.
5. The control method based on the face recognition as claimed in claim 1, wherein: the emotion recording unit (5) comprises a face acquisition module (14), an emotion dividing module (15), an identification acquisition module (16) and a personal storage module (17), the output end of the face acquisition module (14) is connected with the input end of the emotion dividing module (15), the output end of the emotion dividing module (15) is connected with the input end of the identification acquisition module (16), and the output end of the identification acquisition module (16) is connected with the input end of the personal storage module (17).
6. The control method based on the face recognition as claimed in claim 1, wherein: the safety early warning unit (6) comprises a visitor screening module (18), a networking synchronization module (19), a face comparison module (20) and an alarm module (21); the output end of the visitor screening module (18) is connected with the input end of the networking synchronization module (19), the output end of the networking synchronization module (19) is connected with the input end of the face comparison module (20), the output end of the face comparison module (20) is connected with the input end of the alarm module (21), and the output end of the alarm module (21) is connected with the input end of the networking synchronization module (19).
7. The control method based on the face recognition as claimed in claim 1, wherein: the consolation display unit (8) comprises a network collection module (25), a manual screening module (26) and a classification storage module (27), wherein the output end of the network collection module (25) is connected with the input end of the manual screening module (26), and the output end of the manual screening module (26) is connected with the input end of the classification storage module (27).
8. The control method based on the face recognition as claimed in claim 1, wherein: the atmosphere adjusting unit (28) comprises an initial expression recording module (29), an information recording module (30), a change recording module (31) and a data analysis integration module (32), the output end of the initial expression recording module (29) is connected with the input end of the information recording module (30), the output end of the information recording module (30) is connected with the input end of the change recording module (31), and the output end of the change recording module (31) is connected with the input end of the data analysis integration module (32).
CN202110662630.5A (priority 2021-06-15, filed 2021-06-15), Control method based on face recognition, Active, granted as CN113409507B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110662630.5A CN113409507B (en) 2021-06-15 2021-06-15 Control method based on face recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110662630.5A CN113409507B (en) 2021-06-15 2021-06-15 Control method based on face recognition

Publications (2)

Publication Number Publication Date
CN113409507A CN113409507A (en) 2021-09-17
CN113409507B (granted) 2021-12-03

Family

ID=77684047

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110662630.5A Active CN113409507B (en) 2021-06-15 2021-06-15 Control method based on face recognition

Country Status (1)

Country Link
CN (1) CN113409507B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114626818B (en) * 2022-03-16 2023-05-02 湖南检信智能科技有限公司 Big data-based pre-post emotion comprehensive assessment method

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014106816A (en) * 2012-11-28 2014-06-09 Glory Ltd Entrance/exit management apparatus and entrance/exit management method
CN103530912A (en) * 2013-09-27 2014-01-22 深圳市迈瑞思智能技术有限公司 Attendance recording system having emotion identification function, and method thereof
US10061977B1 (en) * 2015-04-20 2018-08-28 Snap Inc. Determining a mood for a group
CN107833018A (en) * 2017-11-10 2018-03-23 郑州工业应用技术学院 A kind of enterprise management system based on face recognition technology
GB2572317A (en) * 2018-03-06 2019-10-02 Skoogmusic Ltd Control apparatus and method
CN108830975A (en) * 2018-04-27 2018-11-16 安徽继远软件有限公司 A kind of Intelligent human-face identification gate control system and control method
CN108961509A (en) * 2018-07-13 2018-12-07 安徽灵图壹智能科技有限公司 A kind of cell recognition of face entrance guard security system and its method
CN109472212A (en) * 2018-10-16 2019-03-15 广州师盛展览有限公司 A kind of mood analysis based on the identification of people face is registered interaction systems
CN111666780A (en) * 2019-03-05 2020-09-15 北京入思技术有限公司 Intelligent door control security method based on emotion recognition technology
CN212334294U (en) * 2020-03-27 2021-01-12 重庆厚齐科技有限公司 Multimedia safety warning system for elevator
CN111667599A (en) * 2020-06-09 2020-09-15 安徽省徽腾智能交通科技有限公司 Face recognition card punching system and method
CN111881822A (en) * 2020-07-27 2020-11-03 深圳市爱深盈通信息技术有限公司 Access control method, device, equipment and storage medium based on face recognition
CN112863038A (en) * 2021-01-15 2021-05-28 胜高连锁酒店管理股份有限公司 Hotel access control recognition system and method based on face recognition

Also Published As

Publication number Publication date
CN113409507A (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN106204815B (en) A kind of access control system based on human face detection and recognition
US9230547B2 (en) Metadata extraction of non-transcribed video and audio streams
KR102177235B1 (en) An attendance check system using deep learning based face recognition
CN103530912A (en) Attendance recording system having emotion identification function, and method thereof
CN105718874A (en) Method and device of in-vivo detection and authentication
CN104376250A (en) Real person living body identity verification method based on sound-type image feature
CN109256136A (en) A kind of audio recognition method and device
CN112102850B (en) Emotion recognition processing method and device, medium and electronic equipment
CN108537922A (en) Visitor's method for early warning based on recognition of face and system
CN108399671A (en) A kind of Internet of Things vena metacarpea video gate inhibition integrated system
US20150019206A1 (en) Metadata extraction of non-transcribed video and audio streams
CN110458591A (en) Advertising information detection method, device and computer equipment
CN113409507B (en) Control method based on face recognition
CN110516568B (en) College multi-scene data management method and system based on face recognition
CN111275444A (en) Contract signing-based double recording method and device, terminal and storage medium
CN102929887A (en) Quick video retrieval method and system based on sound feature identification
CN110211590A (en) A kind of processing method, device, terminal device and the storage medium of meeting hot spot
CN103208144A (en) Dormitory-management system based on face recognition
CN112989950A (en) Violent video recognition system oriented to multi-mode feature semantic correlation features
CN111507256A (en) Face recognition system for counter information acquisition
CN108091016A (en) The smart lock that a kind of vocal print method for unlocking and application this method are opened
CN110796058A (en) Video behavior identification method based on key frame extraction and hierarchical expression
CN111461946A (en) Intelligent public security interrogation system
CN112132057A (en) Multi-dimensional identity recognition method and system
CN110598607A (en) Non-contact and contact cooperative real-time emotion intelligent monitoring system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant