CN113633870B - Emotion state adjustment system and method

Emotion state adjustment system and method

Info

Publication number
CN113633870B
CN113633870B
Authority
CN
China
Prior art keywords
emotion
mobile terminal
current user
expression
sensing data
Prior art date
Legal status
Active
Application number
CN202111017756.3A
Other languages
Chinese (zh)
Other versions
CN113633870A (en)
Inventor
李莉
熊曾艳
鲍禹璇
Current Assignee
Wuhan Polytechnic University
Original Assignee
Wuhan Polytechnic University
Priority date
Filing date
Publication date
Application filed by Wuhan Polytechnic University filed Critical Wuhan Polytechnic University
Priority to CN202111017756.3A
Publication of CN113633870A
Application granted
Publication of CN113633870B
Legal status: Active

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M — DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00 — Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005 — by the use of a particular sense, or stimulus
    • A61M2021/0027 — by the hearing sense
    • A61M2021/0044 — by the sight sense
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 — Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 — Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 — Evaluating the state of mind, e.g. depression, anxiety

Abstract

The invention discloses an emotion state adjustment system and method. The system comprises an emotion detection module and a mobile terminal. The emotion detection module collects a facial expression image and hand pressure sensing data of the current user and sends them to the mobile terminal; the mobile terminal performs feature extraction on the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotion fluctuation signal; the mobile terminal determines a virtual experience scene according to the expression feature information and the emotion fluctuation signal, and sends the virtual experience scene to the emotion detection module; the emotion detection module adjusts the emotion state of the current user based on the virtual experience scene. In this way, the virtual experience scene is determined according to the user's expression feature information and emotion fluctuation signal, and the emotion state is then adjusted based on the virtual experience scene, so that the user's emotion state is accurately detected and effectively adjusted.

Description

Emotion state adjustment system and method
Technical Field
The invention relates to the technical field of the Internet of things, in particular to an emotion state adjusting system and method.
Background
Accurate identification of emotional features is helpful for developing subsequent training systems, for example identifying the emotional features of autism. At present, clinical treatment mostly adopts comprehensive intervention methods such as special education and psychological-behavioral therapy. In the prior art, emotional EEG data are acquired, visualized, and displayed on an interface, and music therapy is then performed according to the emotional EEG data of the autistic child. However, the emotional state of an autistic child cannot be accurately detected from EEG data alone, and music therapy alone cannot effectively regulate the user's emotional state. How to accurately detect the emotional state of a user and effectively adjust it is therefore a technical problem to be solved urgently.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The invention mainly aims to provide an emotion state adjusting system and method, which aim to solve the technical problem of how to accurately detect the emotion state of a user and effectively adjust the emotion state of the user.
In order to achieve the above object, the present invention provides an emotional state adjustment system, which includes an emotion detection module and a mobile terminal;
the emotion detection module is used for collecting facial expression images and hand pressure sensing data of a current user and sending the facial expression images and the hand pressure sensing data to the mobile terminal;
the mobile terminal is used for extracting the characteristics of the facial expression image to obtain expression characteristic information, and processing the hand pressure sensing data to obtain an emotion fluctuation signal;
the mobile terminal is further configured to determine a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal, and send the virtual experience scene to the emotion detection module;
the emotion detection module is further configured to adjust an emotion state of the current user based on the virtual experience scene.
Preferably, the mobile terminal is further configured to determine an emotion score of the current user according to the emotion fluctuation signal;
the mobile terminal is further used for determining emotion healing strategies according to the expression characteristic information and the emotion scores;
the mobile terminal is further used for determining a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
Preferably, the mobile terminal is further configured to determine an emotional state level of the current user according to the expression feature information and the emotion score;
the mobile terminal is further used for determining an emotion healing strategy according to the emotion state grade.
Preferably, the mobile terminal is further configured to determine whether the emotion score is greater than a preset threshold;
and the mobile terminal is further used for determining the emotion state grade of the current user according to the expression characteristic information when the emotion score is larger than the preset threshold value.
Preferably, the mobile terminal is further configured to analyze the emotion fluctuation signal to obtain an emotion fluctuation analysis graph;
the mobile terminal is further used for determining the emotion score of the current user according to the emotion fluctuation analysis graph.
Preferably, the mobile terminal is further configured to generate a virtual character picture according to the expression characteristic information, and determine a virtual background picture according to the emotion fluctuation signal;
the mobile terminal is further configured to generate a virtual experience scene according to the virtual character picture, the virtual background picture and the emotion healing strategy.
Preferably, the mobile terminal is further configured to generate an emotion change log of the current user according to the facial expression image and the emotion fluctuation analysis graph, so that a supervisor can view the emotion change log of the current user.
In addition, in order to achieve the above object, the present invention also proposes an emotional state adjustment method applied to the emotional state adjustment system as described above, the emotional state adjustment system including an emotion detection module and a mobile terminal, the emotional state adjustment method including the steps of:
the emotion detection module acquires facial expression images and hand pressure sensing data of a current user and sends the facial expression images and the hand pressure sensing data to the mobile terminal;
the mobile terminal performs feature extraction on the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotion fluctuation signal;
the mobile terminal determines a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal, and sends the virtual experience scene to the emotion detection module;
the emotion detection module adjusts the emotion state of the current user based on the virtual experience scene.
Preferably, the step of determining a virtual experience scene according to the expression feature information and the emotion fluctuation signal includes:
the mobile terminal determines the emotion score of the current user according to the emotion fluctuation signal;
the mobile terminal determines an emotion healing strategy according to the expression characteristic information and the emotion score;
and the mobile terminal determines a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
Preferably, the step of determining an emotion healing strategy according to the expression feature information and the emotion score includes:
the mobile terminal determines the emotion state grade of the current user according to the expression characteristic information and the emotion score;
and the mobile terminal determines an emotion healing strategy according to the emotion state grade.
According to the invention, the emotion detection module first collects the facial expression image and hand pressure sensing data of the current user and sends them to the mobile terminal; the mobile terminal then performs feature extraction on the facial expression image to obtain expression feature information and processes the hand pressure sensing data to obtain an emotion fluctuation signal; the mobile terminal determines a virtual experience scene according to the expression feature information and the emotion fluctuation signal and sends it to the emotion detection module; finally, the emotion detection module adjusts the emotional state of the current user based on the virtual experience scene. Compared with the prior art, in which the user receives music therapy based on EEG data to adjust the emotional state, the invention determines the virtual experience scene from the user's expression feature information and emotion fluctuation signal and then adjusts the emotional state based on that scene, so that the user's emotional state is accurately detected and effectively adjusted, improving the user experience.
Drawings
FIG. 1 is a schematic diagram of a first embodiment of the emotional state adjustment system according to the invention;
FIG. 2 is a diagram of the wearable glasses model of the first embodiment of the emotional state adjustment system of the invention;
FIG. 3 is a schematic diagram of the healing robot of the first embodiment of the emotional state adjustment system of the invention;
FIG. 4 is a flowchart of a first embodiment of the emotional state adjustment method according to the invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic diagram of a first embodiment of an emotional state adjustment system according to an embodiment of the invention.
In this embodiment, the emotion state adjustment system includes an emotion detection module 1001 and a mobile terminal 1002; the emotion detection module 1001 is configured to collect facial expression images and hand pressure sensing data of a current user, and send the facial expression images and the hand pressure sensing data to the mobile terminal 1002; the mobile terminal 1002 is configured to perform feature extraction on the facial expression image to obtain expression feature information, and process the hand pressure sensing data to obtain an emotion fluctuation signal; the mobile terminal 1002 is further configured to determine a virtual experience scene according to the expression feature information and the emotion fluctuation signal, and send the virtual experience scene to the emotion detection module 1001; the emotion detection module 1001 is further configured to adjust an emotion state of the current user based on the virtual experience scene.
It should be noted that the executing bodies of this scheme are the emotion detection module 1001 and the mobile terminal 1002, which interact with each other. The emotion detection module includes the wearable glasses and the healing robot. It should be understood that the wearable glasses collect the facial expression image of the current user and the healing robot collects the hand pressure sensing data of the current user; the mobile terminal processes the collected facial expression image and hand pressure sensing data to obtain the expression feature information and the emotion fluctuation signal, then generates a virtual experience scene according to the expression feature information and the emotion fluctuation signal, and finally transmits the virtual experience scene to the wearable glasses for the current user to experience.
Referring to fig. 2, fig. 2 is a diagram of the wearable glasses model of the first embodiment of the emotional state adjustment system according to the present invention. The wearable glasses are structured as follows: the shell is made of ABS plastic or carbon fiber, and the glasses switch on automatically once worn. The virtual reality (VR) mask combines an optical lens with an organic light-emitting diode (OLED) screen to form the VR display system, and is fitted with a pinhole lens for monitoring or shooting the facial expression image of the current user in real time. The wearable glasses are divided into a VR experience system and a monitoring system, which are integrated with the back end or the mobile terminal to achieve real-time remote monitoring of emotion fluctuation.
In a specific implementation, after being worn, the wearable smart glasses provide different kinds of training through the virtual experience scenes sent by the mobile terminal: realistic scenes are presented to train the emotion recognition capability of the autistic child, the child's emotion changes are recorded during training, and the recorded changes are finally displayed on the mobile terminal as feedback.
Referring to fig. 3, fig. 3 is a schematic diagram of the healing robot of the first embodiment of the emotional state adjustment system of the present invention. The healing robot has a lovable cartoon appearance so that an autistic child is willing to approach it; it can play games with children, train the behavioral rehabilitation of the autistic child, and adjust its sound and light according to the child's needs to better accompany and heal. It should be noted that the healing robot presents different expression changes on its screen according to the detected emotion of the autistic child, and starts different accompanying modes according to detected changes in psychology and behavior.
The healing robot's exterior is formed from food-grade, environmentally friendly silica gel and a plastic panel: the body uses a plastic plate, while the four limbs use environmentally friendly silica gel that can be pulled freely, enhancing the overall tactile appeal. The eye area uses a soft dot-matrix light panel that can present a variety of expression visuals. Internal pressure monitoring uses a pressure sensor module and a smart chip. The head uses block-type touch and voice control: touching different plates triggers different functions, such as music playing and light changing, enabling emotional interaction with children in an engaging, lively way.
After the healing robot receives the electrical signal from the pressure sensor, the sensor transmits the signal to the smart chip, and the microcontroller first measures the degree of pressure. Taking this data as a reference, the smart chip drives the light or music system; the current pressure data is also displayed in a graphical interface on the smart terminal and can be converted into a data view in the back end.
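For ease of understanding, the following is a minimal sketch of this pressure-routing logic; the thresholds, units and mode names are illustrative assumptions rather than values disclosed above:

```python
def route_pressure(pressure_kpa: float) -> str:
    """Map a hand-pressure reading to an accompanying mode (assumed thresholds)."""
    if pressure_kpa < 5.0:         # light grip: gentle light changes
        return "light_mode"
    if pressure_kpa < 20.0:        # firmer grip: play soothing music
        return "music_mode"
    return "light_and_music_mode"  # strong grip: combine light and music
```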
In this embodiment, the wearable glasses collect the facial expression image of the current user, i.e. the autistic child, in real time through the pinhole lens and send it to the mobile terminal. The autistic child holds the healing robot, and the mobile terminal determines the accompanying expression and action of the healing robot according to the hand pressure sensing data and the facial expression image. The mobile terminal forms a virtual character picture from the accompanying expression and action of the healing robot and projects it to the wearable glasses, so the autistic child can observe the accompanying expression and action of the healing robot in real time. The mobile terminal then generates a virtual background picture according to the facial expression image of the autistic child and the emotion fluctuation signal corresponding to the hand pressure sensing data, combines the virtual background picture with the virtual character picture, and sends the result to the wearable glasses; within this virtual experience scene the autistic child can experience real-life scenes and practice interacting with people.
The emotion detection module 1001 is configured to collect facial expression images and hand pressure sensing data of a current user, and send the facial expression images and the hand pressure sensing data to the mobile terminal 1002.
It should be noted that the current user may be an autism patient. After the patient puts on the wearable glasses, the glasses shoot the facial expression image of the patient. The hand pressure sensing data are collected while the patient holds the healing robot: a pressure sensor is arranged inside the healing robot, and the patient's hand pressure sensing data can be acquired from this sensor in real time.
In a specific implementation, facial expression images of an autism patient shot by wearable glasses and hand pressure sensing data acquired by a healing robot need to be sent to a mobile terminal, and the mobile terminal can process and display the facial expression images and the hand pressure sensing data.
The healing robot can also send its appearance shape and appearance changes to the mobile terminal, so that the mobile terminal can generate a virtual character picture consistent with the healing robot, and then send the virtual character picture to the wearable glasses, so that the current user can observe the healing robot in real time through the wearable glasses.
The mobile terminal 1002 is configured to perform feature extraction on the facial expression image to obtain expression feature information, and process the hand pressure sensing data to obtain an emotion fluctuation signal.
The mobile terminal performs feature extraction on the facial expression image to obtain the expression feature information of the current user, which includes contour information of the eyebrows and the mouth. The mobile terminal processes the hand pressure sensing data to obtain the emotion fluctuation signal, which can be understood as a signal describing the current user's emotional changes, derived from the hand pressure sensing data.
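As an illustrative sketch of these two processing steps (the landmark index ranges, helper names and smoothing window are assumptions, not details disclosed by the embodiment):

```python
import numpy as np

def extract_expression_features(landmarks: np.ndarray) -> dict:
    """Reduce facial landmarks to eyebrow and mouth contour features.
    `landmarks` is an (N, 2) point array; the index ranges are assumed."""
    return {
        "brow": landmarks[17:27],   # eyebrow contour points
        "mouth": landmarks[48:68],  # mouth contour points
    }

def fluctuation_signal(pressure: np.ndarray, rest: float = 0.0,
                       window: int = 25) -> np.ndarray:
    """Derive an emotion fluctuation curve from raw hand-pressure samples:
    a moving average of how far the grip pressure is from its resting level."""
    deviation = np.abs(pressure - rest)
    kernel = np.ones(window) / window
    return np.convolve(deviation, kernel, mode="same")
```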
The mobile terminal 1002 is further configured to determine a virtual experience scene according to the expression feature information and the emotion fluctuation signal, and send the virtual experience scene to the emotion detection module 1001.
The mobile terminal can determine the emotion score of the current user according to the emotion fluctuation signal, then determine the emotion healing strategy according to the expression characteristic information and the emotion score, and finally determine the virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
It should be noted that the emotion score may be a numeric value corresponding to the emotion fluctuation signal, such as 5 or 8, from which the emotional change of the current user is judged; the higher the emotion score, the more intense the current user's emotional state.
In a specific implementation, an emotion fluctuation analysis graph (i.e. an emotion fluctuation curve) can be generated from the emotion fluctuation signal, and the emotional change and emotion score of the current user are determined from it. For example, if the curve is high and flat, the current user's emotion is intense and the emotion score may be 9; if the curve oscillates strongly, the current user's emotion fluctuates sharply and the emotion score may be 7.
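One possible scoring rule consistent with these examples is sketched below; the weighting is an assumption, chosen so that a high, flat curve scores higher than a strongly oscillating one:

```python
import numpy as np

def emotion_score(curve: np.ndarray) -> int:
    """Score the emotion fluctuation curve on a 1-10 scale (assumed formula)."""
    level = float(np.mean(np.abs(curve)))    # overall height of the curve
    swings = float(np.std(np.diff(curve)))   # how strongly it oscillates
    raw = 5.0 + 5.0 * level / (level + swings + 1e-9)
    return max(1, min(10, round(raw)))       # high & flat -> near 10
```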
The mobile terminal can determine the emotion state grade of the current user according to the expression characteristic information and the emotion score, and then determine the emotion healing strategy according to the emotion state grade.
It should be noted that the expression state of the current user, such as sad, distressed or tense, can be determined from the expression feature information. The emotional state level may be a high level, a medium level, a low level, or the like.
The emotion healing strategy may be the duration for which the current user stays in the virtual experience scene, for example 5 min or 10 min.
Assuming that the expression feature information of the current user indicates a tense state and the emotion score is 9, the corresponding emotional state level is the high tension level, and the emotion healing strategy for that level may be 10 min, i.e. the current user stays in the virtual experience scene for 10 min using the wearable glasses.
It should be understood that the mobile terminal may also transmit the expression feature information of the current user to the healing robot, and the healing robot may display a corresponding expression state or accompanying expression state on its display screen according to that information.
In a specific implementation, the mobile terminal may further determine whether the emotion score is greater than a preset threshold, and determine the emotion state level of the current user according to the expression feature information when the emotion score is greater than the preset threshold.
The preset threshold is user-defined, for example 4. If the emotion score is 3 and the preset threshold is 4, the emotion score is smaller than the preset threshold, the current user's emotion is normal, and the emotional state level need not be determined; if the emotion score is 5 and the preset threshold is 4, the emotion score is greater than the preset threshold, the current user's emotion is tense, and the emotional state level of the current user is the low level.
In this embodiment, an emotion score of 5-6 may correspond to the low level, 7-8 to the medium level, and 9-10 to the high level, etc.
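Putting the threshold check and the level division together, a minimal sketch (the threshold of 4 and the score bands follow the examples above; the function name is an assumption):

```python
from typing import Optional

PRESET_THRESHOLD = 4  # user-defined; 4 matches the example above

def emotional_state_level(score: int) -> Optional[str]:
    """Return the emotional state level, or None when the score does not
    exceed the preset threshold (the emotion is considered normal)."""
    if score <= PRESET_THRESHOLD:
        return None
    if score <= 6:
        return "low"
    if score <= 8:
        return "medium"
    return "high"  # scores 9-10
```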
It should be noted that the mobile terminal may generate a virtual character picture according to the expression feature information, determine a virtual background picture according to the emotion fluctuation signal, and then generate a virtual experience scene according to the virtual character picture, the virtual background picture and the emotion healing strategy.
Multiple virtual background pictures may be pre-stored in the mobile terminal, which holds a preset background mapping table containing the virtual background pictures and user emotion scores in one-to-one correspondence.
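The mapping table may be as simple as a score-to-picture dictionary; the file names below are purely illustrative assumptions:

```python
# Assumed one-to-one mapping from user emotion score to a pre-stored background.
BACKGROUND_MAP = {
    5: "backgrounds/meadow.png",
    6: "backgrounds/beach.png",
    7: "backgrounds/forest.png",
    8: "backgrounds/lakeside.png",
    9: "backgrounds/starry_sky.png",
    10: "backgrounds/aurora.png",
}

def virtual_background(score: int) -> str:
    """Look up the virtual background picture for an emotion score."""
    return BACKGROUND_MAP.get(score, "backgrounds/default.png")
```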
The emotion detection module 1001 is further configured to adjust an emotion state of the current user based on the virtual experience scene.
It should be noted that, while the current user's emotional state is being adjusted based on the virtual experience scene, the wearable glasses and the healing robot can also monitor the facial expression image and hand pressure sensing data in real time, and then generate an emotion change log of the current user from the facial expression images and the emotion fluctuation analysis graph corresponding to the hand pressure sensing data, so that supervisory personnel can check the emotion change log of the current user.
In this embodiment, the emotion detection module first collects the facial expression image and hand pressure sensing data of the current user and sends them to the mobile terminal; the mobile terminal then performs feature extraction on the facial expression image to obtain expression feature information and processes the hand pressure sensing data to obtain an emotion fluctuation signal; the mobile terminal determines a virtual experience scene according to the expression feature information and the emotion fluctuation signal and sends it to the emotion detection module; finally, the emotion detection module adjusts the emotional state of the current user based on the virtual experience scene. Compared with the prior art, in which the user receives music therapy based on EEG data to adjust the emotional state, this embodiment determines the virtual experience scene from the user's expression feature information and emotion fluctuation signal and then adjusts the emotional state based on that scene, so that the user's emotional state is accurately detected and effectively adjusted, improving the user experience.
In addition, referring to fig. 4, an embodiment of the present invention further proposes an emotional state adjustment method, where the emotional state adjustment method is applied to the emotional state adjustment system described above, the emotional state adjustment system includes an emotion detection module and a mobile terminal, and the emotional state adjustment method includes the following steps:
step S40: the emotion detection module collects facial expression images and hand pressure sensing data of a current user and sends the facial expression images and the hand pressure sensing data to the mobile terminal.
It should be noted that the current user may be an autism patient. After the patient puts on the wearable glasses, the glasses shoot the facial expression image of the patient. The hand pressure sensing data are collected while the patient holds the healing robot: a pressure sensor is arranged inside the healing robot, and the patient's hand pressure sensing data can be acquired from this sensor in real time.
In a specific implementation, facial expression images of an autism patient shot by wearable glasses and hand pressure sensing data acquired by a healing robot need to be sent to a mobile terminal, and the mobile terminal can process and display the facial expression images and the hand pressure sensing data.
The healing robot can also send its appearance shape and appearance changes to the mobile terminal, so that the mobile terminal can generate a virtual character picture consistent with the healing robot, and then send the virtual character picture to the wearable glasses, so that the current user can observe the healing robot in real time through the wearable glasses.
Step S41: and the mobile terminal performs feature extraction on the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotion fluctuation signal.
The mobile terminal performs feature extraction on the facial expression image to obtain the expression feature information of the current user, which includes contour information of the eyebrows and the mouth. The mobile terminal processes the hand pressure sensing data to obtain the emotion fluctuation signal, which can be understood as a signal describing the current user's emotional changes, derived from the hand pressure sensing data.
Step S42: and the mobile terminal determines a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal, and sends the virtual experience scene to the emotion detection module.
The mobile terminal can determine the emotion score of the current user according to the emotion fluctuation signal, then determine the emotion healing strategy according to the expression characteristic information and the emotion score, and finally determine the virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
It should be noted that the emotion score may be a numeric value corresponding to the emotion fluctuation signal, such as 5 or 8, from which the emotional change of the current user is judged; the higher the emotion score, the more intense the current user's emotional state.
In a specific implementation, an emotion fluctuation analysis graph (i.e. an emotion fluctuation curve) can be generated from the emotion fluctuation signal, and the emotional change and emotion score of the current user are determined from it. For example, if the curve is high and flat, the current user's emotion is intense and the emotion score may be 9; if the curve oscillates strongly, the current user's emotion fluctuates sharply and the emotion score may be 7.
The mobile terminal can determine the emotion state grade of the current user according to the expression characteristic information and the emotion score, and then determine the emotion healing strategy according to the emotion state grade.
It should be noted that the expression state of the current user, such as sad, distressed or tense, can be determined from the expression feature information. The emotional state level may be a high level, a medium level, a low level, or the like.
The emotion healing strategy may be the duration for which the current user stays in the virtual experience scene, for example 5 min or 10 min.
Assuming that the expression feature information of the current user indicates a tense state and the emotion score is 9, the corresponding emotional state level is the high tension level, and the emotion healing strategy for that level may be 10 min, i.e. the current user stays in the virtual experience scene for 10 min using the wearable glasses.
It should be understood that the mobile terminal may also transmit the expression feature information of the current user to the healing robot, and the healing robot may display a corresponding expression state or accompanying expression state on its display screen according to that information.
In a specific implementation, the mobile terminal may further determine whether the emotion score is greater than a preset threshold, and determine the emotion state level of the current user according to the expression feature information when the emotion score is greater than the preset threshold.
The preset threshold is user-defined, for example 4. If the emotion score is 3 and the preset threshold is 4, the emotion score is smaller than the preset threshold, the current user's emotion is normal, and the emotional state level need not be determined; if the emotion score is 5 and the preset threshold is 4, the emotion score is greater than the preset threshold, the current user's emotion is tense, and the emotional state level of the current user is the low level.
In this embodiment, an emotion score of 5-6 may correspond to the low level, 7-8 to the medium level, and 9-10 to the high level, etc.
It should be noted that the mobile terminal may generate a virtual character picture according to the expression feature information, determine a virtual background picture according to the emotion fluctuation signal, and then generate a virtual experience scene according to the virtual character picture, the virtual background picture and the emotion healing strategy.
Multiple virtual background pictures may be pre-stored in the mobile terminal, which holds a preset background mapping table containing the virtual background pictures and user emotion scores in one-to-one correspondence.
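Tying steps S40 to S42 together, a minimal end-to-end sketch reusing the helper functions sketched in the system embodiment above (all function names and the duration table are assumptions):

```python
import numpy as np

def determine_virtual_experience_scene(landmarks: np.ndarray,
                                       pressure_samples: np.ndarray) -> dict:
    """End-to-end sketch: sensing data in, virtual experience scene out."""
    features = extract_expression_features(landmarks)    # step S41
    curve = fluctuation_signal(pressure_samples)         # step S41
    score = emotion_score(curve)                         # step S42
    level = emotional_state_level(score)                 # step S42
    durations_min = {"low": 5, "medium": 8, "high": 10}  # assumed healing strategy
    return {                                             # scene sent to the glasses
        "background": virtual_background(score),
        "duration_min": durations_min.get(level, 0),
        "expression_features": features,
    }
```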
Step S43: the emotion detection module adjusts the emotion state of the current user based on the virtual experience scene.
It should be noted that, while the current user's emotional state is being adjusted based on the virtual experience scene, the wearable glasses and the healing robot can also monitor the facial expression image and hand pressure sensing data in real time, and then generate an emotion change log of the current user from the facial expression images and the emotion fluctuation analysis graph corresponding to the hand pressure sensing data, so that supervisory personnel can check the emotion change log of the current user.
In this embodiment, the emotion detection module first collects the facial expression image and hand pressure sensing data of the current user and sends them to the mobile terminal; the mobile terminal then performs feature extraction on the facial expression image to obtain expression feature information and processes the hand pressure sensing data to obtain an emotion fluctuation signal; the mobile terminal determines a virtual experience scene according to the expression feature information and the emotion fluctuation signal and sends it to the emotion detection module; finally, the emotion detection module adjusts the emotional state of the current user based on the virtual experience scene. Compared with the prior art, in which the user receives music therapy based on EEG data to adjust the emotional state, this embodiment determines the virtual experience scene from the user's expression feature information and emotion fluctuation signal and then adjusts the emotional state based on that scene, so that the user's emotional state is accurately detected and effectively adjusted, improving the user experience.
It should be noted that the above-described working procedure is merely illustrative, and does not limit the scope of the present invention, and in practical application, a person skilled in the art may select part or all of them according to actual needs to achieve the purpose of the embodiment, which is not limited herein.
In addition, technical details that are not described in detail in this embodiment may refer to the emotional state adjustment system provided in any embodiment of the present invention, and are not described herein.
Furthermore, it should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described method may be implemented by means of software plus a necessary general hardware platform, and of course also by hardware, although in many cases the former is preferred. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. read-only memory (ROM)/RAM, magnetic disk, optical disk) comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (5)

1. An emotion state adjustment system is characterized by comprising an emotion detection module and a mobile terminal;
the emotion detection module is used for collecting facial expression images of a current user through a pinhole lens of wearable glasses, collecting hand pressure sensing data through a pressure sensor arranged in a healing robot, and sending the facial expression images and the hand pressure sensing data to the mobile terminal;
the mobile terminal is used for extracting features of the facial expression image to obtain expression feature information, and processing the hand pressure sensing data to obtain emotion fluctuation signals, wherein the expression feature information comprises contour information of eyebrows and mouths, and the emotion fluctuation signals are emotion change signals generated by the current user according to the hand pressure sensing data;
the mobile terminal is further used for determining the emotion score of the current user according to the emotion fluctuation signal;
the mobile terminal is further used for determining an emotion healing strategy according to the expression characteristic information and the emotion score, wherein the emotion healing strategy is duration that the current user is in a virtual experience scene;
the mobile terminal is further used for generating a virtual character picture according to the expression characteristic information, the appearance shape corresponding to the healing robot and the appearance change condition, and determining a virtual background picture according to the emotion fluctuation signal;
the mobile terminal is further used for generating a virtual experience scene according to the virtual character picture, the virtual background picture and the emotion healing strategy;
the emotion detection module is further configured to adjust an emotion state of the current user based on the virtual experience scene.
2. The system of claim 1, wherein the mobile terminal is further configured to determine an emotional state level of the current user based on the expression feature information and the emotion score;
the mobile terminal is further used for determining an emotion healing strategy according to the emotion state grade.
3. The system of claim 2, wherein the mobile terminal is further configured to determine whether the mood score is greater than a preset threshold;
and the mobile terminal is further used for determining the emotion state grade of the current user according to the expression characteristic information when the emotion score is larger than the preset threshold value.
4. A system as claimed in claim 3, wherein the mobile terminal is further adapted to analyze the emotion fluctuation signal to obtain an emotion fluctuation analysis graph;
the mobile terminal is further used for determining the emotion score of the current user according to the emotion fluctuation analysis graph.
5. The system of claim 4, wherein the mobile terminal is further configured to generate an emotion change log for the current user based on the facial expression image and the emotion fluctuation analysis graph, such that a supervisor views the emotion change log for the current user.
CN202111017756.3A 2021-08-31 2021-08-31 Emotion state adjustment system and method Active CN113633870B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111017756.3A CN113633870B (en) 2021-08-31 2021-08-31 Emotion state adjustment system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111017756.3A CN113633870B (en) 2021-08-31 2021-08-31 Emotion state adjustment system and method

Publications (2)

Publication Number Publication Date
CN113633870A CN113633870A (en) 2021-11-12
CN113633870B (en) 2024-01-23

Family

ID=78424822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111017756.3A Active CN113633870B (en) 2021-08-31 2021-08-31 Emotion state adjustment system and method

Country Status (1)

Country Link
CN (1) CN113633870B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116594511B (en) * 2023-07-17 2023-11-07 天安星控(北京)科技有限责任公司 Scene experience method and device based on virtual reality, computer equipment and medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6425764B1 (en) * 1997-06-09 2002-07-30 Ralph J. Lamson Virtual reality immersion therapy for treating psychological, psychiatric, medical, educational and self-help problems
CN103654798A (en) * 2013-12-11 2014-03-26 四川大学华西医院 Method and device for monitoring and recording emotion
CN106620990A (en) * 2016-11-24 2017-05-10 深圳创达云睿智能科技有限公司 Method and device for monitoring mood
CN106955113A (en) * 2017-05-10 2017-07-18 安徽徽韵心理咨询有限公司 A kind of intelligent hug guiding instrument of reaction type
CN109271018A (en) * 2018-08-21 2019-01-25 北京光年无限科技有限公司 Exchange method and system based on visual human's behavioral standard
CN109571494A (en) * 2018-11-23 2019-04-05 北京工业大学 Emotion identification method, apparatus and pet robot
CN109620185A (en) * 2019-01-31 2019-04-16 山东大学 Self-closing disease assistant diagnosis system, equipment and medium based on multi-modal information
CN109683709A (en) * 2018-12-17 2019-04-26 苏州思必驰信息科技有限公司 Man-machine interaction method and system based on Emotion identification
CN110404148A (en) * 2019-08-07 2019-11-05 上海市精神卫生中心(上海市心理咨询培训中心) A kind of self-closing disease mood based on virtual reality technology pacifies method and system
CN111531552A (en) * 2020-03-13 2020-08-14 华南理工大学 Psychological accompanying robot and emotion support method
KR102280722B1 (en) * 2020-02-20 2021-07-22 공영일 Healing Chair Device for Mental Health Care Service

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11393252B2 (en) * 2019-05-01 2022-07-19 Accenture Global Solutions Limited Emotion sensing artificial intelligence

Also Published As

Publication number Publication date
CN113633870A (en) 2021-11-12

Similar Documents

Publication Publication Date Title
CN110890140B (en) Virtual reality-based autism rehabilitation training and capability assessment system and method
US20200401216A1 (en) Technique for controlling virtual image generation system using emotional states of user
TWI658377B (en) Robot assisted interaction system and method thereof
CN106730238B (en) Environment self-adaptive intelligent sleep assisting device and method
CN105022929B (en) A kind of cognition accuracy analysis method of personal traits value test
CN109585021B (en) Mental state evaluation method based on holographic projection technology
CN109298779B (en) Virtual training system and method based on virtual agent interaction
CN104298722B (en) Digital video interactive and its method
US10249391B2 (en) Representation of symptom alleviation
CN111063416A (en) Alzheimer disease rehabilitation training and capability assessment system based on virtual reality
CN107463780A (en) A kind of virtual self-closing disease treatment system of 3D and treatment method
CN106362260B (en) VR mood regulation device
JP2012524596A (en) Nasal flow device controller
CN110363129B (en) Early autism screening system based on smiling paradigm and audio-video behavior analysis
Rizzo et al. Performance-driven facial animation: basic research on human judgments of emotional state in facial avatars
JP2017153887A (en) Psychosomatic state estimation apparatus, psychosomatic state estimation method, and eyewear
CN113633870B (en) Emotion state adjustment system and method
CN113837153A (en) Real-time emotion recognition method and system integrating pupil data and facial expressions
JP2021176111A (en) Output control apparatus, output control method, and program
CN109816601A (en) A kind of image processing method and terminal device
JP2020173787A (en) Information processing apparatus, information processing system, information processing method, and information processing program
CN111078007A (en) Virtual reality-based whoop catharsis training method and device
CN109172994A (en) A kind of naked eye 3D filming image display system
JP2018180503A (en) Public speaking assistance device and program
KR102430067B1 (en) Virtual reality based cognitive rehabilitation training system and method with training

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant