CN113633870A - Emotional state adjusting system and method - Google Patents


Info

Publication number
CN113633870A
CN113633870A
Authority
CN
China
Prior art keywords
emotion
mobile terminal
current user
emotional
virtual experience
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111017756.3A
Other languages
Chinese (zh)
Other versions
CN113633870B (en)
Inventor
李莉
熊曾艳
鲍禹璇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Polytechnic University
Original Assignee
Wuhan Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Polytechnic University filed Critical Wuhan Polytechnic University
Priority to CN202111017756.3A priority Critical patent/CN113633870B/en
Publication of CN113633870A publication Critical patent/CN113633870A/en
Application granted granted Critical
Publication of CN113633870B publication Critical patent/CN113633870B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005 Other devices or methods to cause a change in the state of consciousness by the use of a particular sense, or stimulus
    • A61M2021/0027 ... by the hearing sense
    • A61M2021/0044 ... by the sight sense
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Psychiatry (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Psychology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Social Psychology (AREA)
  • Anesthesiology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Hematology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an emotional state adjusting system and method. The system comprises an emotion detection module and a mobile terminal. The emotion detection module acquires a facial expression image and hand pressure sensing data of the current user and sends both to the mobile terminal; the mobile terminal performs feature extraction on the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotional fluctuation signal; the mobile terminal then determines a virtual experience scene according to the expression feature information and the emotional fluctuation signal and sends the virtual experience scene to the emotion detection module; finally, the emotion detection module adjusts the current user's emotional state based on the virtual experience scene. Because the virtual experience scene is determined from both the user's expression feature information and emotional fluctuation signal, and the emotional state is then adjusted through that scene, the user's emotional state can be accurately detected and effectively adjusted.

Description

Emotional state adjusting system and method
Technical Field
The invention relates to the technical field of Internet of things, in particular to an emotional state adjusting system and method.
Background
Accurate recognition of emotional characteristics aids the development of follow-up training systems, for example the recognition of emotional characteristics in autism. At present, clinical treatment mostly relies on comprehensive intervention methods such as special education and psychological-behavioral therapy. In the prior art, emotional electroencephalogram (EEG) data are collected, visualized, and displayed on an interface, and music therapy is then administered to autistic children according to the EEG data. However, EEG data alone cannot accurately capture the emotional state of autistic children, and music therapy alone cannot effectively adjust the user's emotional state. How to accurately detect the user's emotional state and effectively adjust it is therefore an urgent technical problem to be solved.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide an emotional state adjusting system and method, so as to solve the technical problem of how to accurately detect the emotional state of a user and effectively adjust it.
To achieve the above object, the invention provides an emotional state adjusting system, which comprises an emotion detection module and a mobile terminal;
the emotion detection module is used for acquiring a facial expression image and hand pressure sensing data of a current user and sending the facial expression image and the hand pressure sensing data to the mobile terminal;
the mobile terminal is used for extracting the characteristics of the facial expression image to obtain expression characteristic information, and processing the hand pressure sensing data to obtain an emotional fluctuation signal;
the mobile terminal is further used for determining a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal and sending the virtual experience scene to the emotion detection module;
the emotion detection module is further used for adjusting the emotion state of the current user based on the virtual experience scene.
Preferably, the mobile terminal is further configured to determine an emotion score of the current user according to the emotion fluctuation signal;
the mobile terminal is further used for determining an emotion healing strategy according to the expression characteristic information and the emotion score;
the mobile terminal is further used for determining a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
Preferably, the mobile terminal is further configured to determine an emotional state level of the current user according to the expression feature information and the emotion score;
the mobile terminal is further used for determining an emotion healing strategy according to the emotion state grade.
Preferably, the mobile terminal is further configured to determine whether the emotion score is greater than a preset threshold;
the mobile terminal is further used for determining the emotion state grade of the current user according to the expression characteristic information when the emotion score is larger than the preset threshold value.
Preferably, the mobile terminal is further configured to analyze the emotional fluctuation signal to obtain an emotional fluctuation analysis chart;
the mobile terminal is further used for determining the emotion score of the current user according to the emotion fluctuation analysis chart.
Preferably, the mobile terminal is further configured to generate a virtual character picture according to the expression feature information, and determine a virtual background picture according to the emotional fluctuation signal;
the mobile terminal is further used for generating a virtual experience scene according to the virtual character picture, the virtual background picture and the emotion healing strategy.
Preferably, the mobile terminal is further configured to generate an emotion change log of the current user according to the facial expression image and the emotion fluctuation analysis chart, so that a supervisor can view the emotion change log of the current user.
In addition, in order to achieve the above object, the present invention further provides an emotional state adjusting method, where the emotional state adjusting method is applied to the emotional state adjusting system, where the emotional state adjusting system includes an emotion detection module and a mobile terminal, and the emotional state adjusting method includes the following steps:
the emotion detection module acquires a facial expression image and hand pressure sensing data of a current user and sends the facial expression image and the hand pressure sensing data to the mobile terminal;
the mobile terminal extracts the features of the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotional fluctuation signal;
the mobile terminal determines a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal and sends the virtual experience scene to the emotion detection module;
the emotion detection module adjusts the emotional state of the current user based on the virtual experience scene.
Preferably, the step of determining a virtual experience scene according to the expression feature information and the emotional fluctuation signal includes:
the mobile terminal determines the emotion score of the current user according to the emotion fluctuation signal;
the mobile terminal determines an emotion healing strategy according to the expression characteristic information and the emotion score;
and the mobile terminal determines a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
Preferably, the step of determining an emotion healing strategy according to the expression feature information and the emotion score includes:
the mobile terminal determines the emotion state grade of the current user according to the expression characteristic information and the emotion score;
and the mobile terminal determines an emotion healing strategy according to the emotion state grade.
According to the method, the emotion detection module first collects the facial expression image and hand pressure sensing data of the current user and sends them to the mobile terminal. The mobile terminal then performs feature extraction on the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotional fluctuation signal; the mobile terminal determines a virtual experience scene according to the expression feature information and the emotional fluctuation signal and sends the scene to the emotion detection module. Finally, the emotion detection module adjusts the current user's emotional state based on the virtual experience scene. Compared with the prior art, in which music therapy is administered to the user according to EEG data to adjust the emotional state, determining the virtual experience scene from the user's expression feature information and emotional fluctuation signal, and then adjusting the emotional state through that scene, detects the user's emotional state accurately, adjusts it effectively, and improves the user experience.
Drawings
FIG. 1 is a schematic flow chart of a first embodiment of the emotional state adjustment system of the invention;
FIG. 2 is a diagram of a wearable glasses model in accordance with a first embodiment of the emotional state adjustment system of the invention;
FIG. 3 is a diagram of a healing robot model according to a first embodiment of the emotional state adjustment system of the invention;
fig. 4 is a flowchart illustrating an emotional state adjustment method according to a first embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a first embodiment of an emotional state adjustment system according to an embodiment of the present invention.
In this embodiment, the emotional state adjustment system includes an emotion detection module 1001 and a mobile terminal 1002; the emotion detection module 1001 is configured to collect a facial expression image and hand pressure sensing data of a current user, and send the facial expression image and the hand pressure sensing data to the mobile terminal 1002; the mobile terminal 1002 is configured to perform feature extraction on the facial expression image to obtain expression feature information, and process the hand pressure sensing data to obtain an emotional fluctuation signal; the mobile terminal 1002 is further configured to determine a virtual experience scene according to the expression feature information and the emotion fluctuation signal, and send the virtual experience scene to the emotion detection module 1001; the emotion detection module 1001 is further configured to adjust an emotion state of the current user based on the virtual experience scene.
It should be noted that the main executing bodies of the scheme are the emotion detection module 1001 and the mobile terminal 1002, which interact with each other. The emotion detection module comprises wearable glasses and a healing robot. It should be understood that the wearable glasses collect the current user's facial expression image, and the healing robot collects the current user's hand pressure sensing data. The mobile terminal processes the collected facial expression image and hand pressure sensing data to obtain expression feature information and an emotional fluctuation signal, then generates a virtual experience scene according to the expression feature information and the emotional fluctuation signal, and finally transmits the virtual experience scene to the wearable glasses for the current user to experience.
Referring to fig. 2, fig. 2 is a diagram of the wearable glasses model according to the first embodiment of the emotional state adjustment system of the invention. The structure of the wearable glasses is as follows: the shell is made of ABS plastic or carbon fiber, and the glasses activate intelligently once worn. The Virtual Reality (VR) mask is a VR display system formed by optical lenses and an Organic Light-Emitting Diode (OLED) screen. A pinhole lens is built into the mask to monitor or photograph the current user's facial expression image in real time. The wearable glasses are divided into two subsystems, a VR experience system and a monitoring system, both of which are integrated with the background or the mobile terminal to achieve remote real-time monitoring of emotional fluctuations and changes.
In a specific implementation, after the wearable smart glasses are worn, different training is provided through the virtual experience scene sent by the mobile terminal: realistic scenes are presented to train the emotion-recognition ability of autistic children, the wearer's emotional changes are recorded during training, and the record is finally displayed on the mobile terminal as feedback.
Referring to fig. 3, fig. 3 is a model diagram of the healing robot according to the first embodiment of the emotional state adjustment system of the invention. The healing robot takes the form of a lovable cartoon figure, so that autistic children feel at ease approaching it. By accompanying the children, the healing robot can play games with them and train their motor recovery, and it can adjust its sound and light according to the children's needs to better accompany and heal them.
The healing robot's exterior combines food-grade environmentally friendly silica gel and plastic panels. The body is made of plastic plates, while the four limbs are made of environmentally friendly silica gel and can be pulled freely, enhancing the overall sense of fun. The eye area uses a soft dot-matrix LED screen that can display a variety of expressive visuals. Internal pressure monitoring uses a pressure sensor module together with an intelligent chip. The head uses block-based touch and voice control: touching different plates triggers different functions, such as playing music or changing the lights. The robot can thus interact emotionally with children, making the whole product feel more lively.
After the healing robot receives the electrical signal from the pressure sensor, the sensor transmits the signal to the intelligent chip, and the single-chip microcomputer first measures the pressure intensity. Using these data as a reference, the intelligent chip drives the light or music system; the current pressure data are also displayed on the intelligent terminal in a graphical interface, and can be converted into a data form presented to the background.
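The pressure-handling flow just described can be sketched as follows. This is an illustrative assumption only: the function names, voltage thresholds, and light/music mappings are invented for the sketch and are not specified in the patent.

```python
# Hypothetical sketch: the chip reads the pressure sensor, gauges the
# pressure intensity, drives the light/music system, and forwards the
# data to the intelligent terminal. All names/thresholds are assumed.

def read_pressure_mv() -> float:
    """Stand-in for the pressure sensor's ADC reading (millivolts)."""
    return 612.0  # fixed sample value for the sketch

def classify_intensity(mv: float) -> str:
    """Map a raw reading to a coarse pressure-intensity grade."""
    if mv < 300:
        return "light"
    if mv < 700:
        return "medium"
    return "strong"

def dispatch(mv: float) -> dict:
    """Route the reading to the light/music subsystems and the terminal."""
    grade = classify_intensity(mv)
    return {
        "light_pattern": {"light": "soft_glow", "medium": "pulse", "strong": "calm_fade"}[grade],
        "music_track": {"light": "ambient", "medium": "lullaby", "strong": "slow_breathing"}[grade],
        "terminal_payload": {"raw_mv": mv, "grade": grade},  # shown on the terminal UI
    }

result = dispatch(read_pressure_mv())
print(result["terminal_payload"]["grade"])  # medium
```

In a real device the dispatch tables would live on the intelligent chip and the terminal payload would be sent over a wireless link rather than returned in-process.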
In this embodiment, the wearable glasses collect facial expression images of the current user, an autistic child, in real time through the pinhole lens, and send them to the mobile terminal. The child holds the healing robot, and the mobile terminal determines the robot's accompanying expressions and actions according to the hand pressure sensor data and the facial expression images. The mobile terminal forms a virtual character picture from the healing robot's accompanying expressions and actions and projects it onto the wearable glasses, so the child can observe those expressions and actions in real time. The mobile terminal also generates a virtual background picture from the child's facial expression images and the emotional fluctuation signal corresponding to the hand pressure sensing data, combines the virtual background picture with the virtual character picture, and sends the result to the wearable glasses; through this virtual experience scene, the autistic child can practice interacting with people as in a real-life scene.
The emotion detection module 1001 is configured to collect facial expression images and hand pressure sensing data of a current user, and send the facial expression images and the hand pressure sensing data to the mobile terminal 1002.
It should be noted that the current user may be an autistic patient. After the patient puts on the wearable glasses, the glasses photograph the patient's facial expression image. The hand pressure sensing data come from the healing robot held in the patient's hand: the robot contains an internal pressure sensor, which collects the patient's hand pressure sensing data in real time.
In a specific implementation, the facial expression image of the autistic patient photographed by the wearable glasses and the hand pressure sensing data collected by the healing robot are sent to the mobile terminal, which can process and display them.
The healing robot can also send its appearance and any changes in its appearance to the mobile terminal, so that the mobile terminal generates a virtual character picture consistent with the healing robot and sends it to the wearable glasses, allowing the current user to observe the healing robot in real time through the glasses.
The mobile terminal 1002 is configured to perform feature extraction on the facial expression image to obtain expression feature information, and process the hand pressure sensing data to obtain an emotional fluctuation signal.
The mobile terminal may perform feature extraction on the facial expression image to obtain expression feature information. It should be noted that this is the expression feature information of the current user, and it includes contour information of the eyebrows and mouth. The mobile terminal processes the hand pressure sensing data to obtain an emotional fluctuation signal, which can be understood as a signal of the current user's emotional changes derived from the hand pressure sensing data.
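The two processing steps above can be illustrated as follows. This is not the patented algorithm: the landmark format, the specific contour measures, and the moving-average smoothing are assumptions chosen to make the idea concrete.

```python
# Illustrative sketch: reduce eyebrow/mouth landmarks to contour features,
# and smooth raw hand-pressure samples into an emotional fluctuation curve.
# Landmark format and smoothing method are assumptions, not the patent's.

def extract_expression_features(landmarks: dict) -> dict:
    """Reduce (x, y) landmark points to the contour cues named in the text."""
    brows = landmarks["eyebrows"]
    mouth = landmarks["mouth"]
    return {
        "brow_height": sum(y for _, y in brows) / len(brows),          # mean eyebrow height
        "mouth_width": max(x for x, _ in mouth) - min(x for x, _ in mouth),  # mouth span
    }

def fluctuation_signal(pressure: list[float], window: int = 3) -> list[float]:
    """Moving-average smoothing of pressure samples into a fluctuation curve."""
    out = []
    for i in range(len(pressure) - window + 1):
        out.append(sum(pressure[i:i + window]) / window)
    return out

print(fluctuation_signal([1.0, 2.0, 3.0, 4.0, 5.0]))  # [2.0, 3.0, 4.0]
```

In practice the landmarks would come from a face-landmark detector running on the glasses' camera frames, and the pressure samples from the robot's sensor at a fixed sampling rate.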
The mobile terminal 1002 is further configured to determine a virtual experience scene according to the expression feature information and the emotion fluctuation signal, and send the virtual experience scene to the emotion detection module 1001.
The mobile terminal can determine the emotion score of the current user according to the emotion fluctuation signal, then determine an emotion healing strategy according to the expression characteristic information and the emotion score, and finally determine a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
The emotion score can be understood as a score value corresponding to the emotional fluctuation signal; the change in the current user's emotion is then judged from this value. The emotion score can be represented by a number, such as 5 or 8, and the higher the score, the stronger the current user's emotional state.
In a specific implementation, an emotional fluctuation analysis chart, that is, an emotional fluctuation curve, may be generated from the emotional fluctuation signal, and the change in the current user's emotion and the emotion score are determined from the curve. For example, if the curve is high and steep, the current user's emotion is strong and the emotion score may be 9; if the curve is lower and gentler, the emotion score may be 7, and so on.
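One way to turn the fluctuation curve into a score, as described above, is to score the curve's peak height and steepness. The weights, scale, and clamping below are illustrative assumptions; the patent does not specify a formula.

```python
# Hedged sketch: higher, steeper fluctuation curves yield higher emotion
# scores, clamped to a 1-10 range. Weights (0.7 / 0.3) are assumptions.

def emotion_score(curve: list[float]) -> int:
    peak = max(curve)
    steepness = max(abs(curve[i + 1] - curve[i]) for i in range(len(curve) - 1))
    raw = 0.7 * peak + 0.3 * steepness * 10   # weight curve height over slope
    return max(1, min(10, round(raw)))        # clamp to the 1-10 score range

print(emotion_score([8.0, 9.0, 8.5]))  # high and fairly steep -> 9
```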
The mobile terminal can determine the emotional state level of the current user according to the expression feature information and the emotion score, and then determine the emotion healing strategy according to the emotional state level.
It should be noted that the current user's expression state, such as sadness, distress, or nervousness, can be identified from the expression feature information. The emotional state level can be a high, medium, or low emotional state level, and so on.
The emotion healing strategy can be the length of time the current user spends in the virtual experience scene, for example 5 min or 10 min.
Assuming the current user's expression feature information indicates a nervous state and the emotion score is 9, the user's corresponding emotional state level is high nervousness, and the emotion healing strategy for that level may be 10 min; that is, the current user stays in the virtual experience scene through the wearable glasses for 10 min.
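The example above, expression state plus emotion score yielding a state level and then a scene duration, can be sketched as a small lookup. The table values and score bands are illustrative assumptions consistent with the 10-min example; they are not specified by the patent beyond that example.

```python
# Hypothetical strategy lookup: (expression state, score) -> level -> minutes.
# Only the "nervous"/score-9 -> 10 min pairing comes from the text.

DURATION_MIN = {  # emotional state level -> minutes in the virtual scene
    "nervous-high": 10,
    "nervous-medium": 7,
    "nervous-low": 5,
}

def healing_strategy(expression_state: str, score: int) -> int:
    if score >= 9:
        level = f"{expression_state}-high"
    elif score >= 7:
        level = f"{expression_state}-medium"
    else:
        level = f"{expression_state}-low"
    return DURATION_MIN[level]

print(healing_strategy("nervous", 9))  # 10
```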
It should be understood that the mobile terminal may also transmit the expression feature information of the current user to the healing robot, and the healing robot may display the corresponding expression state or accompanying expression state on the display screen according to the expression feature information of the current user.
In specific implementation, the mobile terminal may further determine whether the emotion score is greater than a preset threshold, and determine the current emotion state level of the user according to the expression feature information when the emotion score is greater than the preset threshold.
The preset threshold is user-defined and may be, for example, 4. If the emotion score is 3 and the preset threshold is 4, the score is below the threshold, the current user's emotion is normal, and the emotional state level does not need to be determined; if the emotion score is 5 and the preset threshold is 4, the score exceeds the threshold, the current user is nervous, and the user's emotional state level is low.
In this embodiment, an emotion score in the interval 5-6 may correspond to a low emotional state, 7-8 to a medium emotional state, 9-10 to a high emotional state, and so on.
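The threshold check and the score intervals above can be combined into one function. The threshold of 4 and the 5-6 / 7-8 / 9-10 bands come from the text; the function shape is a minimal illustration.

```python
# Sketch of the embodiment's logic: scores at or below the preset
# threshold are treated as normal emotion (no level determined);
# higher scores fall into the low/medium/high bands from the text.

PRESET_THRESHOLD = 4

def emotional_state_level(score: int):
    """Return the emotional state level, or None when emotion is normal."""
    if score <= PRESET_THRESHOLD:
        return None          # not above threshold: no level needs determining
    if 5 <= score <= 6:
        return "low"
    if 7 <= score <= 8:
        return "medium"
    if 9 <= score <= 10:
        return "high"
    return None

print(emotional_state_level(3))  # None (normal emotion)
print(emotional_state_level(5))  # low
print(emotional_state_level(9))  # high
```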
It should be noted that the mobile terminal may generate a virtual character picture according to the expression feature information, determine a virtual background picture according to the emotional fluctuation signal, and then generate a virtual experience scene according to the virtual character picture, the virtual background picture, and the emotion healing strategy.
A plurality of virtual background pictures can be pre-stored on the mobile terminal. It should be noted that a preset background mapping table exists on the mobile terminal, containing a plurality of virtual background pictures and a plurality of user emotion scores in one-to-one correspondence.
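Such a one-to-one mapping table can be sketched as a simple dictionary lookup. The file names, the score set, and the fallback behaviour are illustrative assumptions; only the one-to-one score-to-picture correspondence comes from the text.

```python
# Hypothetical preset background mapping table: emotion score -> picture.
# File names are invented; the one-to-one mapping is what the text states.

BACKGROUND_TABLE = {
    5: "meadow.png",
    6: "lakeside.png",
    7: "forest.png",
    8: "beach_sunset.png",
    9: "starry_sky.png",
    10: "quiet_room.png",
}

def virtual_background(score: int) -> str:
    # One-to-one lookup; unknown scores fall back to a neutral default.
    return BACKGROUND_TABLE.get(score, "neutral.png")

print(virtual_background(9))  # starry_sky.png
```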
The emotion detection module 1001 is further configured to adjust an emotion state of the current user based on the virtual experience scene.
It should be noted that when the current user adjusts the emotional state based on the virtual experience scene, the wearable glasses and the healing robot may also monitor the facial expression image and the hand pressure sensing data in real time, and then generate the emotion change log of the current user according to the emotion fluctuation analysis chart corresponding to the facial expression image and the hand pressure sensing data, so that the supervisor can view the emotion change log of the current user.
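Assembling the emotion change log mentioned above from timestamped monitoring records might look like the following. The record fields are illustrative assumptions; the patent only states that a log is generated for a supervisor to view.

```python
# Sketch: append timestamped emotion records to a change log that a
# supervisor can later review. Record format is an assumption.

from datetime import datetime

def append_log_entry(log: list, score: int, expression: str) -> list:
    log.append({
        "time": datetime.now().isoformat(timespec="seconds"),
        "emotion_score": score,
        "expression_state": expression,
    })
    return log

log: list = []
append_log_entry(log, 9, "nervous")
append_log_entry(log, 6, "calm")
print(len(log), log[-1]["expression_state"])  # 2 calm
```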
In this embodiment, the emotion detection module first collects the facial expression image and hand pressure sensing data of the current user and sends them to the mobile terminal. The mobile terminal then performs feature extraction on the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotional fluctuation signal; the mobile terminal determines a virtual experience scene according to the expression feature information and the emotional fluctuation signal and sends the scene to the emotion detection module. Finally, the emotion detection module adjusts the current user's emotional state based on the virtual experience scene. Compared with the prior art, in which music therapy is administered to the user according to EEG data to adjust the emotional state, determining the virtual experience scene from the user's expression feature information and emotional fluctuation signal, and then adjusting the emotional state through that scene, detects the user's emotional state accurately, adjusts it effectively, and improves the user experience.
In addition, referring to fig. 4, an embodiment of the present invention further provides an emotional state adjustment method, where the emotional state adjustment method is applied to the emotional state adjustment system described above, where the emotional state adjustment system includes an emotion detection module and a mobile terminal, and the emotional state adjustment method includes the following steps:
step S40: the emotion detection module collects facial expression images and hand pressure sensing data of a current user and sends the facial expression images and the hand pressure sensing data to the mobile terminal.
It should be noted that the current user may be an autism patient. After the autism patient puts on the wearable glasses, the glasses capture the patient's facial expression image. The hand pressure sensing data comes from the healing robot held by the patient: the robot has an internal pressure sensor that collects the patient's hand pressure sensing data in real time.
In a specific implementation, the facial expression image captured by the wearable glasses and the hand pressure sensing data collected by the healing robot need to be sent to the mobile terminal, which can then process and display them.
The healing robot can also send its appearance shape and appearance changes to the mobile terminal, so that the mobile terminal generates a virtual character picture consistent with the healing robot from that information and then sends the picture to the wearable glasses, allowing the current user to observe the healing robot in real time through the glasses.
Step S41: the mobile terminal performs feature extraction on the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotion fluctuation signal.
The mobile terminal may perform feature extraction on the facial expression image to obtain expression feature information. It should be noted that the expression feature information is that of the current user and includes contour information of the eyebrows and mouth. The mobile terminal also processes the hand pressure sensing data to obtain an emotion fluctuation signal, which can be understood as a signal of the current user's emotional changes derived from the hand pressure sensing data.
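The patent does not specify how the raw pressure samples are turned into a fluctuation signal; as one hedged illustration (the moving-average approach and window size are assumptions, not the patent's method), the samples could be smoothed like this:

```python
def fluctuation_signal(pressure_samples, window=5):
    """Smooth raw hand-pressure samples into an emotion fluctuation
    signal using a simple moving average (window size is illustrative)."""
    if len(pressure_samples) < window:
        return list(pressure_samples)
    return [
        sum(pressure_samples[i:i + window]) / window
        for i in range(len(pressure_samples) - window + 1)
    ]
```

Smoothing of this kind would suppress single-sample spikes so that later analysis reflects sustained changes in grip pressure rather than sensor noise.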
Step S42: the mobile terminal determines a virtual experience scene according to the expression feature information and the emotion fluctuation signal, and sends the virtual experience scene to the emotion detection module.
The mobile terminal can determine the emotion score of the current user according to the emotion fluctuation signal, then determine an emotion healing strategy according to the expression characteristic information and the emotion score, and finally determine a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
The emotion score can be understood as a score value corresponding to the emotion fluctuation signal, from which the change in the current user's emotion is judged. The emotion score can be represented by a number, such as 5 or 8; the higher the score, the stronger the current user's emotional state.
In a specific implementation, an emotion fluctuation analysis chart, that is, an emotion fluctuation curve graph, may be generated from the emotion fluctuation signal, and the change in the current user's emotion and the emotion score are determined from that curve. For example, if the curve is high and fluctuates sharply, the current user's emotion is strong and the emotion score may be 9; if the curve is high but gentle, the emotion score may be 7.
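A hedged sketch of how such a score might be computed from the fluctuation signal (the 0-1 normalization of pressure values, the thresholds, and the specific scores are all assumptions; the patent only gives the qualitative rule):

```python
def emotion_score(signal):
    """Score an emotion fluctuation signal on a 0-10 scale from its
    level (mean) and volatility (largest step between samples).
    All thresholds and score values are illustrative assumptions."""
    mean = sum(signal) / len(signal)
    volatility = max(abs(b - a) for a, b in zip(signal, signal[1:]))
    if mean > 0.8 and volatility > 0.3:  # curve high and sharply fluctuating
        return 9
    if mean > 0.8:                       # curve high but gentle
        return 7
    return 3                             # curve low: emotion is calm
```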
The mobile terminal can determine the emotional state level of the current user according to the expression feature information and the emotion score, and then determine the emotion healing strategy according to that level.
It should be noted that the current user's expression state, such as sadness, distress, or nervousness, can be identified from the expression feature information. The emotional state level can be high, medium, or low, among others.
The emotion healing strategy can be the length of time the current user spends in the virtual experience scene, for example 5 min or 10 min.
Assuming the current user's expression feature information indicates a nervous state and the emotion score is 9, the corresponding emotional state level is a high nervous level, and the emotion healing strategy for that level may be 10 min; that is, the current user stays in the virtual experience scene for 10 min using the wearable glasses.
It should be understood that the mobile terminal may also transmit the expression feature information of the current user to the healing robot, and the healing robot may display the corresponding expression state or accompanying expression state on the display screen according to the expression feature information of the current user.
In specific implementation, the mobile terminal may further determine whether the emotion score is greater than a preset threshold, and determine the current emotion state level of the user according to the expression feature information when the emotion score is greater than the preset threshold.
The preset threshold is user-defined and can be, for example, 4. If the emotion score is 3 and the preset threshold is 4, the score is below the threshold, the current user's emotion is normal, and there is no need to determine the emotional state level. If the emotion score is 5 and the preset threshold is 4, the score exceeds the threshold, the current user is nervous, and the emotional state level is determined to be low.
In this embodiment, emotion scores of 5-6 may correspond to a low emotional state, scores of 7-8 to a medium emotional state, scores of 9-10 to a high emotional state, and so on.
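The threshold check and score bands described above can be sketched as follows; the per-level durations are assumptions loosely based on the 5 min / 10 min examples in the text, not values given by the patent:

```python
def emotional_state_level(score, threshold=4):
    """Map an emotion score to a state level using the bands in the
    description (5-6 low, 7-8 medium, 9-10 high); scores at or below
    the preset threshold mean the emotion is normal (no level needed)."""
    if score <= threshold:
        return None
    if score <= 6:
        return "low"
    if score <= 8:
        return "medium"
    return "high"

def healing_duration_min(level):
    """Illustrative healing-strategy duration (minutes) per level;
    the specific numbers are assumptions."""
    return {"low": 5, "medium": 5, "high": 10}.get(level, 0)
```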
It should be noted that the mobile terminal may generate a virtual character picture from the expression feature information, determine a virtual background picture from the emotion fluctuation signal, and then generate the virtual experience scene from the virtual character picture, the virtual background picture, and the emotion healing strategy.
A plurality of virtual background pictures may be pre-stored in the mobile terminal. It should be noted that the mobile terminal holds a preset background mapping relation table containing a plurality of virtual background pictures and a plurality of user emotion scores, with a one-to-one correspondence between the pictures and the scores.
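A hedged sketch of the one-to-one mapping table and the scene assembly (the picture file names and the dictionary structure are hypothetical; the patent only states that the table maps each emotion score to one background picture):

```python
# Hypothetical preset background mapping relation table: each user
# emotion score corresponds one-to-one to a stored background picture.
PRESET_BACKGROUND_TABLE = {
    5: "forest.png", 6: "lake.png", 7: "meadow.png",
    8: "beach.png", 9: "starry_sky.png", 10: "aurora.png",
}

def virtual_experience_scene(character_pic, emotion_score, duration_min):
    """Assemble a virtual experience scene from the virtual character
    picture, the background looked up by emotion score, and the
    healing-strategy duration (the scene structure is illustrative)."""
    return {
        "character": character_pic,
        "background": PRESET_BACKGROUND_TABLE.get(emotion_score),
        "duration_min": duration_min,
    }
```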
Step S43: the emotion detection module adjusts the emotional state of the current user based on the virtual experience scene.
It should be noted that while the current user adjusts the emotional state based on the virtual experience scene, the wearable glasses and the healing robot may also monitor the facial expression image and the hand pressure sensing data in real time, and then generate an emotion change log of the current user from the facial expression image and the emotion fluctuation analysis chart corresponding to the hand pressure sensing data, so that a supervisor can view the current user's emotion change log.
In this embodiment, the emotion detection module first collects the facial expression image and the hand pressure sensing data of the current user and sends them to the mobile terminal. The mobile terminal then performs feature extraction on the facial expression image to obtain expression feature information and processes the hand pressure sensing data to obtain an emotion fluctuation signal. Next, the mobile terminal determines a virtual experience scene from the expression feature information and the emotion fluctuation signal and sends the scene to the emotion detection module. Finally, the emotion detection module adjusts the emotional state of the current user based on the virtual experience scene. Compared with the prior art, in which music therapy is applied according to a user's electroencephalogram data to adjust the emotional state, this scheme determines a virtual experience scene from the user's expression feature information and emotion fluctuation signal and then adjusts the emotional state based on that scene, so the user's emotional state is detected accurately and adjusted effectively, improving the user experience.
It should be noted that the above-described work flows are only exemplary, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of them to achieve the purpose of the solution of the embodiment according to actual needs, and the present invention is not limited herein.
In addition, the technical details that are not described in detail in this embodiment may be referred to the emotional state adjustment system provided in any embodiment of the present invention, and are not described herein again.
Further, it is to be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. Read Only Memory (ROM)/RAM, magnetic disk, optical disk), and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An emotional state adjustment system, characterized in that the system comprises an emotion detection module and a mobile terminal;
the emotion detection module is used for acquiring a facial expression image and hand pressure sensing data of a current user and sending the facial expression image and the hand pressure sensing data to the mobile terminal;
the mobile terminal is used for extracting the characteristics of the facial expression image to obtain expression characteristic information, and processing the hand pressure sensing data to obtain an emotional fluctuation signal;
the mobile terminal is further used for determining a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal and sending the virtual experience scene to the emotion detection module;
the emotion detection module is further used for adjusting the emotion state of the current user based on the virtual experience scene.
2. The system of claim 1, wherein the mobile terminal is further configured to determine an emotion score for the current user based on the emotional fluctuation signal;
the mobile terminal is further used for determining an emotion healing strategy according to the expression characteristic information and the emotion score;
the mobile terminal is further used for determining a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
3. The system of claim 2, wherein the mobile terminal is further configured to determine a level of an emotional state of the current user based on the expressive feature information and the emotion score;
the mobile terminal is further used for determining an emotion healing strategy according to the emotion state grade.
4. The system of claim 3, wherein the mobile terminal is further configured to determine whether the sentiment score is greater than a preset threshold;
the mobile terminal is further used for determining the emotion state grade of the current user according to the expression characteristic information when the emotion score is larger than the preset threshold value.
5. The system of claim 2, wherein the mobile terminal is further configured to analyze the mood fluctuation signal to obtain a mood fluctuation analysis map;
the mobile terminal is further used for determining the emotion score of the current user according to the emotion fluctuation analysis chart.
6. The system of claim 2, wherein the mobile terminal is further configured to generate a virtual character picture according to the expressive feature information, and determine a virtual background picture according to the emotional fluctuation signal;
the mobile terminal is further used for generating a virtual experience scene according to the virtual character picture, the virtual background picture and the emotion healing strategy.
7. The system of claim 5, wherein the mobile terminal is further configured to generate an emotion change log of the current user based on the facial expression image and the emotion fluctuation analysis chart, so that a supervisor can view the emotion change log of the current user.
8. An emotional state adjustment method, applied to the emotional state adjustment system according to any one of claims 1 to 7, the emotional state adjustment system including an emotion detection module and a mobile terminal, the emotional state adjustment method comprising:
the emotion detection module acquires a facial expression image and hand pressure sensing data of a current user and sends the facial expression image and the hand pressure sensing data to the mobile terminal;
the mobile terminal extracts the features of the facial expression image to obtain expression feature information, and processes the hand pressure sensing data to obtain an emotional fluctuation signal;
the mobile terminal determines a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal and sends the virtual experience scene to the emotion detection module;
the emotion detection module adjusts the emotional state of the current user based on the virtual experience scene.
9. The emotional state adjustment method of claim 8, wherein the step of determining a virtual experience scene according to the expression characteristic information and the emotion fluctuation signal comprises:
the mobile terminal determines the emotion score of the current user according to the emotion fluctuation signal;
the mobile terminal determines an emotion healing strategy according to the expression characteristic information and the emotion score;
and the mobile terminal determines a virtual experience scene according to the expression characteristic information, the emotion fluctuation signal and the emotion healing strategy.
10. The emotional state adjustment method of claim 9, wherein the determining an emotion healing strategy according to the expression characteristic information and the emotion score comprises:
the mobile terminal determines the emotion state grade of the current user according to the expression characteristic information and the emotion score;
and the mobile terminal determines an emotion healing strategy according to the emotion state grade.
CN202111017756.3A 2021-08-31 2021-08-31 Emotion state adjustment system and method Active CN113633870B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111017756.3A CN113633870B (en) 2021-08-31 2021-08-31 Emotion state adjustment system and method


Publications (2)

Publication Number Publication Date
CN113633870A true CN113633870A (en) 2021-11-12
CN113633870B CN113633870B (en) 2024-01-23

Family

ID=78424822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111017756.3A Active CN113633870B (en) 2021-08-31 2021-08-31 Emotion state adjustment system and method

Country Status (1)

Country Link
CN (1) CN113633870B (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6425764B1 (en) * 1997-06-09 2002-07-30 Ralph J. Lamson Virtual reality immersion therapy for treating psychological, psychiatric, medical, educational and self-help problems
CN103654798A (en) * 2013-12-11 2014-03-26 四川大学华西医院 Emotion monitoring and recording method and device
CN106620990A (en) * 2016-11-24 2017-05-10 深圳创达云睿智能科技有限公司 Method and device for monitoring mood
CN106955113A (en) * 2017-05-10 2017-07-18 安徽徽韵心理咨询有限公司 A kind of intelligent hug guiding instrument of reaction type
CN109271018A (en) * 2018-08-21 2019-01-25 北京光年无限科技有限公司 Exchange method and system based on visual human's behavioral standard
CN109571494A (en) * 2018-11-23 2019-04-05 北京工业大学 Emotion identification method, apparatus and pet robot
CN109620185A (en) * 2019-01-31 2019-04-16 山东大学 Autism auxiliary diagnosis system, device and medium based on multi-modal information
CN109683709A (en) * 2018-12-17 2019-04-26 苏州思必驰信息科技有限公司 Man-machine interaction method and system based on Emotion identification
CN110404148A (en) * 2019-08-07 2019-11-05 上海市精神卫生中心(上海市心理咨询培训中心) A kind of self-closing disease mood based on virtual reality technology pacifies method and system
CN111531552A (en) * 2020-03-13 2020-08-14 华南理工大学 Psychological accompanying robot and emotion support method
US20200349337A1 (en) * 2019-05-01 2020-11-05 Accenture Global Solutions Limited Emotion sensing artificial intelligence
KR102280722B1 (en) * 2020-02-20 2021-07-22 공영일 Healing Chair Device for Mental Health Care Service


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116594511A (en) * 2023-07-17 2023-08-15 天安星控(北京)科技有限责任公司 Scene experience method and device based on virtual reality, computer equipment and medium
CN116594511B (en) * 2023-07-17 2023-11-07 天安星控(北京)科技有限责任公司 Scene experience method and device based on virtual reality, computer equipment and medium

Also Published As

Publication number Publication date
CN113633870B (en) 2024-01-23

Similar Documents

Publication Publication Date Title
US11656680B2 (en) Technique for controlling virtual image generation system using emotional states of user
TWI658377B (en) Robot assisted interaction system and method thereof
US12067891B2 (en) Information processing device, information processing method, and program
CN111063416A (en) Alzheimer disease rehabilitation training and capability assessment system based on virtual reality
JP2005237561A (en) Information processing device and method
CN113556603B (en) Method and device for adjusting video playing effect and electronic equipment
CN114514563A (en) Creating optimal work, learning, and rest environments on electronic devices
JP2018180503A (en) Public speaking assistance device and program
CN111442464B (en) Air conditioner and control method thereof
JP2020173787A (en) Information processing apparatus, information processing system, information processing method, and information processing program
CN113633870A (en) Emotional state adjusting system and method
CN111654752B (en) Multimedia information playing method and device, electronic equipment and storage medium
JP2011206471A (en) Game device, control method of game device, and program
CN116133594A (en) Sound-based attention state assessment
US11403289B2 (en) Systems and methods to facilitate bi-directional artificial intelligence communications
WO2020175969A1 (en) Emotion recognition apparatus and emotion recognition method
JP5629364B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
CN116997880A (en) Attention detection
CN113808281A (en) Method, system, device and storage medium for generating virtual sprite image of automobile
JP3848076B2 (en) Virtual biological system and pattern learning method in virtual biological system
US20240177406A1 (en) Interactive wearable electronic device with facial expression and face recognition and prompting functions and processing method for interactive wearable electronic device
JP6963669B1 (en) Solution providing system and mobile terminal
JP4355823B2 (en) Information processing device for facial expressions
TW202424697A (en) Virtual reality interactive system for expression simulation and emotions training by using virtual reality Achieving effects of assisting autistic patients in cognition and training related to emotions and expressions
CN114067410A (en) Table game interaction system based on artificial intelligence

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant