CN113288144A - Emotion state display terminal and method based on emotion guidance - Google Patents

Emotion state display terminal and method based on emotion guidance

Info

Publication number
CN113288144A
CN113288144A (application CN202110568055.2A)
Authority
CN
China
Prior art keywords
emotion
emotional state
user
information
actual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110568055.2A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Haode Translation Information Technology Co ltd
Original Assignee
Chongqing Haode Translation Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Haode Translation Information Technology Co ltd filed Critical Chongqing Haode Translation Information Technology Co ltd
Priority to CN202110568055.2A priority Critical patent/CN113288144A/en
Publication of CN113288144A publication Critical patent/CN113288144A/en
Withdrawn legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
        • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
                    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
                    • A61B 5/0002: Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
                        • A61B 5/0015: characterised by features of the telemetry system
                            • A61B 5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
                    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
                        • A61B 5/7235: Details of waveform analysis
                            • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
                        • A61B 5/7271: Specific aspects of physiological measurement analysis
                    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
                        • A61B 5/742: using visual displays
                        • A61B 5/7455: characterised by tactile indication, e.g. vibration or electrical stimulation
                        • A61B 5/746: Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
                • G06N 3/00: Computing arrangements based on biological models
                    • G06N 3/02: Neural networks
                        • G06N 3/04: Architecture, e.g. interconnection topology
                            • G06N 3/045: Combinations of networks

Abstract

The invention relates to an emotional state display terminal based on emotion guidance, in which at least one mobile terminal automatically collects actual emotional state information and/or receives it as user input. The mobile terminal is further configured to store and display the external condition information, the body signal information, the actual emotional state information and the feeling information in a manner in which the mapping associations among external conditions, body signals, actual emotional states and feelings gradually increase, wherein each external condition is mapped to at least one body signal, each body signal is mapped to at least one actual emotional state, and each actual emotional state is mapped to at least one feeling. The invention can accurately collect and display the user's emotional-state trend via the cloud server, purposefully guide the user's emotional changes, and effectively guide an actor's emotions.

Description

Emotion state display terminal and method based on emotion guidance
This application is a divisional application of the invention patent entitled "Emotion control device and method", application number 201810106194.1, filed on 2 February 2018.
Technical Field
The invention relates to the technical field of display terminals, in particular to an emotional state display terminal and method based on emotion guidance.
Background
Emotion and feeling are a pair of interacting psychological states that transform into and oppose each other. Keeping the emotion curve flat helps maintain physical and psychological health. Feelings are produced when the user is stimulated by external conditions; through these feelings the user undergoes psychological emotional changes, and the emotions are in turn reflected in body signals.
Emotion is a form of energy within the body, and characteristic signals associated with a given emotion can be detected by body-surface measurements, for example at the chest or wrist. However, emotion-related feature data are highly complex, and because of individual differences the directivity of these data is unclear. Guiding and regulating a person's emotions on the basis of such emotion signals alone therefore often produces the opposite effect.
CN104939810A provides a method and an apparatus for controlling emotion, the method comprising: A) obtaining a pulse frequency; B) when the pulse frequency exceeds a preset value, acquiring sound information; C) detecting the sound intensity and speech tempo in the sound information; D) judging from the sound intensity and tempo whether the user is in an emotional runaway state; E) when the emotion is out of control, stopping sound acquisition and playing a preset audio. These steps automatically detect an emotional runaway and respond according to the state. Although this scheme can detect the runaway and prompt the relevant personnel to control their emotion, the control occurs only after the runaway state has begun; once a person is caught in an emotional runaway, outcomes that are difficult to control follow, and the expected technical effect cannot be achieved.
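The staged detection logic of CN104939810A (steps A through E above) can be sketched roughly as follows; the threshold values, units and return strings are illustrative assumptions, not taken from that patent.

```python
def monitor_emotion(pulse_hz, sound_intensity_db, tempo_bpm,
                    pulse_threshold=1.6, intensity_threshold=75.0,
                    tempo_threshold=140.0):
    """Staged runaway check: sound is only analyzed once the pulse is elevated.

    All threshold values here are illustrative assumptions.
    """
    # Step B: only proceed to sound analysis when pulse exceeds the preset value.
    if pulse_hz <= pulse_threshold:
        return "normal"
    # Steps C-D: judge runaway from sound intensity and speech tempo together.
    if sound_intensity_db > intensity_threshold and tempo_bpm > tempo_threshold:
        # Step E: stop acquisition and play a preset calming audio (stubbed here).
        return "runaway: play preset audio"
    return "elevated"
```

The point criticized in the text is visible in the sketch: the calming audio is triggered only after the runaway condition is already satisfied.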
CN107456218A discloses an emotion sensing system and a wearable device. The system includes a chest node that can detect heart rate and respiration signals and/or a wrist node that can detect pulse and blood-oxygen signals, a host computer, and a cloud server. The chest and/or wrist nodes connect wirelessly to the host computer, which connects wirelessly to the cloud server; the signals collected by the nodes are uploaded through the host computer to the cloud server, which, based on a preset database, uses an emotion analysis algorithm to determine the emotional state corresponding to the collected signals and feeds it back to the host computer. The system can sense the user's emotional state in a timely and reliable way, help the user manage and adjust emotions, and publish results on a social network to enhance emotion-based social contact and entertainment. However, this scheme only provides a specific means of signal acquisition and introduces its CNN and LSTM emotion analysis models only in very general terms; although the user obtains a computed emotional state, even the best current emotion-computation results differ greatly from actual emotions and cannot cope with complicated individual variation, so this scheme cannot in practice guide and manage the user's emotions.
CN107485402A discloses an emotion monitoring device and system. The device includes: a receiving module for receiving state information, sent by a first terminal bound to the emotion monitoring device, that indicates the emotional state of a first user; and a control module for presenting information reflecting the first user's emotional state according to the state information. This invention lets a second user grasp the emotion of the first user in real time, but it provides no solution for guiding and managing the first user's emotion.
CN107025371A discloses a method and a system for dynamically monitoring and managing emotions, which comprises the steps of emotion monitoring, early warning reminding, emotion adjusting and training and a data storage center.
CN107007291A discloses a system and method for recognizing the intensity of tension based on multiple physiological parameters, comprising off-line training and on-line monitoring. The off-line training includes inducing tension in a user, collecting multiple physiological signals of the user, and signal processing; the signal processing includes preprocessing, feature extraction and pattern recognition. The preprocessing comprises suppressing power-frequency interference in the electroencephalogram signal with an adaptive filter, removing power-frequency interference from the amplified electrocardiogram, respiration and electrodermal signals with a band-pass filter, and intercepting effective data with an information-processing toolkit. The beneficial effects of that invention are: collecting central-nervous and autonomic-nervous signals that reflect the state of the human nervous system, training off line to establish a cross-person or individual-specific classification model, using that model to identify and detect the intensity of the user's tension in real time, issuing an early warning when the intensity is too high, and storing the user's emotional physiological signals throughout the process.
CN107464188A discloses an internet social application system based on Internet-of-Things emotion perception technology, comprising a hardware acquisition module and a social APP. The hardware acquisition module measures and records the user's heart rate and body-surface temperature in real time and sends the data to a mobile terminal. The social APP installed on the mobile terminal has a basic chat function with an embedded emotion-perception algorithm; after being correctly paired with the hardware acquisition module, it receives the uploaded data in real time, analyzes the emotion of the user currently wearing the hardware, and conveys the perceived emotion within the APP through color changes of an identifying figure or through vibration, realizing emotion perception and interaction. Addressing the lack of scientific, real-time measurement of users' emotional changes in current internet social software, this scheme integrates IoT emotion perception into internet social applications, fills the blind spot of ignored emotional interaction, increases interactive experience in internet socializing and enriches social quality.
None of the above prior art proposes an emotional state display terminal based on emotional guidance.
Disclosure of Invention
The present invention stems from the observation that no one can accurately describe the psychological feeling of a given mood, whether sadness, joy or suffering. Although methods such as brain-wave scanning and feature-signal detection exist, and a large number of artificial-intelligence algorithms such as CNN (convolutional neural network), RNN (recurrent neural network) and DNN (deep neural network) are used to construct emotion models, their adaptation to any given individual remains poor.
From the psychologist's perspective, emotion is manageable, or at least guidable. Such management and guidance require scientific methods and devices to achieve the desired results. The object of the invention is therefore to provide a device and a method by which the user himself calibrates and records emotions and trains emotion control in a targeted way. According to the invention, on the one hand the training of the cloud server's emotion analysis algorithm is completed through a teaching process; on the other hand, the user can continuously adjust the difference between the actual and theoretical emotional states, so that the device gradually acquires the user's emotional characteristics and effectively manages the user's emotions.
An emotion control device comprises at least: a mobile terminal for automatically collecting and/or receiving user input of actual emotional state information; a detector for acquiring body signals in indirect or direct contact with the user's body, the body signals being sent to the cloud server via the mobile terminal; and a cloud server for determining, based on a preset database and an emotion analysis algorithm, the theoretical emotional state information corresponding to the body signals. The cloud server analyzes the body signals collected by the detector, determines the corresponding theoretical emotional state, and feeds it back to the mobile terminal; the cloud server or the mobile terminal analyzes the trend of the actual emotional state based on the feelings the user inputs through the mobile terminal, the actual emotional state information and/or the external condition information, and the mobile terminal sends an early-warning prompt to the current user when the trend of the actual emotional state would exceed a critical value at a foreseeable time point. By comprehensively analyzing external conditions, emotions, feelings and body signals and recording their correlations, the invention can determine the trend of the user's actual emotional state and adjust the emotion while it is drifting toward an extreme, so that the emotional state is maintained where required. In addition, the emotional trend can be shared through the cloud server, so that partners can avoid contact and conversation at times when an emotional outburst is likely, reducing the spread of negative emotion and promoting a happier life.
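The early-warning idea above, namely alerting before the actual emotional-state trend crosses a critical value at a foreseeable time point, could be sketched as a simple extrapolation. The 0-to-1 intensity scale, the linear fit over the last two samples, and the threshold and horizon values are all illustrative assumptions, not the patent's specified algorithm.

```python
def warn_if_trend_exceeds(history, critical=0.9, horizon=3.0):
    """history: list of (time, score) samples of the actual emotional state,
    where score in [0, 1] is an assumed intensity scale.

    Returns True when a linear extrapolation of the most recent trend would
    cross `critical` within `horizon` time units (the foreseeable time point).
    """
    if len(history) < 2:
        return False  # not enough samples to estimate a trend
    (t0, s0), (t1, s1) = history[-2], history[-1]
    if t1 == t0:
        return False
    slope = (s1 - s0) / (t1 - t0)
    projected = s1 + slope * horizon  # value at the foreseeable time point
    return projected > critical
```

On a warning, the mobile terminal would issue its prompt (color change, sound or vibration, as described later in the text).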
According to a preferred embodiment, the cloud server completes its teaching process by analyzing the actual emotional state information, body signals and/or external conditions input by the user, and pre-configures the parameters of the emotion analysis algorithm according to at least two extreme emotional states determined during teaching. The feeling information input by the user and the changes in external conditions are used to teach the cloud server, allowing it to determine the emotion analysis algorithm. Once the algorithm is determined, the actual emotional state can be analyzed accurately from external conditions and body signals, improving the probability of correctly analyzing an individual's actual emotional state and supporting a personalized emotion analysis algorithm.
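In its simplest conceivable form, pre-configuring parameters from two extreme emotional states could anchor a personal scale between the two extremes. The linear mapping, the 0-to-1 scale and the clipping below are illustrative assumptions about how such a calibration might look, not the patent's actual algorithm.

```python
def calibrate(extreme_low, extreme_high):
    """Given a body-signal feature value recorded at each of two extreme
    emotional states during teaching (e.g. calmest and most agitated),
    return a function mapping a raw feature onto a personal 0-1 scale.
    The linear form is an illustrative assumption."""
    span = extreme_high - extreme_low
    def score(raw):
        # Clip so readings beyond the taught extremes stay on the scale.
        return min(1.0, max(0.0, (raw - extreme_low) / span))
    return score
```

For example, a user whose heart rate ran 60 bpm at the calm extreme and 120 bpm at the agitated extreme would get a personalized scorer via `calibrate(60, 120)`.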
According to a preferred embodiment, the mobile terminal stores the feelings input by the user as text, voice, video and/or graphics together with the corresponding automatically acquired actual emotional states and provides them to the cloud server in associated form; or it records the external conditions that caused the user's actual emotional states and stores or provides them to the cloud server in association with those states. The cloud server analyzes the correlation between the user's specific emotional states and the external conditions and, based on this correlation, issues warnings about the causes of specific emotional states. Once the correlation between the actual emotional state and the external conditions and body signals has been determined, guidance can be given while the actual emotional state is shifting toward a specific emotional state, so that the emotion is redirected quickly, preventing the user's emotion from turning extreme or the user from performing poorly under an extreme emotion.
According to a preferred embodiment, the cloud server stores the current emotional state information input by the user and the external condition related to the actual emotional state information provided by the mobile terminal in association with each other, and the mobile terminal is configured so that the user can retrieve the actual emotional state information stored in the cloud server and/or the mobile terminal by external condition. Associating external conditions, body signals and emotional states lets the user search his emotional history when needed and quickly learn his emotional state under a specified external condition. It also helps the cloud server collect big data and understand how external conditions stimulate the emotional states of a population. In particular, actors who need emotional guidance and stimulation can retrieve relevant emotions by external condition or scene and so trigger an emotional outburst, which helps improve their acting skills.
According to a preferred embodiment, the cloud server corrects the theoretical emotional state, determined from analysis of the body signals collected by the detector, according to the actual emotional state information, and an emotion-management profile formed from the corrected theoretical states is stored in the mobile terminal so that it can be retrieved by body signal. Emotion is personal and differs with individual character; even under the same external conditions, different people produce different emotions, so personalized emotional state analysis is essential. Retrieval by body signal lets the user look up his recorded emotional state information through body-signal features. For experts or academic institutions studying emotion, this can provide a large number of samples correlating the body with the patient's emotions. Moreover, actors can learn, by retrieving body signals, the emotions and feelings of a person in physical pain, guiding their own emotions toward a truly expressive performance.
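One minimal way to realize a profile of corrected theoretical states retrievable by body signal is a mapping keyed on a discretized signal value. The class name, the rounding-based discretization and the override rule below are illustrative assumptions.

```python
class EmotionProfile:
    """Emotion-management profile: corrected theoretical states stored so
    they can later be retrieved by body signal. Discretizing the signal by
    rounding is an illustrative assumption."""

    def __init__(self):
        self._profile = {}

    def correct(self, body_signal, theoretical_state, actual_state):
        # The user's reported actual state overrides the algorithm's
        # theoretical one; fall back to the theoretical state otherwise.
        key = round(body_signal)
        self._profile[key] = actual_state if actual_state else theoretical_state

    def lookup(self, body_signal):
        """Retrieve the corrected state for a (nearby) body-signal value."""
        return self._profile.get(round(body_signal))
```

Note that `round` buckets nearby readings together, so a later reading of, say, 85.4 retrieves the state corrected at 85.2.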
According to a preferred embodiment, during the teaching process the mobile terminal applies stimulation information capable of inducing emotion to the user, and the actual emotional state information corresponding to the stimulation is automatically collected and/or input by the user, while the detector collects the user's body signals corresponding to the stimulation; the cloud server pre-configures the parameters of the emotion analysis algorithm based on at least two kinds of actual emotional state information and the body signals gathered during teaching, and corrects the theoretical emotional state information corresponding to the body signals. Applying stimulation information makes it possible to obtain the user's corresponding emotional state and body signals accurately, and thus to teach the cloud server accurately. Preferably, the teaching process can preferentially capture the trends of low-probability extreme emotions, perfecting the cloud server's emotional state information.
According to a preferred embodiment, the step of inputting the actual emotional state into the mobile terminal comprises: the user selects the type and level of the current emotion by tapping, and/or the user inputs his or her actual emotional state as text, voice, video or graphics. Selecting the emotion level by tapping makes it possible to record changes in the user's emotional state, including different changes under the same external condition.
According to a preferred embodiment, the mobile terminal stores and displays the external condition information, the body signal information, the actual emotional state information and the feeling information in a manner in which the mapping associations among external conditions, body signals, actual emotional states and feelings gradually increase, wherein each external condition is mapped to at least one body signal, each body signal is mapped to at least one actual emotional state, and each actual emotional state is mapped to at least one feeling. Displaying external conditions, emotional states, body signals and feelings in association helps the user observe changes in his emotional state and adjust negative emotions.
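The gradually increasing one-to-many mapping chain described above (condition, then body signals, then actual states, then feelings) amounts to a nested one-to-many structure. The sketch below shows one possible in-memory layout; the record contents and the helper name are illustrative assumptions.

```python
# One possible layout for the mapping chain: each external condition maps
# to one or more body signals, each body signal to one or more actual
# emotional states, and each state to one or more feelings the user input.
# All record contents here are illustrative.
records = {
    "crowded subway": {                # external condition
        "heart rate 110": {            # associated body signal
            "anxiety (level 2)": [     # associated actual emotional state
                "chest feels tight",   # feelings input by the user
                "want to get off",
            ],
        },
    },
}

def feelings_for(condition):
    """Walk the mapping chain and collect every feeling reachable from
    one external condition."""
    out = []
    for states in records.get(condition, {}).values():
        for feels in states.values():
            out.extend(feels)
    return out
```

Because every level is one-to-many, the number of associated records grows at each step of the chain, which is what "gradually increasing mapping associations" describes.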
According to a preferred embodiment, the cloud server records the actual emotional state and the theoretical emotional state based on an iterative analysis of at least one external condition, body signal and/or feeling at each moment, and the mobile terminal displays changes in external conditions, body signals, feelings and the trend of the actual emotional state using mapping-associated identifiers, reminds the user of extreme trends in the actual emotional state through color changes, sound and/or vibration, and/or displays a recommendation to change the actual emotional state. Displaying the emotional trend through color changes, sound and/or vibration helps the user judge the trend and understand how strongly external conditions are affecting him. Preferably, for actors, this helps judge whether their emotional state fits the intended scene, that is, whether their emotion is in place, improving the expression of their skill.
The invention also provides an emotion control method, comprising at least: automatically collecting and/or receiving user input of actual emotional state information; acquiring body signals in indirect or direct contact with the user's body, and determining, based on a preset database and an emotion analysis algorithm, the theoretical emotional state information corresponding to the body signals; analyzing and determining the theoretical emotional state corresponding to the body signals collected by the detector and feeding it back to the mobile terminal; analyzing the trend of the actual emotional state based on the feelings input by the user through the mobile terminal, the actual emotional state information and/or the external condition information; and, when the trend of the actual emotional state would exceed a critical value at a foreseeable time point, having the mobile terminal send an early-warning prompt to the current user.
The invention provides an emotional state display terminal based on emotion guidance, in which at least one mobile terminal automatically collects and/or receives user input of actual emotional state information; the mobile terminal is further configured to:
store and display the external condition information, the body signal information, the actual emotional state information and the feeling information in a manner in which the mapping associations among external conditions, body signals, actual emotional states and feelings gradually increase, wherein each external condition is mapped to at least one body signal, each body signal is mapped to at least one actual emotional state, and each actual emotional state is mapped to at least one feeling.
Preferably, the mobile terminal is further configured to:
in the case where the cloud server records the actual emotional state and the theoretical emotional state based on an iterative analysis of at least one external condition, body signal and/or feeling at each moment, store and display the external condition information, body signal information, actual emotional state information and feeling information in a manner in which their mapping associations gradually increase.
Preferably, the mobile terminal is further configured to:
the feeling of user input, the actual emotional state associated therewith, the external condition and the physical signal are displayed in a circular array consisting of at least two circles, wherein each circle is divided into a number of spaces for recording information.
Preferably, the mobile terminal is further configured to:
storing and displaying the feeling input by the user, the actual emotional state associated therewith, the external condition, and the body signal in a circular list consisting of four circles; the circular list comprises an outer ring mark and an inner ring mark.
Preferably, the inner ring marks comprise a first inner ring mark and a second inner ring mark whose radius is larger than that of the first; the outer ring marks comprise a first outer ring mark and a second outer ring mark whose radius is larger than that of the first. Each space of the first inner ring stores an external condition related to an actual emotional state; each space of the second inner ring stores a body feature associated with an actual emotional state; each space of the first outer ring stores the actual emotional state; and each space of the second outer ring stores the feeling input by the user.
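The four-ring circular list described above can be modeled as four parallel arrays of spaces, where space i of every ring holds one associated record. The ring names follow the description; the class name, the number of spaces and the wrap-around indexing are illustrative assumptions.

```python
class CircularList:
    """Sketch of the four-ring circular display list: each ring is divided
    into the same number of spaces, and space i across all four rings holds
    one associated (condition, body signal, state, feeling) record."""

    RINGS = ("first_inner",   # external conditions
             "second_inner",  # body features / signals
             "first_outer",   # actual emotional states
             "second_outer")  # feelings input by the user

    def __init__(self, spaces=12):
        self.spaces = spaces
        self.rings = {name: [None] * spaces for name in self.RINGS}

    def store(self, index, condition, body_signal, state, feeling):
        i = index % self.spaces  # the list wraps around like a clock face
        for ring, value in zip(self.RINGS,
                               (condition, body_signal, state, feeling)):
            self.rings[ring][i] = value

    def slice(self, index):
        """Read the associated records in one angular space."""
        i = index % self.spaces
        return tuple(self.rings[ring][i] for ring in self.RINGS)
```

Reading one angular "slice" of the display thus yields the fully associated record, which matches the mapping-association idea of the earlier embodiments.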
Preferably, the mobile terminal is further configured so that the user can retrieve the actual emotional state information stored in the cloud server and/or the mobile terminal by external condition.
Preferably, the mobile terminal displays the teaching emotion video generated by the cloud server in real time.
Preferably, during real-time teaching, the mobile terminal collects the user's emotional state information, the external conditions corresponding point by point to the time axis, and the user's feelings, and stores them under the user name; the cloud server performs theoretical emotion analysis on the emotional state information and external conditions collected by the mobile terminal, repeating the analysis many times until the teaching is finished.
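The repeat-until-finished teaching loop could be sketched as below. `BiasModel`, its update rule, the 0-to-1 score scale and the convergence tolerance are hypothetical stand-ins for the cloud server's theoretical emotion analysis, which the patent does not specify.

```python
class BiasModel:
    """Minimal stand-in for the cloud-side analysis model (hypothetical):
    prediction = body signal + bias, nudged toward the user's reports."""
    def __init__(self):
        self.bias = 0.0
    def predict(self, sig):
        return sig + self.bias
    def update(self, sig, score):
        self.bias += 0.5 * (score - self.predict(sig))

def teach(samples, model, tolerance=0.1, max_rounds=20):
    """Repeat theoretical analysis against the collected (body_signal,
    reported_score) teaching samples until predictions fall within
    `tolerance` of the user's reported states, i.e. until teaching is
    finished, or the round limit is hit."""
    for round_no in range(1, max_rounds + 1):
        worst = max(abs(model.predict(sig) - score) for sig, score in samples)
        if worst <= tolerance:
            return round_no  # teaching finished
        for sig, score in samples:
            model.update(sig, score)
    return max_rounds
```

With a single teaching sample the toy model converges in a handful of rounds, mirroring the "repeat many times until the teaching is finished" behavior described above.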
Preferably, the mobile terminal is further configured to:
in the teaching process, stimulation information capable of inducing emotion is applied to a user, and actual emotional state information corresponding to the stimulation information is automatically collected and/or input by the user.
The invention also provides an emotional state display method based on emotion guidance, comprising at least: the mobile terminal automatically collects and/or receives user input of actual emotional state information; and the external condition information, the body signal information, the actual emotional state information and the feeling information are stored and displayed in a manner in which the mapping associations among external conditions, body signals, actual emotional states and feelings gradually increase, wherein each external condition is mapped to at least one body signal, each body signal is mapped to at least one actual emotional state, and each actual emotional state is mapped to at least one feeling.
The invention has the beneficial technical effects that:
the emotion control device and the emotion control method can perform personalized analysis on the emotion state of the user, and improve accurate judgment on individual emotion state change. In addition, the cloud platform can be used for carrying out big data sharing on the emotional state data in various industries, and the cloud platform is beneficial to analyzing the influence of the change of external conditions on the emotional state of the group by government departments. For actors, negotiators, doctors and other professionals who need emotion adjustment, the invention can help them adjust their emotional state or guide their emotional change to an emotional state suitable for work. In particular, a good emotional state is beneficial to not only work but also life, avoids physical damage of the user due to negative emotions, and prolongs the life. The emotion control device and the emotion control method are beneficial to the happy work and life of people.
Drawings
FIG. 1 is a schematic illustration of a mobile terminal of the present invention;
fig. 2 is a schematic diagram of the logical structure of the present invention.
List of reference numerals
10: the mobile terminal 20: detector
30: cloud server 31: database with a plurality of databases
Detailed Description
The following detailed description is made with reference to the accompanying drawings.
Example 1
As shown in fig. 2, the present invention provides an emotion control apparatus. The emotion control apparatus of the present invention includes at least a mobile terminal 10, a detector 20, and a cloud server 30. The mobile terminal 10 is used to automatically collect actual emotional state information and/or to receive it as user input. Preferably, the mobile terminal 10 comprises an image or video capture device. The mobile terminal can be an intelligent terminal, such as a notebook computer, a mobile phone, a smart bracelet or a smart watch, and can also be a camera device. Preferably, the camera device includes a general camera and a night-vision camera. Preferably, automatic acquisition means automatically capturing the user's facial expressions and micro-expressions. Alternatively, the user inputs a picture or video of his own expression, or an emotion described in text, into the mobile terminal.
The detector of the present invention is used to acquire body signals in indirect or direct contact with the user's body. The detector comprises a plurality of modules for collecting human physiological signals. The detection modules of the detector include at least one or more of a pulse sensor, a heartbeat sensor, a blood pressure sensor, a respiratory-rate sensor, a sound collection module, a hand-vibration module and a step-rate sensor. Preferably, a microwave signal sensor for detecting changes in brain waves can be added to the detector as required. An electrode arranged on the head can also be added to the detector as required. Preferably, the body signals collected by the detector 20 are sent to the cloud server 30 via the mobile terminal 10.
The cloud server 30 of the present invention is configured to determine theoretical emotional state information corresponding to the body signals by using an emotion analysis algorithm based on a preset database. Preferably, the cloud server 30 is provided with a database 31 storing a plurality of emotion analysis algorithms. The cloud server 30 analyzes and determines the theoretical emotional state corresponding to the body signals collected by the detector 20 and feeds it back to the mobile terminal 10. Preferably, the emotion analysis algorithms include machine-learning algorithms such as Bayesian classification, neural networks, support vector machines, decision trees, case-based reasoning and association-rule learning.
For example: the method steps of the teaching process include:
s1: starting a teaching mode;
s2: user selection: the teaching object is selected according to the user name. If the teaching object is a user already recorded in the data storage module, it can be selected directly; if the teaching object is a user not yet entered, the user's emotion information is first input through the mobile terminal and the user is then selected. The emotion information of the user includes the user name, a picture of the user's face, sample emotions of the user, and the external conditions and feelings associated with those emotions;
s3: sample selection: if the sample emotion video and the accompanying teaching content for this session are already stored under the user name, they are selected directly; if not, the teaching content for this session is read in through the mobile terminal, the sample emotion video and accompanying content are stored under the user name, and the stored sample emotion video and content are then selected;
s4: emotion transformation: the sample emotion video selected in step S3 is called, and emotion evolution is performed on the basis of the user's face picture, so that the sample emotion is reproduced on the user's face picture in video form, generating a teaching emotion video;
s5: real-time teaching: the mobile terminal displays the teaching emotion video generated in step S4 and the teaching content selected in step S3 in real time for teaching. For example, when the user is in a dysphoric (restless) state, the user inputs this information through the mobile terminal. The mobile terminal collects the user's dysphoric expressions at that moment, the external conditions corresponding one-to-one to the time axis, and the user's feelings, and stores them under the user name. The cloud server performs theoretical emotion analysis on the dysphoric expressions and external conditions collected by the mobile terminal, and repeats this analysis until the teaching is finished;
s6: effect evaluation: the cloud server compares the emotion obtained by theoretical emotion analysis with the emotion information input by the user during teaching, corrects the emotion analysis algorithm accordingly, and records the external conditions and feelings associated with the emotion at that time.
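The flow of steps S1–S6 can be sketched as a small control loop. The sketch below is purely illustrative: the class name `TeachingSession`, its fields, and the `analyze`/`correct` callbacks are hypothetical stand-ins for the disclosed modules, not part of the patent's implementation.

```python
# Minimal sketch of the S1-S6 teaching flow; all names are illustrative.
class TeachingSession:
    def __init__(self):
        self.users = {}    # user name -> recorded emotion profile (S2)
        self.samples = {}  # user name -> stored sample video/content (S3)

    def select_user(self, name, profile=None):
        # S2: pick an existing user, or enter a new one first.
        if name not in self.users:
            if profile is None:
                raise ValueError("new user must be entered with a profile first")
            self.users[name] = profile
        return self.users[name]

    def select_sample(self, name, sample=None):
        # S3: reuse a stored sample, or read in and store a new one.
        if name not in self.samples:
            if sample is None:
                raise ValueError("no stored sample; provide one to read in")
            self.samples[name] = sample
        return self.samples[name]

    def teach(self, name, analyze, correct):
        # S4-S6: generate a teaching emotion video from the user's face
        # picture and the sample, analyze it, then let the cloud side
        # correct its emotion analysis against the user's recorded emotion.
        teaching_video = ("evolved", self.users[name]["face"],
                          self.samples[name])                       # S4
        theoretical = analyze(teaching_video)                       # S5
        return correct(theoretical, self.users[name]["emotion"])    # S6
```

A session would call `select_user`, `select_sample`, then `teach` with the cloud server's analysis and correction routines supplied as callbacks.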
One such theoretical emotion analysis algorithm is described below. The emotion perception algorithm realizes emotion perception on the basis of a multi-class support vector machine. The support vector machine model is a maximum-margin linear classifier defined on the feature space; its training can be converted into a convex quadratic programming problem, and the classification result is obtained by solving this optimization problem. In the teaching process, a result is considered acceptable as long as it lies within a range ε of the true value Y, and the optimization problem can be expressed as:

$$\min_{w,\,b,\,\xi,\,\xi^*}\ \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\left(\xi_i + \xi_i^{*}\right)$$

$$\text{s.t.}\quad y_i - w^{T}\varphi(x_i) - b \le \varepsilon + \xi_i,\qquad w^{T}\varphi(x_i) + b - y_i \le \varepsilon + \xi_i^{*}$$

$$\xi_i \ge 0,\quad \xi_i^{*} \ge 0,\quad i = 1,\dots,n$$

The method uses a nonlinear mapping φ to map the teaching samples from the low-dimensional input space into a high-dimensional space F, where a linear classifier separates them. Here w denotes the weight vector, w^T its transpose, and b a linear threshold (bias) parameter; these parameters are obtained through the teaching process.
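The ε-tolerance criterion above — a prediction counts as acceptable whenever it lies within ε of the true value Y — corresponds to the ε-insensitive loss used in support vector regression. The function below is a minimal sketch of that loss; the function name and numeric values are illustrative, not from the disclosure.

```python
def epsilon_insensitive_loss(prediction: float, y: float, eps: float) -> float:
    """Loss is zero whenever |prediction - y| <= eps, and grows
    linearly with the excess deviation beyond the eps tube."""
    return max(0.0, abs(prediction - y) - eps)
```

A prediction of 1.0 against a true value of 1.05 with ε = 0.1 therefore incurs no loss, while larger deviations are penalized in proportion to how far they exceed the tube.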
Preferably, the cloud server 30 completes a teaching process based on the user's operation of the mobile terminal 10. The cloud server 30 requires the user's cooperation to complete the teaching process; teaching here means training the artificial intelligence program by example. When the user's actual emotional state, body signals and external conditions are stored in the cloud server in an associated manner, the cloud server 30 writes the association between them and the emotion analysis algorithm into memory through a teaching process, thereby forming an emotion analysis algorithm personalized to the user. The more the user teaches the cloud server 30 through the mobile terminal 10, the more accurate the emotion analysis algorithm of the cloud server 30 becomes. For example, the weight parameters of the various kinds of information are adjusted according to sample information input by the user many times, so that the theoretical emotional state produced by the emotion analysis algorithm matches, or closely approximates, the user's actual emotional state.
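The repeated weight adjustment described above can be sketched as a simple online update that nudges per-feature weights toward agreement with the user's reported state. The gradient-style rule and the learning-rate value below are assumptions for illustration; the patent does not specify the update scheme.

```python
def update_weights(weights, features, theoretical, actual, lr=0.1):
    """Nudge per-feature weights so that the predicted (theoretical)
    emotion score moves toward the user's actual reported score.
    The larger the disagreement, the larger the correction."""
    error = actual - theoretical
    return [w + lr * error * x for w, x in zip(weights, features)]
```

Applied once per teaching sample, repeated sessions drive the theoretical output toward the user's actual reports, which is the personalization effect the text describes.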
The cloud server 30 pre-configures parameters of an emotion analysis algorithm according to at least two extreme emotional state information determined by the teaching process.
Preferably, in the teaching process, the mobile terminal 10 applies stimulation information capable of inducing emotion to the user, and automatically collects and/or inputs actual emotional state information corresponding to the stimulation information by the user. Meanwhile, the detector 20 collects body signals of the user corresponding to the stimulation information. And the cloud server pre-configures parameters of an emotion analysis algorithm based on at least two actual emotional state information and the body signal in a teaching process. Preferably, the cloud server 30 corrects theoretical emotional state information corresponding to the body signal based on the actual emotional state.
Preferably, the stimulation information includes video, pictures, text information, sound and other information capable of inducing emotional reactions of the user. The text information includes text such as a sad scene description, a happy scene description, a warm scene description, a joke, etc., which can cause a user to react. In particular, the stimulus information further includes stimulus information capable of evoking extreme emotions of the user, so as to record the extreme emotions of the user.
Preferably, the cloud server 30 also forms a personalized emotion change curve based on the trend of the actual emotional state during teaching. For example, an emotion change may run: slightly happy → unhappy → calm → sad. Under normal conditions, a user cannot pass instantaneously from a very happy state to a sad state.
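Because emotions normally change gradually, the personalized change curve can be represented as an ordered scale on which only moves between adjacent states are plausible. The sketch below illustrates this; the scale entries and the one-step adjacency assumption are hypothetical simplifications, not the disclosed model.

```python
# Hypothetical ordered emotion scale, from most negative to most positive.
SCALE = ["sad", "calm", "unhappy", "slightly happy", "very happy"]

def is_gradual(curve, scale):
    """A change curve is plausible if every step moves at most one
    position along the ordered emotion scale (no sudden jumps)."""
    idx = [scale.index(state) for state in curve]
    return all(abs(b - a) <= 1 for a, b in zip(idx, idx[1:]))
```

Under this encoding, the example curve from the text passes the check, while an instantaneous jump from "very happy" to "sad" fails it.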
By continuously analyzing the theoretical emotional state of the current user, the emotion control apparatus of the present invention can determine the trend of the emotional state by means of the cloud server 30. For example, suppose analysis of the body signals collected by the detector 20 shows that the current user's emotion is changing from mild to intense with no sign of easing. If the mobile terminal 10 infers from the theoretical emotional state model that the user is tending toward anger, it prompts the user before the anger erupts (for example, when the anger level is 4 and still rising), in the form of sound, text, graphics or video specified by the user, helping the user relieve the tension or anger in time without an emotional outburst.
Preferably, when the mobile terminal 10 analyzes the actual emotional state information over at least one continuous time interval and confirms that the trend of the current user's actual emotional state will exceed a critical value at a foreseeable time point, and/or when the cloud server confirms that the trend of the current user's theoretical emotional state will exceed a critical value at a foreseeable time point, the cloud server sends an early-warning prompt to the current user through the mobile terminal. Specifically, the mobile terminal 10 or the cloud server 30 analyzes the trend of the current user's actual emotional state and confirms that it will exceed the critical value at a foreseeable time point. At that moment, the cloud server 30 sends prompts such as vibration, sound and color changes to the user through the mobile terminal 10 to remind the user to control the emotion.
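The "foreseeable time point" check can be sketched as a linear extrapolation of recent emotion-level samples: estimate the recent slope and compute when the level would cross the critical value. The fixed window and purely linear trend model below are assumptions for illustration; the patent does not specify the forecasting method.

```python
import math

def steps_until_threshold(levels, threshold):
    """Extrapolate the average recent slope of the emotion-level series;
    return the number of future steps until the level would exceed the
    threshold, 0 if it is already at or past it, or None when the trend
    is flat or decreasing (no crossing foreseeable)."""
    if len(levels) < 2:
        return None
    slope = (levels[-1] - levels[0]) / (len(levels) - 1)
    remaining = threshold - levels[-1]
    if remaining <= 0:
        return 0
    if slope <= 0:
        return None
    return math.ceil(remaining / slope)
```

An early-warning prompt would be issued whenever the returned step count falls within the warning horizon; a `None` result means no prompt is needed.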
Preferably, the mobile terminal 10 stores, or provides to the cloud server 30, the feelings input by the user in text, voice, video and/or graphic form together with the corresponding automatically collected actual emotional states, in an associated manner. Alternatively, the mobile terminal 10 records the external conditions that caused the user's actual emotional states and stores them, or provides them to the cloud server 30, in association with the corresponding actual emotional states. The cloud server 30 analyzes the correlation between a specific emotional state of the user and the external conditions, and warns of the triggering of that specific emotional state based on the correlation. Preferably, the external conditions are, for example, weather conditions, ambient noise, temperature and geographical location information.
For example: the weather is cloudy, the temperature is low, the user is in an unfamiliar location, the surroundings are quiet, the user's actual emotional state is sad, and the mobile terminal collects the body signals. The mobile terminal 10 or the cloud server 30 stores the external conditions, body signals and actual emotional state at the same moment in an associated manner, and records the emotional-state trend within a limited time before and after that actual emotional state. The cloud server 30 analyzes the correlation between the user's sad emotion and the external conditions, and warns of the triggering of sad emotion based on that correlation. Preferably, when the cloud server 30 again detects a similar emotional trend together with the related external conditions and body signals, it gives the user a prominent prompt so that the user avoids falling into extreme sadness.
Preferably, the mobile terminal 10 continuously analyzes the actual emotional state information of the current user over a period of time. If it confirms that the current user's emotion has changed from happiness level 1 to fear level 1, and then from fear level 2 toward sadness level 1, the trend does not exceed the critical value (fear level 4) at any foreseeable time point, and no prompt is given to the user. According to the foregoing preferred embodiments, the actual emotional state may be collected automatically by the mobile terminal or input manually by the user. During the teaching stage, manual input by the user predominates; after the teaching stage is finished, the mobile terminal collects the information automatically. Preferably, when the cloud server 30 confirms that the trend of the current user's theoretical emotional state will exceed a critical value at a foreseeable time point, a prompt is sent to the current user through the mobile terminal 10.
Preferably, the cloud server 30 stores the current emotional state information input by the user and the external conditions related to the actual emotional state information provided by the mobile terminal 10 in mutual association, which facilitates information retrieval. The mobile terminal 10 is arranged so that the user can retrieve the actual emotional state information stored in the cloud server 30 and/or the mobile terminal 10 by way of the related external conditions.
Emotion retrieval is an important means in the process of emotion training. The user can retrieve the actual emotional state information stored in the cloud server by searching for the body signals that trigger some negative emotion. Preferably, the actual emotional state information may be a feeling recorded by the user in text, picture and/or video form, or sound, video or image information collected by the mobile terminal 10.
Preferably, the step of inputting the actual emotional state at the mobile terminal 10 comprises: the user selects the type and level of the current emotion in a click-to-select manner, and/or the user inputs his own actual emotional state in text, voice, video or graphic form.
Preferably, the step of automatically acquiring the actual emotional state of the user by the mobile terminal 10 includes: the mobile terminal 10 collects an actual emotional state in an audio and/or video collection manner, and compares the collected actual emotional state with the emotion type and emotion level selected by the user in a click manner, thereby classifying and ranking the collected actual emotional state.
Specifically, the mobile terminal 10 can provide actual emotional state information as a search result correlated with external conditions. For example, if the external condition "cloudy-winter" or a body signal is entered and searched, several related emotional states such as "sad" and "crying" are retrieved. The user can thereby understand the influence of the external condition and take note of the present situation.
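Retrieval of stored emotional states by external condition, as in the "cloudy-winter" example above, can be sketched as an inverted index from condition keywords to recorded states. The class name and sample data below are illustrative assumptions, not the disclosed storage scheme.

```python
from collections import defaultdict

class EmotionIndex:
    """Toy inverted index: external-condition keyword -> recorded states."""

    def __init__(self):
        self._by_condition = defaultdict(list)

    def record(self, conditions, emotional_state):
        # Associate one stored emotional state with each condition keyword.
        for condition in conditions:
            self._by_condition[condition].append(emotional_state)

    def retrieve(self, condition):
        # Return every emotional state recorded under this condition.
        return self._by_condition.get(condition, [])
```

Recording a sad state under "cloudy" and "winter" makes it retrievable from either keyword, mirroring the search behavior the example describes.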
Although the cloud server 30 can analyze body signals and emotional states, such analysis is based on theoretical research, and individuals differ in character: some people like cloudy days and are in a good mood on them, while others feel low on cloudy days. A teaching process is therefore required to correct the theoretical emotional state so that the analysis of the cloud server 30 can be adapted to each individual. Preferably, the cloud server 30 corrects the theoretical emotional state determined from analysis of the body signals collected by the detector according to the actual emotional state information. An emotion management profile consisting of the corrected theoretical emotional states is stored at the mobile terminal in a manner retrievable by body signal.
Preferably, when recording actual emotional state information, the mobile terminal 10 can store the body signals related to a given emotion as a data record at the moment the user of the device feels that emotion. The data record may be voice, text, pictures, video or a combination thereof. For example, a smartphone serving as the mobile terminal 10 can separately record the time, place, weather and other external conditions that triggered the emotion.
Preferably, the corrected theoretical emotional state can be retrieved by customizable keywords. Preferably, the retrieval delivers its result together with the approximate actual emotional state. Providing the theoretical emotional state and the approximate actual emotional state together lets the user determine the desired information from the search results, improving the accuracy of information retrieval.
Preferably, the step of correcting the theoretical emotional state based on the actual emotional state comprises: comparing the actual emotional state with the theoretical emotional state and adjusting the theoretical emotional state qualitatively and/or quantitatively to generate a user emotion profile associated with the current user. Preferably, the user emotion profile also takes into account the external conditions associated with each piece of actual emotional state information. This configuration helps guide a user's negative emotion gradually toward positive emotion. Specifically, when the theoretical emotional state is adjusted, personalized conditions related to the theoretical emotional state are set in the user's emotion profile. The personalized conditions include the external conditions and body signals matched to the user. When the user is in an extreme negative emotion and needs guidance, the user's emotion profile adjusts the stimulation information of the mobile terminal, or changes the user's external conditions, on the basis of the personalized conditions related to the theoretical emotional state, so that the user's emotional state is guided to change slowly and gradually.
Preferably, the display of the mobile terminal 10 includes an external condition identifier, a body characteristic identifier, an emotional state identifier and a feeling identifier. Changes in these identifiers are associated with one another: at least one external condition qualitatively and/or quantitatively changes the body characteristics; at least one body characteristic qualitatively and/or quantitatively induces a change in the emotional state identifier; and at least one emotional state qualitatively and/or quantitatively induces a change in the user's feeling. Preferably, when the user inputs a feeling, the cloud server analyzes the actual emotional state corresponding to the external conditions, body signals and feeling sent by the mobile terminal, and displays it as the emotional state identifier. Preferably, the external condition identifier, body characteristic identifier, emotional state identifier and feeling identifier are displayed in real time at the mobile terminal and change synchronously, so that the user can simultaneously see, at the mobile terminal, how a change in one identifier causes changes in the others.
For example, the mobile terminal 10 stores and displays the external condition information, the body signal information, the actual emotional state information and the feeling information in such a manner that the mapping associations of external conditions, body signals, actual emotional states and feelings gradually increase, wherein each external condition is associated with at least one body signal mapping, each body signal is associated with at least one actual emotional state mapping, and each actual emotional state is associated with at least one feeling mapping.
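The gradually increasing mapping association — each external condition maps to at least one body signal, each body signal to at least one actual emotional state, and each state to at least one feeling — can be sketched as a three-level nested mapping. The sample entries ("cloudy", "low pulse", "sad", "lonely") are hypothetical illustrations only.

```python
# Three-level mapping: external condition -> body signals
#                      -> actual emotional states -> feelings.
mapping = {
    "cloudy": {                  # external condition
        "low pulse": {           # body signal
            "sad": ["lonely"],   # actual emotional state -> feelings
        },
    },
}

def feelings_for(mapping, condition, signal, state):
    """Walk the mapping hierarchy down to the recorded feelings,
    returning an empty list when any level is missing."""
    return mapping.get(condition, {}).get(signal, {}).get(state, [])
```

Each level can hold several children, so the number of associations grows at each step, matching the "gradually increasing" structure the text describes.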
Preferably, the cloud server records the actual emotional state and the theoretical emotional state based on an iterative analysis of at least one external condition, body signal and/or feeling at each moment. The mobile terminal 10 displays the changes in external conditions, body signals and feelings and the trend of the actual emotional state as identifiers with mapping associations, reminds the user of extreme trends in the actual emotional state by color changes, sound and/or vibration, and/or displays suggestions for changing the actual emotional state.
The mobile terminal 10 displays the feelings input by the user and the actual emotional states, external conditions and body signals associated with them in the form of a circular list composed of at least two circles, each circle being divided into a number of spaces for recording information.

As shown in fig. 1, the mobile terminal 10 stores and displays the feelings input by the user and the associated actual emotional states, external conditions and body signals in the form of a circular list consisting of four circles. The circular list includes outer-circle and inner-circle identifiers. The inner-circle identifiers comprise a first inner circle identifier 11 and a second inner circle identifier 12 whose radius is larger than that of the first, and the outer-circle identifiers comprise a first outer circle identifier 13 and a second outer circle identifier 14 whose radius is larger than that of the first.
Each space of the first inner circle identifier 11 is used to store an external condition associated with the actual emotional state. Each space of the second inner circle identifier 12 is used to store a body characteristic associated with the actual emotional state. Each space of the first outer circle identifier 13 is used to store the actual emotional state. Each space of the second outer circle identifier 14 is used to store a feeling input by the user. A change in any type of information may cause changes in the other information.
For example, when the noise index among the external conditions increases, each space of the external condition identifier records it clockwise and changes color, and each space of the body characteristic identifier records it clockwise with a corresponding color change. After the noise index rises beyond the personalized threshold, each space of the feeling identifier records the feeling clockwise with a corresponding color change; for example, the user inputs a feeling of fidgetiness at level 2. The emotional state identifier changes on the basis of the changes in external conditions and body characteristics and the feelings and their levels, so that its color changes. When the user's actual emotional state is unsuitable for work or study, the mobile terminal 10 alerts the user by sound, vibration or flashing, prompting the user to improve the emotion or guiding the emotional change. Preferably, the cloud server 30 issues advice for guiding emotional change to the user through the mobile terminal 10 in combination with the external conditions, such as exercise advice or music and song advice.
Preferably, based on the trend of the user's emotional state over a period of time, the cloud server 30 sends emotion guidance suggestions to the user through the mobile terminal 10, such as travel suggestions, visiting suggestions, parent-child activity suggestions, work-rest suggestions or even daily-routine suggestions, so as to guide the user toward a happy and healthy life and a longer lifespan.
Example 2
This embodiment is a further improvement of embodiment 1, and repeated contents are not described again.
The embodiment provides an emotion control method, comprising at least the following steps: actual emotional state information is automatically collected and/or input by the user; body signals are collected in indirect or direct contact with the user's body, and theoretical emotional state information corresponding to the body signals is determined using an emotion analysis algorithm based on a preset database. The theoretical emotional state corresponding to the body signals collected by the detector is determined by analysis and fed back to the mobile terminal. When analysis confirms that the trend of the current user's actual emotional state will exceed a critical value at a foreseeable time point, and/or that the trend of the current user's theoretical emotional state will exceed a critical value at a foreseeable time point, an early-warning prompt is sent to the current user. A teaching process of the cloud server is completed based on the user's operation of the mobile terminal, and the parameters of the emotion analysis algorithm are pre-configured according to at least two items of extreme emotional state information determined by the teaching process.
Preferably, the method further comprises: and analyzing and completing the teaching process of the cloud server based on the actual emotional state information, the body signals and/or the external conditions input by the user. And pre-configuring parameters of an emotion analysis algorithm according to at least two extreme emotional state information determined by the teaching process.
Preferably, the method further comprises: in the teaching process, the mobile terminal applies stimulation information capable of inducing emotion to a user, and the actual emotional state information corresponding to the stimulation information is automatically collected and/or input by the user. And simultaneously, the detector acquires body signals of the user corresponding to the stimulation information. And the cloud server pre-configures parameters of an emotion analysis algorithm based on at least two actual emotional state information and the body signal in a teaching process, and corrects theoretical emotional state information corresponding to the body signal.
Preferably, the method further comprises: storing, or providing to the cloud server, the feelings input by the user in text, voice, video and/or graphic form together with the corresponding automatically collected actual emotional states, in an associated manner; or recording the external conditions causing the user's actual emotional states and storing them, or providing them to the cloud server, in association with the corresponding actual emotional states; and analyzing the correlation between a specific emotional state of the user and the external conditions, and warning of the triggering of that specific emotional state based on the correlation.
Preferably, the method further comprises: and storing the current emotional state information input by the user and the external condition related to the actual emotional state information provided by the mobile terminal in a mutual correlation mode. The mobile terminal is arranged as a means for retrieving the actual emotional state information by the user in a manner correlated to the external conditions.
Preferably, the method further comprises: and correcting the theoretical emotional state determined based on the body signal analysis acquired by the detector according to the actual emotional state information, and storing an emotion management configuration file formed by the corrected theoretical emotional state in the mobile terminal in a manner of being capable of being retrieved according to the body signal.
Preferably, the method further comprises: the step of inputting the actual emotional state by the user comprises:
the user selects the type and level of the current emotion in a click-to-select manner, and/or
The user enters his own actual emotional state in text, voice, video or graphics.
Preferably, the method further comprises: storing and displaying the external condition information, the body signal information, the actual emotional state information and the feeling information in such a manner that the mapping associations of external conditions, body signals, actual emotional states and feelings gradually increase, wherein each external condition is associated with at least one body signal mapping, each body signal is associated with at least one actual emotional state mapping, and each actual emotional state is associated with at least one feeling mapping.
Preferably, the method further comprises: the actual emotional state and the theoretical emotional state are recorded based on an iterative analysis of at least one ambient condition, body signal, and/or sensation at each moment in time. And displaying the change trend of the external condition, the body signal change, the feeling change and the actual emotional state on the mobile terminal in a mode of displaying the identifier with mapping association. Alerting the user of extreme trends in the actual emotional state by means of color changes, sounds and/or vibrations, and/or displaying suggestions for changing the actual emotional state.
For example, a user with an actor as a job completes teaching with the cloud server 30 through the mobile terminal 10, so that the cloud server 30 determines an emotion analysis algorithm and completes correction of a theoretical emotional state during teaching.
When an actor needs to play, in a scene, a mother reunited with her son after a long separation, the actor can retrieve the associated emotional states through custom keywords. After the actor enters "long-awaited reunion" at the mobile terminal, several pieces of information related to emotional states are retrieved, including text, picture and video information containing external conditions, body characteristics and feelings. The actor then selects, as needed, the mother's emotion matching the character's characteristics and inputs an instruction to guide the emotion. The mobile terminal of the invention outputs stimulation information to the user, such as corresponding music and videos of similar scenes, guiding the user's emotion from the current calm toward slight happiness, while the actor inputs her own feelings into the mobile terminal so that the cloud server can analyze the actor's current actual emotional state and thus determine the process and steps of emotion guidance. Once it determines that the actor's emotion has been adjusted to slight happiness, the cloud server continues to apply stimulation information, adjusting the actor's emotion from "happy" to an excited joy on the verge of tears. While being guided, the actor keeps inputting her body signals and feelings into the mobile terminal. The cloud server analyzes the trend of the current actual emotional state from the actor's body signals and feelings. When it judges that the trend of the actor's actual emotional state will, within a foreseeable time, approach the emotion of a mother seeing her long-lost son again, it reminds the actor by vibration or sound. The actor adjusts her emotion to the target emotion and smoothly expresses the emotion and performs the scene.
It should be noted that the above-mentioned embodiments are exemplary, and that those skilled in the art, having the benefit of the present disclosure, may devise various arrangements which, although not explicitly described herein, embody the principles of the invention and fall within its scope. It should be understood that the present specification and figures are illustrative only and do not limit the claims. The scope of the invention is defined by the claims and their equivalents.

Claims (10)

1. An emotional state display terminal based on emotion guidance, wherein at least one mobile terminal automatically collects actual emotional state information and/or has it input by a user; characterized in that,
the mobile terminal is further configured to:
storing and displaying the external condition information, the body signal information, the actual emotional state information, and the sensation information in such a manner that the mapping association of the external condition, the body signal, the actual emotional state, and the sensation gradually increases, wherein,
each external condition is associated by mapping with at least one body signal,
each body signal is associated by mapping with at least one actual emotional state,
each actual emotional state is associated by mapping with at least one sensation.
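The one-to-many mapping chain of claim 1 (external condition → body signal → actual emotional state → sensation) can be sketched as layered dictionaries; all example keys and values are illustrative, not taken from the patent:

```python
# Layered one-to-many mappings: each external condition maps to one or
# more body signals, each body signal to one or more actual emotional
# states, and each emotional state to one or more sensations.
condition_to_signals = {"crowded station": ["elevated heart rate"]}
signal_to_states = {"elevated heart rate": ["anxious", "excited"]}
state_to_sensations = {"anxious": ["tight chest"], "excited": ["warmth"]}

def sensations_for(condition):
    """Walk the mapping chain from an external condition to sensations."""
    results = []
    for signal in condition_to_signals.get(condition, []):
        for state in signal_to_states.get(signal, []):
            results.extend(state_to_sensations.get(state, []))
    return results
```

As more records accumulate, each dictionary gains entries, which matches the claim's requirement that the mapping associations "gradually increase".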
2. The emotion state display terminal based on emotion guidance of claim 1, wherein the mobile terminal is further configured to:
in the case where the cloud server records the actual emotional state and the theoretical emotional state based on an iterative analysis of at least one external condition, body signal, and/or sensation at each moment, the external condition information, body signal information, actual emotional state information, and sensation information are stored and displayed in a manner such that the mapping associations of the external condition, body signal, actual emotional state, and sensation gradually increase.
3. The emotion state display terminal based on emotion guidance of claim 2, wherein the mobile terminal is further configured to:
the sensations input by the user and the actual emotional states, external conditions, and body signals associated with them are displayed in a circular array consisting of at least two circles, wherein each circle is divided into a number of spaces for recording information.
4. The emotion state display terminal based on emotion guidance of claim 3, wherein the mobile terminal is further configured to:
storing and displaying the sensations input by the user and the actual emotional states, external conditions, and body signals associated with them in a circular list consisting of four circles;
the circular list comprises outer ring marks and inner ring marks.
5. The emotional state display terminal based on emotion guidance of claim 4, wherein the inner ring marks comprise a first inner ring mark (11) and a second inner ring mark (12) whose radius is larger than that of the first inner ring mark;
the outer ring marks comprise a first outer ring mark (13) and a second outer ring mark (14) whose radius is larger than that of the first outer ring mark;
each space of the first inner ring mark (11) is used for storing an external condition associated with an actual emotional state;
each space of the second inner ring mark (12) is used for storing a physical feature associated with an actual emotional state;
each space of the first outer ring mark (13) is used for storing an actual emotional state;
each space of the second outer ring mark (14) is used for storing a sensation input by the user.
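The four-ring layout of claims 4-5 (inner to outer: external conditions, physical features, actual emotional states, sensations) can be sketched as a fixed set of rings, each divided into spaces; the ring names, space count, and sample entries are illustrative assumptions:

```python
# Four concentric rings, inner to outer, following claims 4-5:
# first inner (11): external conditions; second inner (12): physical
# features; first outer (13): actual emotional states; second outer
# (14): sensations input by the user.
RING_NAMES = ("external_condition", "physical_feature",
              "actual_emotional_state", "sensation")

def make_circular_list(spaces_per_ring=8):
    """Each ring is a fixed number of empty spaces for recorded entries."""
    return {name: [None] * spaces_per_ring for name in RING_NAMES}

def record(rings, ring_name, space_index, value):
    """Store one entry in the given space of the given ring."""
    rings[ring_name][space_index] = value
    return rings

rings = make_circular_list()
record(rings, "external_condition", 0, "rainy evening")
record(rings, "actual_emotional_state", 0, "calm")
```

Aligning the same space index across rings is one way to render the associated entries radially, as the circular display described in the claims suggests.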
6. The emotional state display terminal based on emotion guidance according to any one of the preceding claims, wherein the mobile terminal is further configured to:
retrieve, by the user, actual emotional state information stored at the cloud server (30) and/or the mobile terminal (10) in a manner associated with external conditions.
7. The emotion state display terminal based on emotion guidance of claim 6, wherein the mobile terminal displays a teaching emotion video generated by the cloud server in real time.
8. The emotion state display terminal based on emotion guidance of claim 7, wherein, during real-time teaching, the mobile terminal collects the user's emotional state information at each moment, the external conditions corresponding one-to-one to the time axis, and the user's sensations, and stores them under the user name; the cloud server performs theoretical emotion analysis according to the emotional state information and external conditions collected by the mobile terminal, repeating the analysis multiple times until teaching ends.
9. The emotion state display terminal based on emotion guidance of claim 8, wherein the mobile terminal is further configured to:
during teaching, apply stimulation information capable of inducing emotion to the user, and automatically collect the actual emotional state information corresponding to the stimulation information and/or have it input by the user.
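The teaching loop of claims 8-9 (apply a stimulus, collect the actual emotional state, re-run the theoretical emotion analysis until teaching ends) can be sketched as below; the function names and toy stand-ins for collection and analysis are illustrative, not from the patent:

```python
def teaching_session(stimuli, collect_state, analyze):
    """Apply each stimulus in turn, collect the user's actual emotional
    state, and repeat the theoretical emotion analysis until teaching ends."""
    log = []
    for stimulus in stimuli:
        actual = collect_state(stimulus)          # auto-collected or user-input
        theoretical = analyze(stimulus, actual)   # cloud-server-side analysis
        log.append((stimulus, actual, theoretical))
    return log

# Toy stand-ins for the collection and analysis steps.
log = teaching_session(
    ["calm music", "upbeat music"],
    collect_state=lambda s: "calm" if "calm" in s else "happy",
    analyze=lambda s, a: a,
)
```

In the terminal described here, `collect_state` would read sensors or user input and `analyze` would be performed by the cloud server; the log of (stimulus, actual, theoretical) triples is what lets the server correct its theoretical emotional state over the session.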
10. An emotional state display method based on emotion guidance, comprising at least the following step: a mobile terminal automatically collects actual emotional state information and/or has it input by a user; characterized in that,
the method further comprises the following steps:
storing and displaying the external condition information, the body signal information, the actual emotional state information, and the sensation information in such a manner that the mapping association of the external condition, the body signal, the actual emotional state, and the sensation gradually increases, wherein,
each external condition is associated by mapping with at least one body signal,
each body signal is associated by mapping with at least one actual emotional state,
each actual emotional state is associated by mapping with at least one sensation.
CN202110568055.2A 2018-02-02 2018-02-02 Emotion state display terminal and method based on emotion guidance Withdrawn CN113288144A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110568055.2A CN113288144A (en) 2018-02-02 2018-02-02 Emotion state display terminal and method based on emotion guidance

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810106194.1A CN108209946A (en) 2018-02-02 2018-02-02 A kind of emotion control apparatus and method
CN202110568055.2A CN113288144A (en) 2018-02-02 2018-02-02 Emotion state display terminal and method based on emotion guidance

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810106194.1A Division CN108209946A (en) 2018-02-02 2018-02-02 A kind of emotion control apparatus and method

Publications (1)

Publication Number Publication Date
CN113288144A true CN113288144A (en) 2021-08-24

Family

ID=62670524

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201810106194.1A Withdrawn CN108209946A (en) 2018-02-02 2018-02-02 A kind of emotion control apparatus and method
CN202110568055.2A Withdrawn CN113288144A (en) 2018-02-02 2018-02-02 Emotion state display terminal and method based on emotion guidance
CN202110568627.7A Withdrawn CN113288145A (en) 2018-02-02 2018-02-02 Teaching device and method for training emotion control capability

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810106194.1A Withdrawn CN108209946A (en) 2018-02-02 2018-02-02 A kind of emotion control apparatus and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110568627.7A Withdrawn CN113288145A (en) 2018-02-02 2018-02-02 Teaching device and method for training emotion control capability

Country Status (1)

Country Link
CN (3) CN108209946A (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109859822A (en) * 2019-01-15 2019-06-07 浙江强脑科技有限公司 Emotion adjustment method, device and computer readable storage medium
CN110881987B (en) * 2019-08-26 2022-09-09 首都医科大学 Old person emotion monitoring system based on wearable equipment
CN110916688A (en) * 2019-11-25 2020-03-27 西安戴森电子技术有限公司 Method for monitoring emotion based on artificial intelligence technology

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107025371A (en) * 2017-03-09 2017-08-08 安徽创易心理科技有限公司 A kind of mood is dynamically monitored and management method and system
CN107007291A (en) * 2017-04-05 2017-08-04 天津大学 Intense strain intensity identifying system and information processing method based on multi-physiological-parameter
CN107464188A (en) * 2017-06-23 2017-12-12 浙江大学 A kind of internet social networking application system based on Internet of Things mood sensing technology
CN107582077A (en) * 2017-08-17 2018-01-16 南京信息工程大学 A kind of human body state of mind analysis method that behavior is touched based on mobile phone
CN107456218A (en) * 2017-09-05 2017-12-12 清华大学深圳研究生院 A kind of mood sensing system and wearable device

Also Published As

Publication number Publication date
CN113288145A (en) 2021-08-24
CN108209946A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108310587B (en) Sleep control device and method
US10885800B2 (en) Human performance optimization and training methods and systems
US11696714B2 (en) System and method for brain modelling
CN108652648B (en) Depression monitoring device for depression of old people
US11635813B2 (en) Systems and methods for collecting, analyzing, and sharing bio-signal and non-bio-signal data
US10528121B2 (en) Smart wearable devices and methods for automatically configuring capabilities with biology and environment capture sensors
CA2935813C (en) Adaptive brain training computer system and method
EP2972678A1 (en) Wearable computing apparatus and method
CN108652587B (en) Cognitive dysfunction prevention monitoring devices
CN113288144A (en) Emotion state display terminal and method based on emotion guidance
CN110582811A (en) dynamic multisensory simulation system for influencing behavioral changes
CN112163518A (en) Emotion modeling method for emotion monitoring and adjusting system
CN110881987A (en) Old person emotion monitoring system based on wearable equipment
CN108492855A (en) A kind of apparatus and method for training the elderly's attention
CN110693508A (en) Multi-channel cooperative psychophysiological active sensing method and service robot
CN108461125B (en) Memory training device for the elderly
CN112006652B (en) Sleep state detection method and system
KR20220071677A (en) Method and apparatus for measuring degree of immersion for driver or contents viewer
CN117653106A (en) Emotion classification system and method
ES2957419A1 (en) System and method for real-time detection of emotional states using artificial vision and natural language listening (Machine-translation by Google Translate, not legally binding)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210824