CN115054248A - Emotion monitoring method and emotion monitoring device

Info

Publication number: CN115054248A
Authority: CN (China)
Prior art keywords: user, emotion, time period, signal, evaluation result
Legal status: Granted
Application number: CN202111510699.2A
Other languages: Chinese (zh)
Other versions: CN115054248B (en)
Inventors: 邸皓轩 (Di Haoxuan), 李丹洪 (Li Danhong), 张晓武 (Zhang Xiaowu)
Current Assignee: Honor Device Co Ltd
Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority: CN202111510699.2A; PCT/CN2022/119258 (published as WO2023103512A1)
Publication of CN115054248A; application granted; publication of CN115054248B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition

Abstract

The application provides an emotion monitoring method and an emotion monitoring device, which help improve the accuracy of emotion recognition. The method includes: acquiring a photoplethysmography (PPG) signal of a user in real time, acquiring a surface electromyography (SEMG) signal of the user in real time, and acquiring an acceleration (ACC) signal of the user in real time; counting classification results of the user's emotion at a plurality of times within a first time period by combining the PPG signal, the SEMG signal, and the user's age and gender information; determining an emotional state evaluation result of the user for the first time period based on the classification results of the user's emotion at the plurality of times; determining a sleep quality evaluation result of the user for the first time period by combining the PPG signal, the ACC signal, and the user's age and gender information; and determining whether to perform emotion early warning according to the emotional state evaluation result and the sleep quality evaluation result.

Description

Emotion monitoring method and emotion monitoring device
Technical Field
The present application relates to the field of terminal devices, and more particularly, to an emotion monitoring method and an emotion monitoring apparatus.
Background
In modern society, people pay increasing attention to physical and mental health. The accumulation of negative emotions such as long-term anxiety, tension, anger, depression, sadness, and pain can lead to psychological disorders such as mania, depression, autism, and anxiety disorder.
Research in psychology and physiology shows that physiological responses are closely related to a person's emotional state. To help users attend to their physiological and psychological health, wearable devices, typified by the smart band, are becoming widespread and can monitor a user's emotions, exercise, sleep, health indicators, and the like.
Currently, emotion monitoring is mainly based on physiological signal measurement. In one possible implementation, because different emotions exhibit different time-domain and frequency-domain features, a photoplethysmography (PPG) signal of a user is acquired, a pulse rate variability (PRV) sequence is extracted from the PPG signal, the PRV sequence is analyzed in the time domain and the frequency domain to obtain time-domain and frequency-domain features, and these features are used as the input of a neural network for emotion recognition.
However, this method cannot comprehensively reflect the physiological characteristics of different emotions, so its emotion recognition accuracy is not high.
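For illustration, such a prior-art pipeline might look as follows. This is a minimal sketch assuming a scipy-based peak detector; the function name, peak-spacing threshold, and feature set are illustrative rather than taken from the patent.

```python
import numpy as np
from scipy.signal import find_peaks

def prv_features(ppg: np.ndarray, fs: float):
    """Extract a PRV sequence from a PPG signal and compute simple
    time-domain and frequency-domain features (illustrative sketch)."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))  # pulse peaks at least 0.4 s apart
    prv = np.diff(peaks) / fs                           # peak-to-peak intervals in seconds
    time_feats = {
        "mean_interval": prv.mean(),
        "sdnn": prv.std(),
        "rmssd": np.sqrt(np.mean(np.diff(prv) ** 2)),
    }
    power = np.abs(np.fft.rfft(prv - prv.mean())) ** 2  # crude PRV power spectrum
    freq_feats = {"total_power": power.sum()}
    return time_feats, freq_feats
```

In the prior-art scheme, feature values like these would then serve as the input of a neural-network classifier.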
Disclosure of Invention
The application provides an emotion monitoring method and an emotion monitoring device, which are beneficial to improving the accuracy of emotion recognition.
In a first aspect, an emotion monitoring method is provided, applied to a terminal device having a pulse wave sensor, a myoelectric electrode sensor, and an acceleration sensor. The method includes: acquiring a photoplethysmography (PPG) signal of a user in real time through the pulse wave sensor, acquiring a surface electromyography (SEMG) signal of the user in real time through the myoelectric electrode sensor, and acquiring an acceleration (ACC) signal of the user in real time through the acceleration sensor; counting classification results of the user's emotion at a plurality of times within a first time period by combining the PPG signal, the SEMG signal, and the user's age and gender information, where a classification result is a positive emotion, a calm emotion, or a negative emotion; determining an emotional state evaluation result of the user for the first time period based on the classification results of the user's emotion at the plurality of times; determining a sleep quality evaluation result of the user for the first time period by combining the PPG signal, the ACC signal, and the user's age and gender information; and determining whether to perform emotion early warning according to the emotional state evaluation result and the sleep quality evaluation result.
In this application, the terminal device obtains the emotional state evaluation result of the user for the first time period by combining the PPG signal, the SEMG signal, the ACC signal, and the user's age and gender information, determines the sleep quality evaluation result of the user for the first time period by combining the PPG signal, the ACC signal, and the user's age and gender information, and considers the user's emotion comprehensively according to both evaluation results, which improves emotion recognition accuracy and provides the user with a better mental health service.
With reference to the first aspect, in certain implementations of the first aspect, the counting, in combination with the PPG signal, the SEMG signal, and the user's age and gender information, of classification results of the user's emotion at a plurality of times within the first time period includes: performing feature extraction on the PPG signal, the SEMG signal, the age information, and the gender information at a first time among the plurality of times to obtain multiple groups of biological features of the user at the first time; performing feature fusion on the multiple groups of biological features to obtain a fusion feature of the user at the first time; and obtaining a classification result of the user's emotion at the first time based on the fusion feature at the first time.
In the application, the terminal equipment can acquire multiple groups of biological characteristics of the user at the first moment through characteristic extraction, the interpretability and the characterization capability of the fused characteristics are strong, and emotion recognition is performed based on the fused characteristics, so that the accuracy rate of emotion recognition is improved.
With reference to the first aspect, in some implementations of the first aspect, performing feature fusion on the multiple groups of biological features at the first time to obtain the fusion feature of the user at the first time includes: calculating the weight of each group among the multiple groups of biological features through the softmax function; and obtaining the fusion feature of the user at the first time according to each group of biological features and its weight.
In the application, different physiological characteristics have different weight values, so that the importance degrees of different physiological signals can be obtained, and the emotion recognition result is more accurate and reliable.
With reference to the first aspect, in certain implementations of the first aspect, the plurality of sets of biometrics comprises at least one of: heart rate characteristics, respiration characteristics, blood oxygen characteristics, blood glucose characteristics, blood pressure characteristics, or epidermal myoelectric characteristics.
With reference to the first aspect, in certain implementations of the first aspect, determining an emotional state evaluation result of the user at the first time period based on classification results of emotions of the user at a plurality of times within the first time period includes: the number of negative emotions of the user in the first time period and the duration of each negative emotion are obtained from the classification results of the emotions at a plurality of times. And determining the emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion.
In the application, the terminal equipment can acquire the times of negative emotions of the user in the first time period and the duration of each negative emotion, and the longer the accumulated negative emotion time is, the lower the emotion state evaluation score is, so that the emotion of the user in the first time period can be accurately evaluated.
With reference to the first aspect, in certain implementations of the first aspect, determining a sleep quality assessment result of the user over a first time period with reference to the PPG signal, the ACC signal, age information of the user, and gender information of the user includes: and obtaining the sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axis. Based on the PPG signal, heart rate variation information of the user in a first time period is obtained. And obtaining the total sleeping time, the deep sleeping time and the light sleeping time of the user in the first time period according to the heart rate change information. And determining a sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep time, the deep sleep time and the light sleep time.
In this application, because the user's sleep may differ under different emotions, the terminal device obtains the user's sleep evaluation result from the heart rate change information during the sleep period, and using this result helps improve emotion recognition accuracy.
With reference to the first aspect, in some implementations of the first aspect, determining whether to perform emotion early warning according to the emotional state evaluation result and the sleep quality evaluation result includes: and obtaining the emotion monitoring result of the user in the first time period according to the emotion state evaluation result and the sleep quality evaluation result. And determining to carry out emotion early warning under the condition that the emotion monitoring result meets the preset condition.
In the application, if the terminal device monitors that the emotion monitoring result of the user in the first time period meets the preset condition, the terminal device can perform emotion early warning on the user, so that the psychological health of the user can be concerned in time, and the use experience of the user is improved.
With reference to the first aspect, in certain implementations of the first aspect, obtaining the emotion monitoring result of the user in the first time period according to the emotional state evaluation result and the sleep quality evaluation result includes: performing weighted summation on the quantized emotional state evaluation result and the quantized sleep quality evaluation result to obtain an emotion score of the user for the first time period. Determining to perform emotion early warning when the emotion monitoring result meets the preset condition includes: determining to perform emotion early warning when the emotion score is less than or equal to a preset threshold.
With reference to the first aspect, in certain implementations of the first aspect, the emotional age of the user is determined according to the emotion monitoring result of the user over the first time period, the number of negative emotions, and age information of the user. Displaying at least one of an emotional age, an emotional monitoring result, or an emotional statement.
In the application, the terminal device can display at least one of the emotional age, the emotional monitoring result or the emotional sentence to the user, so that the emotion adjustment of the user is facilitated.
With reference to the first aspect, in certain implementations of the first aspect, peaks and troughs of the ACC signal are acquired. And determining whether the user is in a non-motion state according to the wave crest and the wave trough of the ACC signal. Combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user, counting the emotion classification results of the user at a plurality of times in the first time period, including: under the condition that the user is in a non-motion state, counting the classification results of the emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user.
In this application, because the PPG signal suffers strong motion interference, emotion recognition needs to be performed in a non-motion state. The terminal device can therefore judge from the ACC signal whether the user is in a non-motion state; if the user is in a motion state, the collected PPG and SEMG signals are invalid and need to be re-collected. This helps obtain valid PPG and SEMG signals and improves emotion recognition accuracy.
In a second aspect, an emotion monitoring apparatus is provided, configured to perform the method in any one of the possible implementations of the first aspect. In particular, the apparatus includes units for performing the method in any one of the possible implementations of the first aspect.
In a third aspect, there is provided another emotion monitoring apparatus, comprising a processor, coupled to a memory, and configured to execute instructions in the memory to implement the method in any of the possible implementations of the first aspect. Optionally, the apparatus further comprises a memory. Optionally, the apparatus further comprises a communication interface, the processor being coupled to the communication interface.
In one implementation, the emotion monitoring device is a terminal device. When the emotion monitoring apparatus is a terminal device, the communication interface may be a transceiver, or an input/output interface.
In another implementation manner, the emotion monitoring device is a chip configured in the terminal device. When the emotion monitoring apparatus is a chip configured in the terminal device, the communication interface may be an input/output interface.
In a fourth aspect, a processor is provided, comprising: input circuit, output circuit and processing circuit. The processing circuit is configured to receive a signal via the input circuit and transmit a signal via the output circuit, so that the processor performs the method of any one of the possible implementations of the first aspect.
In a specific implementation process, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be a transistor, a gate circuit, a flip-flop, various logic circuits, and the like. The input signal received by the input circuit may be received and input by, for example but not limited to, a receiver; the signal output by the output circuit may be output to and transmitted by, for example but not limited to, a transmitter; and the input circuit and the output circuit may be the same circuit serving as the input circuit and the output circuit at different times. The specific implementations of the processor and the various circuits are not limited in this application.
In a fifth aspect, a processing apparatus is provided that includes a processor and a memory. The processor is configured to read instructions stored in the memory, and may receive signals via the receiver and transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, there are one or more processors and one or more memories.
Alternatively, the memory may be integrated with the processor, or provided separately from the processor.
In a specific implementation process, the memory may be a non-transitory (non-transitory) memory, such as a Read Only Memory (ROM), which may be integrated on the same chip as the processor, or may be separately disposed on different chips, and the type of the memory and the arrangement manner of the memory and the processor are not limited in this application.
It will be appreciated that in a related data interaction process, for example, sending indication information may be a process of outputting the indication information from the processor, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, data output by the processor may be output to a transmitter, and input data received by the processor may come from a receiver. The transmitter and the receiver may be collectively referred to as a transceiver.
The processing device in the fifth aspect may be a chip, the processor may be implemented by hardware or software, and when implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, which may be integrated with the processor, located external to the processor, or stand-alone.
In a sixth aspect, there is provided a computer program product comprising: computer program (also called code, or instructions), which when executed, causes a computer to perform the method of any of the possible implementations of the first aspect described above.
In a seventh aspect, a computer-readable storage medium is provided, which stores a computer program (which may also be referred to as code or instructions) that, when executed on a computer, causes the computer to perform the method in any of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic diagram of a terminal device to which an embodiment of the present application is applicable;
fig. 2 is a schematic physical structure diagram of a terminal device according to an embodiment of the present application;
fig. 3 is a schematic flow chart of an emotion monitoring method provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of an emotion classification process provided by an embodiment of the present application;
FIG. 5 is a graph illustrating the variation of the heart rate during sleep according to an embodiment of the present application;
fig. 6 is a schematic system architecture diagram of a terminal device according to an embodiment of the present application;
fig. 7 is an interface schematic diagram of a terminal device according to an embodiment of the present application;
fig. 8 is a schematic waveform diagram of an ACC signal provided in an embodiment of the present application;
FIG. 9 is a schematic flow chart diagram of another emotion monitoring method provided by an embodiment of the present application;
fig. 10 is a schematic block diagram of an emotion monitoring apparatus provided in an embodiment of the present application;
fig. 11 is a schematic block diagram of another emotion monitoring device provided in an embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
Before introducing the emotion monitoring method and emotion monitoring apparatus provided in the embodiments of the present application, the following description is made.
First, in the embodiments shown below, terms and english abbreviations such as PPG signals, ACC signals, pulse wave sensors, etc. are given as illustrative examples for convenience of description, and should not limit the present application in any way. This application is not intended to exclude the possibility that other terms may be defined in existing or future protocols to carry out the same or similar functions.
Second, the first, second and various numerical numbers in the embodiments shown below are merely for convenience of description and are not intended to limit the scope of the embodiments of the present application. For example, to distinguish between different terminal devices, etc.
Third, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a alone, A and B together, and B alone, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, and c, may represent: a, or b, or c, or a and b, or a and c, or b and c, or a, b and c, wherein a, b and c can be single or multiple.
Generally, emotions can be classified into three categories, calm, positive, and negative, wherein positive emotions can include excited, relaxed, surprised, and the like, and negative emotions can include anxious, nervous, angry, depressed, sad, painful, and the like.
Emotional age is a measure of the level of emotional development. Different age groups correspond to different levels of emotional development, and a person's emotion has a corresponding emotional age, that is, the emotional behavior typical of a given age. Generally, people's ability to manage emotions strengthens gradually with age, from the willfulness of early childhood and the liveliness of youth, through the vigor of young adulthood and the restraint of the prime of life, to the steadiness of middle and old age. However, some people's emotional maturity does not keep pace with their physical maturity. Therefore, according to the time law of the development of the emotional quotient (EQ), one can attempt to measure a person's emotional age using a criterion that differentiates levels of emotion-control ability by age. Table 1 is an exemplary EQ lookup table.
Table 1 (EQ lookup table; presented as an image in the original publication)
In this application, emotional age can be generalized into four periods: the juvenile period, the active period, the stationary period, and the wisdom period.
The juvenile period corresponds to Level 1 in Table 1; emotions in this period change rapidly and are difficult to control, and each emotion lasts only a short time.
The active period corresponds to Level 2 in Table 1; emotions in this period are easily affected by schooling, career, love, and family relationships, negative emotions are easily aroused and last a long time, and there is a certain capacity for self-regulation.
The stationary period corresponds to Levels 3 and 4 in Table 1; emotions in this period are restrained, negative emotions are not easily aroused, and the ability to control them is strong.
The wisdom period corresponds to Levels 5, 6, and 7 in Table 1; emotional changes in this period are small.
Gender differences also produce differences in emotion. Studies have found that women express emotion more strongly than men, that is, women tend to amplify the emotions they feel, while men show more pronounced heart rate changes than women, that is, men experience emotion more intensely and are more affected by it, but for various reasons are unwilling to express it.
Emotions may cause physiological responses, for example changes in autonomic nervous activity, including the breathing rate, the speed of exhalation and inhalation, breathing quality, heart rate, vascular volume, blood pressure, electrodermal activity, and endocrine and exocrine gland activity. For another example, brain waves such as alpha, beta, delta, and theta waves vary with emotion, with the alpha wave varying the most. In addition, emotions may also cause adverse reactions such as dizziness and distending pain.
Respiratory changes under different emotions: illustratively, breathing is 20 breaths/minute at rest. Compared with a calm mood, breathing when happy has modest depth, a slightly faster rate, and a relatively regular rhythm, for example 23 breaths/minute. When sad, breathing is slow with long pauses, for example 9 breaths/minute. In fear, the breathing rate is very fast with intermittent pauses and irregular amplitude, presenting a trembling pattern, for example 64 breaths/minute. When angry, the breathing rate increases and the breathing depth increases dramatically, for example 40 breaths/minute.
Heart rate changes under different emotions: the heart rate is 60-80 beats/minute when calm. When happy, adrenal hormones accelerate the heart rate while the rhythm remains regular. When sad, the heartbeat accelerates and the arteries contract, and in severe cases arrhythmia occurs. In fear, the heartbeat accelerates, accompanied by arrhythmia. When angry, the heartbeat accelerates with a feeling of palpitation.
Negative emotions may produce changes in the blood, muscles, sleep, electrical signals, and so on. Negative emotions can increase the secretion of stress hormones and vasoactive substances such as cortisol, norepinephrine, epinephrine, and catecholamines, leading to systemic vasoconstriction, an increased heart rate, increased blood pressure, and abnormal blood oxygen content. Negative emotions can cause muscle tension and contraction. Negative emotions can reduce sleep quality, causing difficulty falling asleep and excessive dreaming.
Investigation of the physiological changes accompanying emotion changes shows that respiration, heart rate, blood pressure, blood glucose, blood oxygen, electromyographic signals, and the like differ across emotions, and that positive and negative emotions change these physiological indicators in different ways. At present, photoplethysmography (PPG) signals can be applied to monitoring physiological indicators such as heart rate, blood pressure, blood oxygen, blood glucose, and respiration.
Illustratively, blood glucose monitoring may be based on infrared light in the wavelength ranges of 1250 to 1333 nm, 1600 to 1666 nm, or 2025 nm. The heart rate can be calculated from the time-series peaks and frequency-domain characteristics of the PPG signal.
Illustratively, for blood oxygen monitoring, blood oxygen saturation can be estimated by detecting oxyhemoglobin (HbO2) and hemoglobin (Hb) with red (600 nm to 800 nm) and near-infrared (800 nm to 1000 nm) light, respectively.
Illustratively, since pulse wave velocity is directly related to blood pressure, the pulse wave travels fast when blood pressure is high and slowly when it is low. Therefore, the systolic and diastolic pressures of the human pulse wave can be estimated by establishing a characteristic equation, realizing noninvasive continuous blood pressure monitoring.
Illustratively, the respiratory signal may be extracted from the PPG signal by wavelet decomposition or empirical mode decomposition (EMD).
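As one example of the wavelet route, a minimal sketch using the PyWavelets package is shown below; the wavelet name, decomposition level, and helper name are illustrative assumptions, not taken from the patent.

```python
import numpy as np
import pywt

def respiration_from_ppg(ppg: np.ndarray, wavelet: str = "db4", level: int = 6) -> np.ndarray:
    """Recover the slow respiratory baseline of a PPG signal by wavelet
    decomposition, discarding the fast pulsatile detail coefficients."""
    coeffs = pywt.wavedec(ppg, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]  # keep coarse approximation only
    resp = pywt.waverec(coeffs, wavelet)
    return resp[: len(ppg)]  # waverec may pad by one sample
```

The breathing rate would then follow from the peaks of the recovered baseline.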
Fig. 1 is a schematic diagram of a hardware structure of a terminal device 100 to which an embodiment of the present application is applicable. As shown in fig. 1, the terminal device 100 includes a pulse wave sensor 101, a myoelectric electrode sensor 102, an acceleration sensor 103, and a signal processing unit 104.
The pulse wave sensor 101 obtains a PPG signal of the user, and sends the pulse wave signal to the signal processing unit 104. The myoelectric electrode sensor 102 is used for acquiring an SEMG signal of a user and sending the SEMG signal to the signal processing unit 104. The acceleration sensor 103 is configured to acquire an ACC signal of the user and send the ACC signal to the signal processing unit 104. The signal processing unit 104 is used for receiving the PPG signal, the SEMG signal, and the ACC signal to monitor and identify the emotion of the user.
Optionally, the terminal device 100 may further comprise a display 105, a storage unit 106, interaction hardware 107, a wireless unit 108 and a battery 109.
The user can perform touch and other operations on the terminal device 100 through the interaction hardware 107 to realize the user interaction operation with the terminal device 100. After the signal processing unit 104 obtains the emotion monitoring result of the user, the emotion monitoring result may be displayed to the user through the display 105, and stored in the storage unit 106. The signal processing unit 104 may also send the emotion monitoring result to other terminal devices or a cloud terminal associated with the terminal device 100 through the wireless unit 108.
Terminal equipment in the embodiments of the present application may refer to user equipment, access terminals, subscriber units, subscriber stations, mobile stations, remote terminals, mobile devices, user terminals, wireless communication devices, user agents, or user devices. The terminal in the embodiment of the present application may be a mobile phone (mobile phone), a tablet computer (pad), a computer with wireless transceiving function, a Virtual Reality (VR) terminal, an Augmented Reality (AR) terminal, a Mixed Reality (MR) terminal, an extended reality (XR) terminal, a holographic display terminal, a wireless terminal in industrial control (industrial control), a wireless terminal in unmanned driving (self driving), or other processing devices connected to a wireless modem, a vehicle-mounted device, a terminal in a 5G network, or a terminal in a future evolution network, etc.
By way of example and not limitation, in the embodiments of this application the terminal device may also be a wearable device. A wearable device, also called a wearable smart device, is the general term for devices that apply wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it realizes powerful functions through software support, data interaction, and cloud interaction. Broadly, wearable smart devices include devices that are full-featured and large-sized and can realize all or part of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on a certain type of application function and must be used with other devices such as smartphones, for example various smart bands and smart jewelry for physical sign monitoring.
In addition, the terminal device may also be a terminal device in an Internet of things (IoT) system. The IoT is an important component of future information technology development; its main technical feature is connecting things to a network through communication technology, thereby realizing an intelligent network of human-machine and thing-thing interconnection. The specific form of the terminal device is not limited in this application.
It should be understood that in the embodiment of the present application, the terminal device may be an apparatus for implementing a function of the terminal device, or may be an apparatus capable of supporting the terminal device to implement the function, such as a chip system, and the apparatus may be installed in the terminal. In the embodiment of the present application, the chip system may be formed by a chip, and may also include a chip and other discrete devices.
In the following, the terminal device 100 is described as an example of a smart watch. Fig. 2 is a schematic physical structure diagram of a smart watch 200 according to an embodiment of the present application. As shown in fig. 2, the smart watch 200 includes a dial 201 and a band 202. The pulse wave sensor 101 and the myoelectric electrode sensor 102 shown in fig. 1 may be located at the bottom of the dial 201, and the acceleration sensor 103 and the signal processing unit 104 may be located inside the dial 201. The user may wear the terminal device 100 on the wrist through the band 202 to acquire PPG signals, SEMG signals, and ACC signals of the user. Optionally, the dial 201 is configured with the display screen 105, the user may input age information and/or gender information through touch operation on the display screen 105, and the PPG signal, the SEMG signal, the ACC signal collected by the terminal device and the input age information and/or gender information of the user may be stored in the storage unit 106.
It should be noted that the terms of orientation such as "top" and "bottom" used by the terminal device 100 in the embodiments of the present application are mainly used for describing the device, and do not form a limitation on the orientation of the terminal device 100 in an actual application scenario.
Fig. 3 is a schematic flow chart of an emotion monitoring method 300 provided in an embodiment of the present application. The method 300 may be applied to a terminal device deployed with a pulse wave sensor, a myoelectric electrode sensor, and an acceleration sensor, a hardware structure of the terminal device in the embodiment of the present application may be the hardware structure of the terminal device 100 shown in fig. 1, and an entity structure may be the smart watch 200 shown in fig. 2, which is not limited in the embodiment of the present application. The method 300 includes the steps of:
s301, PPG signals of a user are obtained in real time through a photoplethysmography sensor, epidermic myoelectricity SEMG signals of the user are obtained in real time through a myoelectricity electrode sensor, and ACC signals of the user are obtained in real time through an acceleration sensor.
And S302, counting the classification results of the emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user, wherein the classification results comprise positive emotion, calm emotion or negative emotion.
And S303, determining the emotional state evaluation result of the user in the first time period based on the classification result of the emotion of the user at a plurality of moments in the first time period.
And S304, determining the sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, the age information of the user and the gender information of the user.
S305, determining whether to carry out emotion early warning according to the emotion state evaluation result and the sleep quality evaluation result.
The pulse wave sensor may be a PPG sensor, a pressure sensor, or an ultrasound sensor, for example.
For example, in the embodiment of the present application, the PPG signal, the SEMG signal, the ACC signal, the age information of the user, and the gender information of the user may be collectively referred to as a physiological signal.
In this embodiment, the terminal device 100 can obtain emotion classification results of the user at multiple measurement times within the first time period from the collected PPG signal, the SEMG signal, and the user's age and gender information. The terminal device 100 can also obtain the sleep quality evaluation result of the user for the first time period from the collected PPG signal, the ACC signal, and the user's age and gender information, and combine it with the emotional state evaluation result for the first time period to obtain the user's emotion monitoring result. Because the PPG signal, the SEMG signal, the ACC signal, and the user's age and gender information are combined, and the user's sleep condition is also considered, a more accurate emotion monitoring result is obtained, so an emotion early warning can be provided in time for the user's mental health.
As an alternative embodiment, S302 includes: performing feature extraction on the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user at a first time among the plurality of times to obtain multiple groups of biological features of the user at the first time; performing feature fusion on the multiple groups of biological features at the first time to obtain a fusion feature of the user at the first time; and obtaining a classification result of the user's emotion at the first time based on the fusion feature at the first time.
In the embodiment of the present application, the emotion classification process at the first time is described by taking the first time in the plurality of times in the first time period as an example. It should be understood that the first time may be any time among the plurality of times, and at different times, the user may be influenced by surroundings to generate behavior and emotion changes, so that each time has its corresponding PPG signal, SEMG signal and ACC signal.
Illustratively, the first time period is 24 hours, and the terminal device 100 may classify the user's emotion once an hour according to the collected physiological signals. In this example the plurality of times is 24 times, and the terminal device 100 obtains 24 emotion classification results within the first time period.
Illustratively, a positive emotion may be represented by a value of 1, a calm emotion by a value of 0, and a negative emotion by a value of-1.
Fig. 4 is a schematic diagram of an emotion classification process provided in an embodiment of the present application. Exemplarily, fig. 4 shows the emotion classification process at the first time described above. As shown in fig. 4, the terminal device 100 may perform feature extraction on the acquired SEMG signal, PPG signal, age information, and gender information to obtain multiple sets of biological features, perform feature fusion on the multiple sets of biological features to obtain fusion features, and input the fusion features into a classification network for classification to obtain an emotion classification result.
Optionally, the terminal device 100 performs feature extraction on the SEMG signal, including preprocessing the SEMG signal to obtain a preprocessed SEMG signal.
Illustratively, the preprocessing of the SEMG signal may be filtering the SEMG signal using a wavelet transform. In the embodiment of the present application, the biological characteristics obtained by the filtered SEMG signal may be referred to as epidermal myoelectric characteristics.
Optionally, the terminal device 100 performs feature extraction on the age information and the gender information of the user, including the terminal device 100 preprocessing the age information and the gender information of the user. For example, the pre-processing of the age information and the gender information of the user may be a numerical processing of the age information and the gender information of the user.
Illustratively, the actual age of the user may be classified and then numerically processed, for example, 0 to 10 years may be represented by a value 1, 11 to 20 years may be represented by a value 2, 21 to 30 years may be represented by a value 3, 31 to 40 years may be represented by a value 4, 41 to 50 years may be represented by a value 5, 51 to 60 years may be represented by a value 6, 61 to 70 years may be represented by a value 7, and 70 years or older may be represented by a value 8.
Illustratively, men may be represented by a value of 0 and women by a value of 1.
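A minimal sketch of this numerical pre-processing is shown below; the helper name and its exact signature are illustrative assumptions.

```python
def encode_demographics(age: int, gender: str) -> tuple[int, int]:
    """Bucket age into the decade codes 1-8 described above and binarize
    gender (male -> 0, female -> 1)."""
    age_code = min(max((age - 1) // 10 + 1, 1), 8)  # 0-10 -> 1, 11-20 -> 2, ..., >70 -> 8
    gender_code = 0 if gender == "male" else 1
    return age_code, gender_code

print(encode_demographics(35, "female"))  # (4, 1)
```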
Optionally, the terminal device 100 performs feature extraction on the PPG signal, including preprocessing the PPG signal. The terminal device 100 may input the preprocessed PPG signals to a feature extraction network to perform feature extraction, so as to obtain multiple groups of biological features of the user, and may obtain a feature value corresponding to each group of biological features according to regression of the biological features of the user.
Illustratively, the pre-processing of the PPG signal may be filtering the PPG signal, e.g., using a wavelet transform to achieve filtering of the PPG signal.
In one possible implementation, the feature extraction network may perform signal decomposition and frequency domain signal transformation on the preprocessed PPG signal to extract the biometric features.
When performing signal decomposition on the preprocessed PPG signal, for example, the terminal device 100 may detect the time-domain features of the preprocessed PPG signal, extract the time-domain feature values and the optimal band, segment the extracted optimal band into multiple single-cycle waveforms, and perform sparse decomposition on the single-cycle waveforms to obtain the atomic feature parameters of the signal.
When the frequency domain signal transformation is performed on the preprocessed PPG signal, for example, a fourier transform may be used to obtain a frequency domain feature value of the PPG signal.
After performing signal decomposition and frequency domain signal transformation on the preprocessed PPG signal, terminal device 100 may perform feature fusion on the atomic feature parameter features, the time domain feature values, the frequency domain feature values, and the preprocessed PPG signal.
Illustratively, the feature fusion can be realized by means of feature splicing or feature summation.
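As an illustration of the splicing option, the sketch below concatenates a few time-domain values with Fourier-derived frequency-domain values; the particular features chosen are assumptions for illustration only.

```python
import numpy as np

def fuse_by_splicing(ppg_window: np.ndarray, fs: float) -> np.ndarray:
    """Concatenate time-domain and frequency-domain feature values of a
    preprocessed PPG window into a single fused vector."""
    time_feats = np.array([ppg_window.mean(), ppg_window.std(), np.ptp(ppg_window)])
    spectrum = np.abs(np.fft.rfft(ppg_window))           # frequency-domain values
    freqs = np.fft.rfftfreq(ppg_window.size, d=1.0 / fs)
    freq_feats = np.array([freqs[spectrum.argmax()], spectrum.max()])
    return np.concatenate([time_feats, freq_feats])      # feature splicing
```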
In one possible implementation, the feature extraction network may be a time-series network, for example, a gated recurrent unit (GRU) network. The GRU is a variant of the recurrent neural network (RNN) that can retain important features of the input data through gate functions. In this embodiment, at least one of the user's heart rate feature, respiration feature, blood oxygen feature, blood glucose feature, and blood pressure feature can be extracted through the feature extraction network, as sketched below.
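A minimal sketch of such a GRU-based extractor follows, written in PyTorch as an assumed framework (the patent names no framework); the layer sizes and window length are illustrative.

```python
import torch
import torch.nn as nn

class PPGFeatureExtractor(nn.Module):
    """Run a GRU over the preprocessed PPG sequence and use the final hidden
    state as the extracted biological feature vector."""
    def __init__(self, input_size: int = 1, hidden_size: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size, hidden_size, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_size)
        _, h_n = self.gru(x)        # h_n: (1, batch, hidden_size)
        return h_n.squeeze(0)       # (batch, hidden_size)

features = PPGFeatureExtractor()(torch.randn(8, 250, 1))  # e.g. 250-sample windows
```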
It should be understood that the above manners of extracting the features of the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user are only examples, and the embodiment of the present application does not limit this.
As described above, respiration, heart rate, blood pressure, blood glucose, blood oxygen, electromyographic signals, and the like differ under different emotions, and positive and negative emotions change these physiological indicators in different ways. The biological features acquired in this embodiment can therefore clearly reflect the changes of biological signals under different emotions and have strong interpretability and characterization capability.
As an optional embodiment, performing feature fusion on the multiple groups of biological features at the first time to obtain the fusion feature of the user at the first time includes: calculating the weight of each group among the multiple groups of biological features through the softmax function; and obtaining the fusion feature of the user at the first time according to each group of biological features and its weight.
As shown in fig. 4, after obtaining the plurality of sets of biometric features, the terminal device 100 may perform feature fusion on the plurality of sets of biometric features to obtain a fused feature. Illustratively, the feature fusion in the embodiments of the present application is adaptive feature fusion, and each set of biometric features has a corresponding feature weight. The feature weight is related to the samples, and can be adjusted in a self-adaptive mode according to different samples, so that the importance of the features with different dimensions can be obtained, and the expression capability of the fusion features is higher.
Illustratively, there are N groups of biological features in total, expressed in feature-matrix form as P = [P_1; P_2; ...; P_N]. Each group of biological features has M elements, so the dimension of the feature matrix P is N × M.
For example, the weight of each group of biological features is calculated by the weighting function F(W, P) = softmax(W_{M×N} · P · P^T + B_{M×1}, axis = 0), where W_{M×N} is a trainable weight matrix of dimension M × N, B_{M×1} is a trainable bias matrix of dimension M × 1, and axis = 0 indicates that softmax is executed separately for each row. The result is a weight matrix of dimension M × N.
After the weight matrix F(W, P) is obtained, the M × N weight matrix F(W, P) is matrix-multiplied by the N × M biological feature matrix P, so that each group of biological features is assigned its corresponding weight, and the fused feature matrix is then obtained through the diagonal function Diag(·), i.e., Feature = Diag(F(W, P) · P), where Diag(·) means taking the diagonal elements to form a new matrix.
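A minimal NumPy sketch of this adaptive fusion is given below, assuming the dimensions stated above; in NumPy terms the per-row softmax corresponds to axis=1.

```python
import numpy as np

def softmax(x: np.ndarray, axis: int) -> np.ndarray:
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_feature_fusion(P: np.ndarray, W: np.ndarray, B: np.ndarray) -> np.ndarray:
    """P: (N, M) feature groups; W: (M, N) trainable weights; B: (M, 1) bias.
    Returns the fused M-element feature vector Diag(F(W, P) x P)."""
    F = softmax(W @ P @ P.T + B, axis=1)  # (M, N) weight matrix, softmax per row
    return np.diag(F @ P)                 # diagonal of the (M, M) product

rng = np.random.default_rng(0)
N, M = 6, 16                              # e.g. 6 biological feature groups of 16 elements
fused = adaptive_feature_fusion(rng.standard_normal((N, M)),
                                rng.standard_normal((M, N)),
                                rng.standard_normal((M, 1)))
assert fused.shape == (M,)
```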
As an alternative embodiment, S303 includes: and acquiring the times of negative emotions of the user in the first time period and the duration of each negative emotion from the classification results of the emotions at a plurality of times. And determining the emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion.
In this embodiment, assume the first time period is 24 hours and the terminal device 100 performs emotion recognition on the user every hour; the first time period can then be divided into 24 sub-periods, each lasting 1 hour. The terminal device 100 may obtain an emotion classification result for each sub-period based on the PPG signal, the SEMG signal, and the user's age and gender information acquired in that sub-period, and represent the emotion recognition results of the sub-periods as the set (0, 0, 0, 0, 0, 1, 0, 0, 0, -1, 0, 0, 0, -1, 0, 0, 0, 0, -1, -1, 0, 1), where 0 represents a calm emotion, 1 represents a positive emotion, and -1 represents a negative emotion.
The terminal device 100 may obtain the number of negative emotions in the first time period from the set of emotion classification results, and in this example, it can be seen that the user has 3 negative emotions in the first time period. Further, the terminal device 100 may obtain the duration of each negative emotion according to changes of the respiration, the heart rate, the blood pressure, the blood sugar, the blood oxygen, the myoelectric signal, and the like reflected by the PPG signal and the SEMG signal acquired in the sub-period in which the negative emotion occurs to the user.
Illustratively, the PPG signal and the SEMG signal are acquired continuously, and feature extraction may be performed on them every 1 s, every 5 s, or every 10 s. The duration of a negative emotion accumulates from the initial time at which the terminal device 100 detects the negative emotion until the terminal device 100 no longer detects it; the time accumulated from that initial time to the end time is the duration of the negative emotion.
Illustratively, the emotional state evaluation result of the user in the first time period can be embodied in the form of scores. The emotional state evaluation score _ e can be obtained by the following formula:
score_e = 100 × (1 − (t_1 + t_2 + … + t_n) / T)
where T represents the duration of the first time period, n represents the number of occurrences of negative emotions within the first time period, and t_i represents the duration of the i-th negative emotion. As the formula shows, the longer the accumulated negative-emotion time, the lower the emotional state evaluation score.
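A minimal sketch of this computation, assuming the linear form shown above, is:

```python
def emotional_state_score(T: float, negative_durations: list[float]) -> float:
    """score_e falls linearly as the accumulated negative-emotion time grows."""
    return 100.0 * (1.0 - sum(negative_durations) / T)

# e.g. a 24-hour period with three negative episodes of 0.5 h, 1 h and 2 h:
print(emotional_state_score(24.0, [0.5, 1.0, 2.0]))  # ~85.4
```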
Alternatively, the terminal device 100 may display the real-time emotion obtained at each time in the first period to the user. It should be understood that the real-time emotion may be embodied in the form of sentences and images, and the embodiment of the present application is not limited thereto.
Illustratively, when the emotion classification result is a positive emotion, the terminal device 100 may display "keep good mood! "," mood is not wrong! The refueling is continued! "the day of full primordial qi" or "the mood bar is Da-".
Illustratively, when the emotion classification result is a positive emotion, the terminal device 100 may also display a pattern such as "sun", "like", "applause", or "cheering".
Illustratively, when the emotion classification result is a negative emotion, the terminal device 100 may display a sentence such as "Come on, it will pass!", "There is always sunshine behind the clouds", or "The dark clouds will soon clear".
Illustratively, when the emotion classification result is a negative emotion, the terminal device 100 may also display a pattern such as "cheering up", "encouragement", or "hug".
It should be understood that the above form of representing the emotion is only an example, and other forms may also be adopted to represent the real-time emotion of the user, which is not limited in the embodiment of the present application.
As an alternative embodiment, S304 includes: and obtaining the sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axis. Acquiring heart rate change information of the user in the first time period based on the PPG signal, and obtaining the total sleeping time, the deep sleeping time and the light sleeping time of the user in the first time period according to the heart rate change information. And determining the sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep time, the deep sleep time and the light sleep time.
In this embodiment, the ACC signal has different gravitational acceleration components on the x-axis, y-axis, and z-axis in the moving and non-moving states. The value in the x-axis direction represents horizontal movement of the terminal device 100, the value in the y-axis direction represents vertical movement of the terminal device 100, and the value in the z-axis direction represents the spatially vertical direction of the terminal device 100. In the non-moving state, the modulus of the gravitational acceleration components of the ACC signal on the x-, y-, and z-axes should stabilize at around 9.8. Therefore, during the user's sleep, if the terminal device 100 detects that the variation of this modulus exceeds a first preset threshold and lasts for a duration greater than or equal to a second preset threshold, the terminal device 100 may consider the user's sleep to have been interrupted.
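A minimal sketch of this interruption check is shown below; the numerical thresholds and helper name are illustrative stand-ins for the patent's first and second preset thresholds.

```python
import numpy as np

def count_sleep_interruptions(acc_xyz: np.ndarray, fs: float,
                              delta: float = 1.5, min_secs: float = 3.0) -> int:
    """Count episodes where the ACC magnitude departs from ~9.8 m/s^2 by more
    than `delta` for at least `min_secs` (one count per sustained episode)."""
    mag = np.linalg.norm(acc_xyz, axis=1)   # |a| per sample, acc_xyz: (n, 3)
    moving = np.abs(mag - 9.8) > delta
    interruptions, run = 0, 0
    for m in moving:
        run = run + 1 if m else 0
        if run == int(min_secs * fs):       # threshold reached: count once
            interruptions += 1
    return interruptions
```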
Fig. 5 is a graph illustrating the variation of the sleep heart rate according to an embodiment of the present application. As shown in fig. 5, the heart rate of the user gradually decreases from falling asleep through light sleep to deep sleep. Illustratively, the user begins to fall asleep at time t1 and is in a light sleep state during the period t1-t2, whose duration is denoted T_1. After time t2 the heart rate stabilizes, and the user is in a deep sleep state during the period t2-t3, with duration T_2; illustratively, the heart rate stabilizes at 55-60 beats/minute during deep sleep. After time t3 the heart rate begins to rise slowly, and the user is in a light sleep state during the period t3-t4, with duration T_3, gradually waking until time t4. This gives the user's total sleep duration in the first time period as T_s = T_1 + T_2 + T_3, the deep sleep duration as T_d = T_2, and the light sleep duration as T_q = T_1 + T_3.
The sleep quality evaluation result of the user in the first time period is obtained by combining the total sleep duration, the deep sleep duration, the light sleep duration, and the number of sleep interruptions of the user. The sleep quality evaluation result may be embodied in the form of a score, and the sleep quality evaluation score of the user is denoted score_s. Table 2 shows one possible sleep quality assessment scoring scheme.
Table 2

Score | 90~100 | 80~90 | 60~80 | 40~60 | <40
Total sleep duration | 7~8 hours | ≥9 hours | 5~6 hours | 3~4 hours | 1~2 hours
Proportion of deep sleep | >25% | 15%~25% | 10%~15% | 3%~10% | <3%
Proportion of light sleep | <50% | 50%~65% | 65%~75% | 75%~90% | >90%
Number of sleep interruptions | ≤1 | 2~4 | 4~6 | 6~8 | >8
The following schematically presents logical decision code for deriving the sleep quality evaluation score:
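The listing appears in the published patent only as a drawing; a minimal Python sketch of comparable decision logic, built on the bands of Table 2, might look as follows (the band midpoints, the boundary handling, and the equal weighting of the four metrics are assumptions, not taken from the patent):

```python
def score_total_sleep(hours):
    # Columns of Table 2, best band first; boundaries are assumptions.
    if 7 <= hours <= 8:
        return 95   # 90~100
    if hours >= 9:
        return 85   # 80~90
    if 5 <= hours < 7:
        return 70   # 60~80
    if 3 <= hours < 5:
        return 50   # 40~60
    return 20       # <40

def score_deep_ratio(ratio):
    if ratio > 0.25:
        return 95
    if ratio > 0.15:
        return 85
    if ratio > 0.10:
        return 70
    if ratio > 0.03:
        return 50
    return 20

def score_light_ratio(ratio):
    if ratio < 0.50:
        return 95
    if ratio < 0.65:
        return 85
    if ratio < 0.75:
        return 70
    if ratio < 0.90:
        return 50
    return 20

def score_interruptions(n):
    if n <= 1:
        return 95
    if n <= 4:
        return 85
    if n <= 6:
        return 70
    if n <= 8:
        return 50
    return 20

def sleep_quality_score(total_hours, deep_ratio, light_ratio, interruptions):
    # score_s: average of the four banded sub-scores (equal weights assumed).
    return (score_total_sleep(total_hours) + score_deep_ratio(deep_ratio)
            + score_light_ratio(light_ratio) + score_interruptions(interruptions)) / 4
```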
As an alternative embodiment, S305 includes: obtaining the emotion monitoring result of the user in the first time period according to the emotional state evaluation result and the sleep quality evaluation result; and determining to perform emotion early warning in the case that the emotion monitoring result satisfies a preset condition.
In the embodiment of the present application, the emotion monitoring result of the user in the first time period is determined jointly from the emotional state evaluation result and the sleep quality evaluation result of the user, so that the monitoring is more accurate and comprehensive, which is beneficial to improving the accuracy of emotion recognition.
Illustratively, the terminal device 100 may quantify the emotional state evaluation result and the sleep quality evaluation result to obtain an emotional state evaluation score score_e and a sleep quality evaluation score score_s. The terminal device 100 may perform a weighted summation of score_e and score_s to obtain the emotion score score_out of the user in the first time period, and determine to perform emotion early warning when score_out is less than or equal to a third preset threshold.
The emotion score score_out can be obtained by the following formula:
score_out = α × score_s + (1 − α) × score_e
where α represents a correction factor. Illustratively, α is 0.5.
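As a sketch, with scores on a 0~100 scale and an illustrative value for the third preset threshold (both assumptions), the decision could be expressed as:

```python
def emotion_score(score_s, score_e, alpha=0.5):
    # score_out = alpha * score_s + (1 - alpha) * score_e
    return alpha * score_s + (1 - alpha) * score_e

def should_warn(score_out, third_preset_threshold=70):
    # Trigger emotion early warning at or below the threshold.
    return score_out <= third_preset_threshold
```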
In one possible implementation, the terminal device 100 may display warning information to the user, where the warning information carries the emotion monitoring result of the user in the first time period. For example, the terminal device 100 displays the emotion score to the user together with a corresponding emotion sentence to prompt the user to pay attention to emotional health.
In another possible implementation, in the system architecture 600 of the terminal device shown in fig. 6, the terminal device 100 may exchange information with the terminal device 110; for example, the terminal device 100 may send warning information to the terminal device 110. Here, the terminal device 110 is a terminal device connected to the terminal device 100 through a network; for example, the terminal device 100 and the terminal device 110 are connected through Bluetooth.
Illustratively, the warning information may include a PPG signal, an ACC signal and an SEMG signal of the user, and the terminal device 110 may obtain an emotion monitoring result of the user according to the obtained PPG signal, ACC signal and SEMG signal, in combination with age information and gender information input by the user on the terminal device 110, and display the emotion monitoring result and/or corresponding emotion statements to the user to warn the user.
Illustratively, the warning information may include emotion monitoring results obtained by terminal device 100 according to PPG signals, ACC signals and SEMG signals of the user, and terminal device 110 may display the emotion monitoring results and/or corresponding emotion sentences to the user to warn the user.
Table 3 shows possible display contents of the terminal device 100 or the terminal device 110 under different scores.

Table 3

Emotion score | Display sentence example
90~100 | You are in a good mood today, please keep it up
70~90 | There is always sunshine after the rain
<70 | How can one see a rainbow without experiencing wind and rain
Illustratively, the emotional state evaluation result and the sleep quality evaluation result may be embodied in the form of evaluation grades. For example, the emotional state evaluation result and the sleep quality evaluation result are each classified into five levels: A, B, C, D, and E. When the emotional state evaluation result of the user is A or B and the sleep quality evaluation result is A or B, the terminal device 100 may determine that the emotion monitoring result of the user in the first time period is a high level, indicating that the emotion is good. When the emotional state evaluation result of the user is C and the sleep quality evaluation result is C, the terminal device 100 may determine that the emotion monitoring result of the user in the first time period is a middle level, indicating that the emotion is normal. When the emotional state evaluation result of the user is D or E and the sleep quality evaluation result is D or E, the terminal device 100 may determine that the emotion monitoring result of the user in the first time period is a low level, indicating that the emotion is abnormal or negative.
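Illustratively, this grade-combination logic can be sketched as follows; only the combinations named above are mapped, and how mixed grades such as A with D are treated is left open by the text (returned as None here):

```python
def monitoring_level(emotion_grade, sleep_grade):
    """Combine two A~E grades into a high/middle/low monitoring result."""
    good, bad = {"A", "B"}, {"D", "E"}
    if emotion_grade in good and sleep_grade in good:
        return "high"    # emotion is good
    if emotion_grade == "C" and sleep_grade == "C":
        return "middle"  # emotion is normal
    if emotion_grade in bad and sleep_grade in bad:
        return "low"     # emotion is abnormal or negative
    return None          # combination not specified in the text
```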
Illustratively, when the terminal device 100 obtains that the emotion monitoring result of the user in the first time period is a low level, the terminal device 100 may send warning information to the terminal device 110.
It should be understood that the terminal device 100 may display at least one of the emotional state evaluation result, the sleep quality evaluation result, or the emotion monitoring result to the user, which is not limited in this embodiment of the application.
It should be further understood that the above presentation forms of the emotional state evaluation result, the sleep quality evaluation result, and the emotion monitoring result are only examples; these results may also be displayed to the user in other forms such as a data histogram, a data pie chart, a line chart, or an emoticon, which is not limited in this embodiment of the present application.
Optionally, the terminal device 100 may store the warning information in the cloud, and the terminal device 110 may periodically obtain the warning information from the cloud. Illustratively, the terminal device 100 stores the emotion monitoring result of the user in the cloud every 24 hours, and the terminal device 110 may acquire the warning information from the cloud every 24, 48, or 72 hours and display corresponding warning content to the user according to the emotion monitoring result carried in the warning information.
For example, if the emotion monitoring result shows that the emotion score of the user has been low or relatively low in the past 24, 48, or 72 hours, the terminal device 110 may display a sentence of encouragement to the user through a health-class Application (APP) on the terminal device 110, and may also display the emotional state evaluation result and/or the sleep quality evaluation result of the user through the health-class APP.
Illustratively, if the emotion monitoring result shows that the emotion score of the user has been low or relatively low in the past 24, 48, or 72 hours, the terminal device 110 may further warn the user through a monitoring-class APP, prompting the user in a gentle manner to communicate with others or seek psychological counseling. For example, sentences such as "The weather is nice, why not go out for a walk", "Chat with your friends more often", or "Find someone to talk to when you are in a bad mood" may be displayed to the user through the monitoring-class APP.
In one possible scenario, the terminal device 100 may be a smart watch worn by a minor, and the terminal device 110 may be a parent's mobile phone associated with the minor's smart watch. The parent may obtain at least one of the emotional state evaluation result, the sleep quality evaluation result, or the emotion monitoring result of the minor through the mobile phone, and provide timely communication and psychological counseling according to the emotion monitoring result.
Illustratively, if the emotion monitoring result shows that the emotion score of the user has been low or relatively low in the past 24, 48, or 72 hours, the terminal device 110 may also recommend positive multimedia resources to the user through a video APP, a music APP, or a news APP on the terminal device 110, thereby helping to alleviate the user's negative emotion.
It should be understood that the period (24 hours, 48 hours, or 72 hours) in which the terminal device 110 obtains the warning information from the cloud is merely an example, and the period in which the warning information is obtained is not limited in the embodiment of the present application.
As an alternative embodiment, the method 300 further comprises: and determining the emotional age of the user according to the emotion monitoring result of the user in the first time period, the number of negative emotions and the age information of the user. Displaying at least one of an emotional age, an emotional monitoring result, or an emotional statement to the user.
In the embodiment of the present application, the terminal device 100 may obtain the emotional age of the user, so that the user can understand his or her ability to regulate emotion based on the emotional age, which is beneficial to improving user experience.
Table 4 shows a correspondence among the emotion score, the number of negative emotions, and the emotional age, for reference only.

Table 4

Emotion score | Number of negative emotions x | Emotional age
80~100 | x = 0 | Wisdom period
80~100 | 1 ≤ x ≤ 3 | Steady period
80~100 | x ≥ 3 | Juvenile period
<70 | - | Active period
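Treating Table 4 as a literal lookup, a sketch follows; the overlap of the 1 ≤ x ≤ 3 and x ≥ 3 rows at x = 3, and scores between 70 and 80, are resolved here by assumption:

```python
def emotional_age(score, negative_count):
    # Rows of Table 4, resolved top-down.
    if 80 <= score <= 100:
        if negative_count == 0:
            return "Wisdom period"
        if negative_count <= 3:
            return "Steady period"
        return "Juvenile period"
    if score < 70:
        return "Active period"
    return None  # 70 <= score < 80 is not covered by Table 4
```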
In the case where the physical structure of the terminal device 100 is the smart watch 200, fig. 7 is an interface schematic diagram of the smart watch 200 according to an embodiment of the present application. As shown in fig. 7, the smart watch 200 may display the emotional age, the emotion score, and a corresponding emotion sentence of the user. For example, the user's emotion score is 98, the emotional age is the steady period, and the emotion sentence is "Keep up the good mood!".
As an alternative embodiment, the method 300 further comprises: peaks and troughs of the ACC signal are acquired. And determining whether the user is in a non-motion state according to the wave crest and the wave trough of the ACC signal. S302 comprises: under the condition that the user is determined to be in a non-motion state, counting the classification results of the emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user.
In the embodiment of the present application, since the PPG signal is significantly disturbed by motion, the emotion monitoring needs to be performed in a non-motion state. The terminal device 100 may determine whether the PPG signal and the SEMG signal are stable signals acquired when the user is in a non-moving state, through a peak and a trough of the ACC signal.
Fig. 8 is a waveform diagram of an ACC signal provided in an embodiment of the present application. Illustratively, the duration of the ACC signal examined by the terminal device 100 is 10 s. As shown in fig. 8, the waveform of the ACC signal has 1 peak and 1 trough within that duration, the peak value being W1 and the trough value being W2. If the difference between W1 and W2 is less than or equal to a fourth preset threshold, and the sum of the numbers of peaks and troughs is less than or equal to a fifth preset threshold, the terminal device 100 may determine that the user is in a non-motion state, and the acquired PPG signal and SEMG signal are valid signals that can be used for subsequent emotion monitoring.
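Illustratively, this stillness check might be sketched as follows, using a simple neighbor-comparison peak detector over the modulus of the ACC samples (the window length, thresholds, and function names are assumptions):

```python
import math

def is_non_motion(acc_samples, amp_threshold=0.5, count_threshold=2):
    """Judge a ~10 s window of (x, y, z) ACC samples as non-motion.

    The window passes if the difference between the highest peak value
    (W1) and the lowest trough value (W2) of the modulus is at most
    amp_threshold (the fourth preset threshold), and the total number
    of peaks and troughs is at most count_threshold (the fifth one).
    """
    mods = [math.sqrt(x * x + y * y + z * z) for x, y, z in acc_samples]
    peaks, troughs = [], []
    for i in range(1, len(mods) - 1):
        if mods[i] > mods[i - 1] and mods[i] > mods[i + 1]:
            peaks.append(mods[i])
        elif mods[i] < mods[i - 1] and mods[i] < mods[i + 1]:
            troughs.append(mods[i])
    if not peaks or not troughs:
        return True  # essentially flat signal: treat as non-motion
    amplitude = max(peaks) - min(troughs)
    return amplitude <= amp_threshold and len(peaks) + len(troughs) <= count_threshold
```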
Fig. 9 is a schematic flow chart of another emotion monitoring method 900 provided in an embodiment of the present application. Method 900 may be performed by terminal device 100, method 900 comprising the steps of:
and S901, acquiring a PPG signal, an SEMG signal, an ACC signal, age information of the user and gender information of the user in real time.
In this step, the PPG signal may be obtained by a pulse wave sensor, the SEMG signal may be obtained by a myoelectric electrode sensor, and the ACC signal may be obtained by an acceleration sensor.
And S902, judging whether the user is in a non-motion state according to the ACC signal. In the case where the user is in a non-moving state, S903 is performed. If the user is in a motion state, the collected PPG signal and the collected SEMG signal are invalid signals and need to be collected again.
And S903, classifying the emotion of the user according to the PPG signal, the SEMG signal, the age information of the user and the gender information of the user to obtain an emotion classification result.
In this step, the terminal device 100 may periodically perform emotion recognition on the collected PPG signal, SEMG signal, age information of the user, and gender information of the user to obtain emotion classification results at a plurality of different times. For example, the terminal device 100 may perform feature extraction on the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user, and perform feature fusion on the extracted biological features, where specific feature extraction and feature fusion processes are described above and are not described herein again.
It should be understood that the emotion classification result in this step includes emotion classification results at a plurality of different time points.
And S904, obtaining the emotion state evaluation score of the user according to the counted negative emotion occurrence times in the emotion classification result and the duration of each negative emotion.
In this step, assuming that the emotion classification results include one result for each hour within 24 hours, the emotional state evaluation score of the user over the 24 hours can be obtained from the counted number of negative emotions occurring in the 24 hours and the duration of each occurrence.
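The patent does not give a formula for this score; purely as an illustrative stand-in, one could start from 100 and penalize both the count and the accumulated duration of negative emotions (the penalty weights below are assumptions):

```python
def emotional_state_score(negative_count, negative_durations_min,
                          count_penalty=5.0, minute_penalty=0.2):
    # score_e in 0~100; negative_durations_min holds one duration
    # (in minutes) per negative-emotion episode.
    total_minutes = sum(negative_durations_min)
    score = 100.0 - count_penalty * negative_count - minute_penalty * total_minutes
    return max(0.0, min(100.0, score))
```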
And S905, monitoring the sleep quality of the user according to the PPG signal, the ACC signal, the age information of the user and the gender information of the user to obtain the sleep quality evaluation score of the user.
In this step, the terminal device 100 may obtain the number of sleep interruptions of the user according to the ACC signal, and obtain the time of falling asleep, the time of waking, the deep sleep duration, and the light sleep duration according to the heart rate variation derived from the PPG signal. The terminal device 100 may then evaluate the sleep quality of the user by combining the number of sleep interruptions, the time of falling asleep, the time of waking, the deep sleep duration, the light sleep duration, and the age and gender of the user, to obtain the sleep quality evaluation score of the user. A motion state lasting more than 5 s may be regarded as a sleep interruption.
S906, weighting and summing the emotional state evaluation score and the sleep quality evaluation score of the user to obtain the emotional score of the user.
And S907, sending early warning information to the cloud under the condition that the emotion score of the user is less than or equal to the emotion threshold, wherein the early warning information carries at least one of the emotion state evaluation score, the sleep quality evaluation score or the emotion score of the user.
In this step, when monitoring that the emotion score of the user is low, the terminal device 100 may send the warning information to the cloud. The terminal device 110 associated with the terminal device 100 may then obtain the warning information from the cloud and determine whether to warn the user on the terminal device 110, so as to prompt the user to pay attention to emotional health by, for example, pushing positive content through an APP or displaying sentences of encouragement.
And S908, obtaining the emotional age of the user according to the emotional score and the number of times of negative emotion appearance of the user.
S909, at least one of the emotional age, the emotional score, or the emotional sentence is displayed to the user.
Optionally, terminal device 100 may further transmit the emotional age to terminal device 110 to display at least one of the emotional age, the emotional score, or the emotional sentence by terminal device 110.
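Putting S901 to S908 together, a high-level orchestration sketch follows; it reuses the illustrative helpers sketched earlier (is_non_motion, emotional_age), and the classifier and the two scoring functions are passed in as stand-ins, since the patent does not specify them in code:

```python
def monitor_emotion(ppg, semg, acc, age, gender,
                    classify, score_state, score_sleep,
                    alpha=0.5, warn_threshold=70):
    """One monitoring pass over the signals of a first time period."""
    if not is_non_motion(acc):                            # S902
        return None  # signals invalid; re-acquire
    emotions = classify(ppg, semg, age, gender)           # S903
    score_e = score_state(emotions)                       # S904
    score_s = score_sleep(ppg, acc, age, gender)          # S905
    score_out = alpha * score_s + (1 - alpha) * score_e   # S906
    negatives = sum(1 for e in emotions if e == "negative")
    return {
        "score": score_out,
        "warn": score_out <= warn_threshold,              # S907
        "emotional_age": emotional_age(score_out, negatives),  # S908
    }
```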
In this embodiment of the application, when it is determined from the ACC signal that the collected PPG signal and SEMG signal are stable, the terminal device 100 may obtain biological characteristics of the user under different emotions, such as heart rate, blood pressure, blood oxygen, blood sugar, and muscle contraction, from the PPG signal and the SEMG signal, and obtain the emotional state evaluation result by combining the age and gender of the user. Since the sleep condition of the user may differ under different emotions, the terminal device 100 may also monitor the sleep quality of the user according to the PPG signal, the ACC signal, and the age and gender of the user to obtain the sleep quality evaluation result, and further determine the emotion score of the user according to the emotional state evaluation result and the sleep quality evaluation result. The obtained emotion score thus more accurately reflects the user's emotion over a period of time, which is beneficial to improving the accuracy of emotion recognition.
It should be understood that the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The emotion monitoring method according to the embodiment of the present application is described in detail above with reference to fig. 1 to 9, and the emotion monitoring device according to the embodiment of the present application will be described in detail below with reference to fig. 10 and 11.
Fig. 10 shows a schematic block diagram of an emotion monitoring apparatus 1000 provided in an embodiment of the present application, where the apparatus 1000 includes an acquisition module 1010 and a processing module 1020.
The obtaining module 1010 is configured to: acquire a photoplethysmography (PPG) signal of the user in real time through a pulse wave sensor, acquire an epidermal myoelectricity (SEMG) signal of the user in real time through a myoelectric electrode sensor, and acquire an ACC signal of the user in real time through an acceleration sensor. The processing module 1020 is configured to: count the classification results of the emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user, where the classification results include a positive emotion, a calm emotion, or a negative emotion; determine an emotional state evaluation result of the user in the first time period based on the classification results of the emotion of the user at the plurality of moments in the first time period; determine a sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, the age information of the user, and the gender information of the user; and determine whether to perform emotion early warning according to the emotional state evaluation result and the sleep quality evaluation result.
Optionally, the processing module 1020 is configured to: and performing feature extraction on the PPG signal, the SEMG signal, the age information of the user and the gender information of the user at a first moment in the plurality of moments to obtain a plurality of groups of biological features of the user at the first moment. Performing feature fusion on the multiple groups of biological features at the first moment to obtain fusion features of the user at the first moment; and obtaining a classification result of the emotion of the user at the first moment based on the fusion characteristics of the first moment.
Optionally, the processing module 1020 is configured to: calculating the weight of each group of biological characteristics in the multiple groups of biological characteristics through the flexible maximum value transfer function (that is, a softmax function); and obtaining the fusion characteristics of the user at the first moment according to each group of biological characteristics and the weight of each group of biological characteristics.
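A sketch of such softmax-weighted fusion follows, assuming each group of biological features is first reduced to one scalar relevance value before weighting (that reduction, and all shapes, are assumptions):

```python
import math

def softmax(values):
    # Numerically stable softmax over a list of scalars.
    m = max(values)
    exps = [math.exp(v - m) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_features(feature_groups, relevance_scores):
    """Weighted sum of equal-length feature vectors using softmax weights.

    feature_groups: one feature vector per biological signal group;
    relevance_scores: one scalar per group, fed to the softmax.
    """
    weights = softmax(relevance_scores)
    fused = [0.0] * len(feature_groups[0])
    for w, group in zip(weights, feature_groups):
        for i, value in enumerate(group):
            fused[i] += w * value
    return fused
```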
Optionally, the plurality of sets of biometric features includes at least one of: heart rate characteristics, respiration characteristics, blood oxygen characteristics, blood glucose characteristics, blood pressure characteristics, or epidermal myoelectric characteristics.
Optionally, the obtaining module 1010 is configured to: and acquiring the number of negative emotions of the user in a first time period and the duration of each negative emotion from the classification results of the emotions at a plurality of times. The processing module 1020 is configured to: and determining the emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion.
Optionally, the processing module 1020 is configured to: and obtaining the sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axis. The obtaining module 1010 is configured to: based on the PPG signal, heart rate variation information of the user in a first time period is obtained. The processing module 1020 is configured to: obtaining the total sleeping time, the deep sleeping time and the light sleeping time of the user in a first time period according to the heart rate change information; and determining a sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep time, the deep sleep time and the light sleep time.
Optionally, the processing module 1020 is configured to: obtaining an emotion monitoring result of the user in the first time period according to the emotion state evaluation result and the sleep quality evaluation result; and determining to carry out emotion early warning under the condition that the emotion monitoring result meets the preset condition.
Optionally, the processing module 1020 is configured to: carrying out weighted summation on the numerical value after the emotional state evaluation result is quantized and the numerical value after the sleep quality evaluation result is quantized to obtain the emotional score of the user in the first time period; and determining to perform emotion early warning under the condition that the emotion score is smaller than or equal to a preset threshold value.
Optionally, the processing module 1020 is configured to: determining the emotional age of the user according to the emotion monitoring result of the user in the first time period, the times of negative emotions and the age information of the user; and displaying at least one of an emotional age, an emotional monitoring result, or an emotional statement.
Optionally, the obtaining module 1010 is configured to: peaks and troughs of the ACC signal are acquired. The processing module 1020 is configured to: determining whether the user is in a non-motion state according to the wave crest and the wave trough of the ACC signal; and under the condition that the user is in a non-motion state, counting the classification results of the emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user.
In an alternative example, as will be understood by those skilled in the art, the apparatus 1000 may be embodied as the terminal device in the above-described embodiment, or the functions of the terminal device in the above-described embodiment may be integrated into the apparatus 1000. The above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above. The apparatus 1000 may be configured to perform various processes and/or steps corresponding to the terminal device in the foregoing method embodiments.
It should be appreciated that the apparatus 1000 herein is embodied in the form of functional modules. The term module herein may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an embodiment of the present application, the apparatus 1000 in fig. 10 may also be a chip or a chip system, for example: system on chip (SoC).
Fig. 11 shows a schematic block diagram of another emotion monitoring apparatus 1100 provided in an embodiment of the present application. The apparatus 1100 includes a processor 1110, a transceiver 1120, and a memory 1130. The processor 1110, the transceiver 1120 and the memory 1130 are in communication with each other through an internal connection path, the memory 1130 is used for storing instructions, and the processor 1110 is used for executing the instructions stored in the memory 1130 to control the transceiver 1120 to transmit and/or receive signals.
It should be understood that the apparatus 1100 may be embodied as the terminal device in the foregoing embodiment, or the functions of the terminal device in the foregoing embodiment may be integrated in the apparatus 1100, and the apparatus 1100 may be configured to perform each step and/or flow corresponding to the terminal device in the foregoing method embodiment. Alternatively, the memory 1130 may include both read-only memory and random access memory, and provides instructions and data to the processor. The portion of memory may also include non-volatile random access memory. For example, the memory may also store device type information. The processor 1110 may be configured to execute the instructions stored in the memory, and when the processor executes the instructions, the processor may perform the steps and/or processes corresponding to the electronic device in the above method embodiments.
It should be understood that, in the embodiment of the present application, the processor 1110 may be a Central Processing Unit (CPU), and the processor may also be other general processors, Digital Signal Processors (DSP), Application Specific Integrated Circuits (ASIC), Field Programmable Gate Arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and so on. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in a memory, and a processor executes instructions in the memory, in combination with hardware thereof, to perform the steps of the above-described method. To avoid repetition, it is not described in detail here.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the module described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific implementation of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the embodiments of the present application, and all the changes or substitutions should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (23)

1. A mood monitoring method is characterized by being applied to terminal equipment with a pulse wave sensor, a myoelectric electrode sensor and an acceleration sensor, and comprises the following steps:
acquiring a photoplethysmography (PPG) signal of a user in real time through the pulse wave sensor, acquiring an epidermal myoelectricity (SEMG) signal of the user in real time through the myoelectricity electrode sensor, and acquiring an ACC signal of the user in real time through the acceleration sensor;
counting classification results of emotions of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user, wherein the classification results comprise positive emotions, calm emotions or negative emotions;
determining an emotional state evaluation result of the user at the first time period based on the classification result of the user's emotion at a plurality of times within the first time period;
determining a sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, the age information of the user and the gender information of the user;
and determining whether to carry out emotion early warning or not according to the emotion state evaluation result and the sleep quality evaluation result.
2. The method of claim 1, wherein the combining the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user to count the classification of the mood of the user at a plurality of moments in time over a first time period comprises:
performing feature extraction on a PPG signal, the SEMG signal, the age information of the user and the gender information of the user at a first moment in the multiple moments to obtain multiple groups of biological features of the user at the first moment;
performing feature fusion on the multiple groups of biological features at the first moment to obtain fusion features of the user at the first moment;
and obtaining a classification result of the emotion of the user at the first moment based on the fusion characteristics of the first moment.
3. The method according to claim 2, wherein the feature fusing the plurality of sets of biometric features at the first time to obtain a fused feature of the user at the first time comprises:
calculating the weight of each group of biological characteristics in the plurality of groups of biological characteristics through a flexible maximum value transfer function;
and obtaining the fusion characteristics of the user at the first moment according to the biological characteristics of each group and the weight of the biological characteristics of each group.
4. The method of claim 2 or 3, wherein the plurality of sets of biometrics comprises at least one of:
heart rate characteristics, respiration characteristics, blood oxygen characteristics, blood glucose characteristics, blood pressure characteristics, or epidermal myoelectric characteristics.
5. The method according to any one of claims 1-4, wherein determining the emotional state assessment result of the user at the first time period based on the classification result of the user's emotion at a plurality of times within the first time period comprises:
acquiring the number of negative emotions of the user in the first time period and the duration of each negative emotion from the classification results of the emotions at the plurality of times;
and determining the emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion.
6. The method according to any one of claims 1-5, wherein said determining a sleep quality assessment result of the user for the first time period in combination with the PPG signal, the ACC signal, the age information of the user, and the gender information of the user comprises:
obtaining the sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axis;
acquiring heart rate variation information of the user in the first time period based on the PPG signal;
obtaining the total sleeping time, the deep sleeping time and the light sleeping time of the user in the first time period according to the heart rate change information;
and determining the sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep time length, the deep sleep time length and the light sleep time length.
7. The method according to any one of claims 1-6, wherein the determining whether to perform an emotional alert based on the emotional state assessment result and the sleep quality assessment result comprises:
obtaining an emotion monitoring result of the user in the first time period according to the emotion state evaluation result and the sleep quality evaluation result;
and determining to carry out emotion early warning under the condition that the emotion monitoring result meets a preset condition.
8. The method of claim 7, wherein obtaining the emotion monitoring result of the user in the first time period according to the emotion state evaluation result and the sleep quality evaluation result comprises:
carrying out weighted summation on the numerical value after the emotional state evaluation result is quantized and the numerical value after the sleep quality evaluation result is quantized to obtain the emotional score of the user in the first time period;
and determining to perform emotion early warning under the condition that the emotion monitoring result meets a preset condition, wherein the emotion early warning comprises the following steps:
and determining to carry out emotion early warning under the condition that the emotion score is smaller than or equal to a preset threshold value.
9. The method according to claim 7 or 8, characterized in that the method further comprises:
determining the emotional age of the user according to the emotion monitoring result of the user in the first time period, the number of negative emotions and the age information of the user;
displaying at least one of the emotional age, the emotional monitoring result, or an emotional statement.
10. The method according to any one of claims 1-9, further comprising:
acquiring peaks and troughs of the ACC signal;
determining whether the user is in a non-motion state according to the peaks and the troughs of the ACC signals;
the combining the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user, and counting the classification result of the emotion of the user at a plurality of times within a first time period includes:
and under the condition that the user is in a non-motion state, counting the classification results of the emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user.
11. An emotion monitoring device, comprising:
the acquisition module is used for acquiring a photoplethysmography (PPG) signal of a user in real time through the pulse wave sensor, acquiring an epidermal myoelectricity (SEMG) signal of the user in real time through the myoelectricity electrode sensor, and acquiring an ACC signal of the user in real time through the acceleration sensor;
the processing module is used for combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user, and counting classification results of emotions of the user at a plurality of moments in a first time period, wherein the classification results comprise positive emotions, calm emotions or negative emotions;
the processing module is further configured to: determining an emotional state evaluation result of the user at the first time period based on the classification result of the user's emotion at a plurality of times within the first time period;
the processing module is further configured to: determining a sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, the age information of the user and the gender information of the user;
the processing module is further configured to: and determining whether to carry out emotion early warning or not according to the emotion state evaluation result and the sleep quality evaluation result.
12. The apparatus of claim 11, wherein the processing module is configured to:
performing feature extraction on a PPG signal, the SEMG signal, the age information of the user and the gender information of the user at a first moment in the multiple moments to obtain multiple groups of biological features of the user at the first moment;
performing feature fusion on the multiple groups of biological features at the first moment to obtain fusion features of the user at the first moment;
and obtaining a classification result of the emotion of the user at the first moment based on the fusion characteristics of the first moment.
13. The apparatus of claim 12, wherein the processing module is configured to:
calculating the weight of each group of biological characteristics in the plurality of groups of biological characteristics through a flexible maximum value transfer function;
and obtaining the fusion characteristics of the user at the first moment according to the biological characteristics of each group and the weight of the biological characteristics of each group.
14. The apparatus of claim 12 or 13, wherein the plurality of sets of biometric features comprises at least one of:
heart rate characteristics, respiration characteristics, blood oxygen characteristics, blood glucose characteristics, blood pressure characteristics, or epidermal myoelectric characteristics.
15. The apparatus of any one of claims 11-14, wherein the obtaining module is configured to:
acquiring the number of negative emotions of the user in the first time period and the duration of each negative emotion from the classification results of the emotions at the plurality of times;
the processing module is used for: and determining the emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion.
16. The apparatus of any one of claims 11-15, wherein the processing module is configured to:
obtaining the sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axis;
the acquisition module is configured to: acquiring heart rate variation information of the user in the first time period based on the PPG signal;
the processing module is further configured to: obtaining the total sleeping time, the deep sleeping time and the light sleeping time of the user in the first time period according to the heart rate change information;
the processing module is further configured to: and determining a sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep time, the deep sleep time and the light sleep time.
17. The apparatus of any one of claims 11-16, wherein the processing module is configured to:
obtaining an emotion monitoring result of the user in the first time period according to the emotion state evaluation result and the sleep quality evaluation result;
and determining to carry out emotion early warning under the condition that the emotion monitoring result meets a preset condition.
18. The apparatus of claim 17, wherein the processing module is configured to:
carrying out weighted summation on the numerical value after the emotional state evaluation result is quantized and the numerical value after the sleep quality evaluation result is quantized to obtain the emotional score of the user in the first time period;
and determining to carry out emotion early warning under the condition that the emotion score is smaller than or equal to a preset threshold value.
19. The apparatus of claim 17 or 18, wherein the processing module is configured to:
determining the emotional age of the user according to the emotion monitoring result of the user in the first time period, the number of negative emotions and the age information of the user;
displaying at least one of the emotional age, the emotional monitoring result, or an emotional statement.
20. The apparatus according to any of claims 11-19, wherein the obtaining module is configured to:
acquiring peaks and troughs of the ACC signals;
the processing module is used for: determining whether the user is in a non-motion state according to the peaks and the troughs of the ACC signals;
the processing module is further configured to: and under the condition that the user is in a non-motion state, counting the classification results of the emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user.
21. An emotion monitoring device, comprising: a processor coupled with a memory for storing a computer program that, when invoked by the processor, causes the apparatus to perform the method of any of claims 1-10.
22. A computer-readable storage medium for storing a computer program comprising instructions for implementing the method of any one of claims 1-10.
23. A computer program product, characterized in that computer program code is included in the computer program product, which, when run on a computer, causes the computer to carry out the method according to any one of claims 1-10.
CN202111510699.2A 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device Active CN115054248B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111510699.2A CN115054248B (en) 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device
PCT/CN2022/119258 WO2023103512A1 (en) 2021-12-10 2022-09-16 Emotion monitoring method and emotion monitoring apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111510699.2A CN115054248B (en) 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device

Publications (2)

Publication Number Publication Date
CN115054248A true CN115054248A (en) 2022-09-16
CN115054248B CN115054248B (en) 2023-10-20

Family

ID=83196907

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111510699.2A Active CN115054248B (en) 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device

Country Status (2)

Country Link
CN (1) CN115054248B (en)
WO (1) WO2023103512A1 (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016144284A1 (en) * 2015-03-06 2016-09-15 Елизавета Сергеевна ВОРОНКОВА Method for recognising the movement and psycho-emotional state of a person and device for carrying out said method
US20160302711A1 (en) * 2015-01-29 2016-10-20 Affectomatics Ltd. Notifying a user about a cause of emotional imbalance
CN106419841A (en) * 2016-09-13 2017-02-22 深圳市迈迪加科技发展有限公司 Method, device and system for evaluating sleep
US9596997B1 (en) * 2015-09-14 2017-03-21 Whoop, Inc. Probability-based usage of multiple estimators of a physiological signal
US20170238858A1 (en) * 2015-07-30 2017-08-24 South China University Of Technology Depression assessment system and depression assessment method based on physiological information
WO2017193497A1 (en) * 2016-05-09 2017-11-16 包磊 Fusion model-based intellectualized health management server and system, and control method therefor
CN107874750A (en) * 2017-11-28 2018-04-06 华南理工大学 Pulse frequency variability and the psychological pressure monitoring method and device of sleep quality fusion
JP2018082730A (en) * 2016-11-15 2018-05-31 都築 北村 Biological risk acquisition device, biological risk acquisition method, biological risk acquisition program, and recording medium
JP2018166653A (en) * 2017-03-29 2018-11-01 アイシン精機株式会社 Mood determination device
CN109460752A (en) * 2019-01-10 2019-03-12 广东乐心医疗电子股份有限公司 Emotion analysis method and device, electronic equipment and storage medium
CN110876613A (en) * 2019-09-27 2020-03-13 深圳先进技术研究院 Human motion state identification method and system and electronic equipment
CN112880686A (en) * 2021-01-20 2021-06-01 湖南赫兹信息技术有限公司 Object motion monitoring and positioning method, device and storage medium
US20210161482A1 (en) * 2017-07-28 2021-06-03 Sony Corporation Information processing device, information processing method, and computer program
US20210219891A1 (en) * 2018-11-02 2021-07-22 Boe Technology Group Co., Ltd. Emotion Intervention Method, Device and System, and Computer-Readable Storage Medium and Healing Room
WO2021208902A1 (en) * 2020-04-15 2021-10-21 华为技术有限公司 Sleep report generation method and apparatus, terminal, and storage medium
WO2021213263A1 (en) * 2020-04-21 2021-10-28 深圳市万普拉斯科技有限公司 Call window control method and apparatus, mobile terminal, and readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105231997A (en) * 2015-10-10 2016-01-13 沈阳熙康阿尔卑斯科技有限公司 Sleep quality judging method and sleep instrument


Also Published As

Publication number Publication date
CN115054248B (en) 2023-10-20
WO2023103512A1 (en) 2023-06-15

Similar Documents

Publication Publication Date Title
Naeini et al. A real-time PPG quality assessment approach for healthcare Internet-of-Things
JP6679051B2 (en) Biological information analysis device, system, and program
WO2020119245A1 (en) Wearable bracelet-based emotion recognition system and method
RU2602797C2 (en) Method and device for measuring stress
Preejith et al. Design, development and clinical validation of a wrist-based optical heart rate monitor
US10524676B2 (en) Apparatus and method for determining a health parameter of a subject
KR20170109554A (en) A method and apparatus for deriving a mental state of a subject
Ayesha et al. Heart rate monitoring using PPG with smartphone camera
US10932715B2 (en) Determining resting heart rate using wearable device
Matkovič et al. Wi-mind: Wireless mental effort inference
Ngoc-Thang et al. A dynamic reconfigurable wearable device to acquire high quality PPG signal and robust heart rate estimate based on deep learning algorithm for smart healthcare system
CN115054248B (en) Emotion monitoring method and emotion monitoring device
CN115120236A (en) Emotion recognition method and device, wearable device and storage medium
Chen et al. A wearable physiological detection system to monitor blink from faint motion artifacts by machine learning method
WO2023286313A1 (en) Signal processing device and method
US20240090827A1 (en) Methods and Systems for Improving Measurement of Sleep Data by Classifying Users Based on Sleeper Type
Ederli et al. Sleep-Wake Classification using Recurrence Plots from Smartwatch Accelerometer Data
Sayyaf et al. Heart Rate Evaluation by Smartphone: An Overview
KR20230046586A (en) Electronic device and method for controlling the electronic device
Gu Research on Optimized Algorithm for Heart Rate Value Deviation of Elderly Bracelet
CN116570249A (en) Wearable device wearing state detection method and device and wearable device
CN116584943A (en) Wearable device-based depression state monitoring method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant