CN115054248B - Emotion monitoring method and emotion monitoring device

Info

Publication number: CN115054248B
Application number: CN202111510699.2A
Authority: CN (China)
Prior art keywords: user, emotion, signal, time period, evaluation result
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN115054248A
Inventors: 邸皓轩, 李丹洪, 张晓武
Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority: CN202111510699.2A; PCT/CN2022/119258 (WO2023103512A1)
Publications: CN115054248A; CN115054248B (application granted)

Classifications

    • A: Human necessities
    • A61: Medical or veterinary science; Hygiene
    • A61B: Diagnosis; Surgery; Identification
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
Abstract

The application provides an emotion monitoring method and an emotion monitoring device that help improve the accuracy of emotion recognition. The method comprises the following steps: acquiring a photoplethysmography (PPG) signal, a surface electromyography (SEMG) signal, and an acceleration (ACC) signal of a user in real time; counting classification results of the user's emotion at a plurality of moments within a first time period by combining the PPG signal, the SEMG signal, and the user's age and gender information; determining an emotional state evaluation result of the user in the first time period based on the classification results at the plurality of moments; determining a sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, and the user's age and gender information; and determining whether to perform emotion early warning according to the emotional state evaluation result and the sleep quality evaluation result.

Description

Emotion monitoring method and emotion monitoring device
Technical Field
The present application relates to the field of terminal devices, and more particularly, to an emotion monitoring method and an emotion monitoring device.
Background
In modern society, people pay increasing attention to physical and mental health. Long-term accumulation of negative moods such as anxiety, tension, anger, depression, sadness, and pain can lead to psychological disorders such as mania, depression, autism, and anxiety disorder.
Research in psychology and physiology shows that physiological responses are closely related to a person's emotional state. To help users attend to their physiological and psychological health, wearable devices, typified by smart bands, are therefore becoming popular; they can monitor the user's emotion, exercise, sleep, health indices, and the like.
At present, emotion monitoring is based mainly on physiological signal measurement. In one possible implementation, because different emotions have different time-domain and frequency-domain characteristics, a photoplethysmography (PPG) signal of the user can be collected, a pulse rate variability (PRV) sequence extracted from the PPG signal, time-domain and frequency-domain analysis performed on the extracted PRV sequence to obtain time-domain and frequency-domain features, and these features used as inputs to a neural network for emotion recognition.
However, the above method cannot fully capture the physiological characteristics of different emotions, so the accuracy of emotion recognition is not high.
Disclosure of Invention
The application provides an emotion monitoring method and an emotion monitoring device, which are beneficial to improving the accuracy of emotion recognition.
In a first aspect, an emotion monitoring method is provided, applied to a terminal device provided with a pulse wave sensor, a myoelectric electrode sensor, and an acceleration sensor. The method comprises the following steps: acquiring a photoplethysmography (PPG) signal of the user in real time through the pulse wave sensor, acquiring a surface electromyography (SEMG) signal of the user in real time through the myoelectric electrode sensor, and acquiring an acceleration (ACC) signal of the user in real time through the acceleration sensor; counting classification results of the user's emotion at a plurality of moments within a first time period by combining the PPG signal, the SEMG signal, the user's age information, and the user's gender information, where a classification result is a positive emotion, a calm emotion, or a negative emotion; determining an emotional state evaluation result of the user in the first time period based on the classification results at the plurality of moments; determining a sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, the user's age information, and the user's gender information; and determining whether to perform emotion early warning according to the emotional state evaluation result and the sleep quality evaluation result.
In the application, the terminal device can combine the PPG signal, the SEMG signal, the ACC signal, and the user's age and gender information to obtain the user's emotional state evaluation result in the first time period, combine the PPG signal, the ACC signal, and the user's age and gender information to determine the user's sleep quality evaluation result in the first time period, and comprehensively assess the user's emotion from both results. This helps improve the accuracy of emotion recognition and provides better mental health services for the user.
With reference to the first aspect, in certain implementations of the first aspect, counting the classification results of the user's emotion at a plurality of moments within the first time period by combining the PPG signal, the SEMG signal, the user's age information, and the user's gender information includes: performing feature extraction on the PPG signal, the SEMG signal, the user's age information, and the user's gender information at a first moment among the plurality of moments to obtain multiple groups of biological features of the user at the first moment; performing feature fusion on the multiple groups of biological features at the first moment to obtain a fusion feature of the user at the first moment; and obtaining the classification result of the user's emotion at the first moment based on the fusion feature at the first moment.
In the application, the terminal device can obtain multiple groups of biological features of the user at the first moment through feature extraction; the fusion feature has strong interpretability and characterization ability, and performing emotion recognition on the fused feature helps improve the accuracy of emotion recognition.
With reference to the first aspect, in some implementations of the first aspect, performing feature fusion on the multiple groups of biological features at the first moment to obtain the fusion feature of the user at the first moment includes: calculating the weight of each of the multiple groups of biological features through a softmax function; and obtaining the fusion feature of the user at the first moment from each group of biological features and its weight.
In the application, different physiological features receive different weight values, so the relative importance of different physiological signals can be captured, making the emotion recognition result more accurate and reliable.
With reference to the first aspect, in certain implementations of the first aspect, the multiple groups of biological features include at least one of: heart rate features, respiration features, blood oxygen features, blood glucose features, blood pressure features, or surface electromyography features.
With reference to the first aspect, in certain implementations of the first aspect, determining the emotional state evaluation result of the user in the first time period based on the classification results of the user's emotion at the plurality of moments includes: acquiring, from the classification results at the plurality of moments, the number of negative emotions of the user in the first time period and the duration of each negative emotion; and determining the emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion.
In the application, the terminal device can acquire the number of the user's negative emotions in the first time period and the duration of each; the longer the accumulated negative-emotion time, the lower the emotional state evaluation score, so the user's emotion in the first time period can be evaluated accurately.
With reference to the first aspect, in certain implementations of the first aspect, determining the sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, the user's age information, and the user's gender information includes: obtaining the number of sleep interruptions of the user in the first time period based on the components of the ACC signal on the coordinate axes; acquiring heart rate variation information of the user in the first time period based on the PPG signal; obtaining the user's total sleep duration, deep sleep duration, and light sleep duration in the first time period from the heart rate variation information; and determining the sleep evaluation result of the user in the first time period by combining the number of sleep interruptions, the total sleep duration, the deep sleep duration, and the light sleep duration.
In the application, because the user's sleep may differ under different emotions, the terminal device obtains the user's sleep evaluation result by collecting heart rate variation information during the sleep period, and using the sleep evaluation result improves the accuracy of emotion recognition.
With reference to the first aspect, in certain implementations of the first aspect, determining whether to perform emotion early warning according to the emotional state evaluation result and the sleep quality evaluation result includes: obtaining the user's emotion monitoring result in the first time period according to the emotional state evaluation result and the sleep quality evaluation result; and determining to perform emotion early warning when the emotion monitoring result meets a preset condition.
In the application, if the terminal device detects that the user's emotion monitoring result in the first time period meets the preset condition, it can issue an emotion early warning to the user, so the user's mental health receives timely attention and the user experience is improved.
With reference to the first aspect, in some implementations of the first aspect, obtaining the user's emotion monitoring result in the first time period according to the emotional state evaluation result and the sleep quality evaluation result includes: performing a weighted summation of the quantified emotional state evaluation result and the quantified sleep quality evaluation result to obtain the user's emotion score in the first time period. Determining to perform emotion early warning when the emotion monitoring result meets the preset condition includes: determining to perform emotion early warning when the emotion score is less than or equal to a preset threshold.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: determining the user's emotion age according to the user's emotion monitoring result in the first time period, the number of negative emotions, and the user's age information; and displaying at least one of the emotion age, the emotion monitoring result, or an emotion sentence.
In the application, the terminal device can display at least one of the emotion age, the emotion monitoring result, or an emotion sentence to the user, helping the user adjust his or her emotion.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: obtaining the peaks and troughs of the ACC signal, and determining whether the user is in a non-motion state according to the peaks and troughs of the ACC signal. Counting the classification results of the user's emotion at a plurality of moments within the first time period by combining the PPG signal, the SEMG signal, the user's age information, and the user's gender information includes: counting those classification results when the user is in the non-motion state.
In the application, because the PPG signal is strongly disturbed by movement, emotion recognition needs to be performed in a non-motion state. The terminal device can therefore judge whether the user is in the non-motion state through the ACC signal; if the user is in a motion state, the collected PPG and SEMG signals are invalid and need to be collected again. This helps obtain valid PPG and SEMG signals and improves the accuracy of emotion recognition.
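As a rough illustration of this gating step, the following sketch judges the non-motion state from the peak-to-trough swing of the ACC modulus. The function name, the swing criterion, and the threshold value are assumptions for illustration, not the patent's method.

```python
import numpy as np
from scipy.signal import find_peaks

def is_non_motion(acc_modulus: np.ndarray, max_swing: float = 1.0) -> bool:
    """acc_modulus: samples of |ACC|; motion shows up as large peak-trough swings."""
    peaks, _ = find_peaks(acc_modulus)
    troughs, _ = find_peaks(-acc_modulus)
    if len(peaks) == 0 or len(troughs) == 0:
        return True  # no oscillation detected -> treat as stationary
    swing = acc_modulus[peaks].max() - acc_modulus[troughs].min()
    return swing < max_swing  # small swings -> non-motion state
```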
In a second aspect, an emotion monitoring device is provided, configured to perform the method in any of the possible implementations of the first aspect. In particular, the device comprises units for performing the method in any one of the possible implementations of the first aspect.
In a third aspect, there is provided another emotion monitoring device comprising a processor coupled to a memory operable to execute instructions in the memory to implement a method as in any one of the possible implementations of the first aspect. Optionally, the apparatus further comprises a memory. Optionally, the apparatus further comprises a communication interface, the processor being coupled to the communication interface.
In one implementation, the emotion monitoring device is a terminal device. When the emotion monitoring device is a terminal device, the communication interface may be a transceiver, or an input/output interface.
In another implementation, the emotion monitoring device is a chip configured in the terminal device. When the emotion monitoring device is a chip configured in a terminal device, the communication interface may be an input/output interface.
In a fourth aspect, there is provided a processor comprising: input circuit, output circuit and processing circuit. The processing circuitry is configured to receive signals via the input circuitry and to transmit signals via the output circuitry such that the processor performs the method of any one of the possible implementations of the first aspect described above.
In a specific implementation process, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be transistors, gate circuits, flip-flops, various logic circuits, and the like. The input signal received by the input circuit may, for example and without limitation, be received and input by a receiver; the signal output by the output circuit may, for example and without limitation, be output to and transmitted by a transmitter. The input circuit and the output circuit may be the same circuit, used as the input circuit and the output circuit at different times. The application does not limit the specific implementation of the processor and the various circuits.
In a fifth aspect, a processing device is provided that includes a processor and a memory. The processor is configured to read instructions stored in the memory and to receive signals via the receiver and to transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, the processor is one or more and the memory is one or more.
Alternatively, the memory may be integrated with the processor or the memory may be separate from the processor.
In a specific implementation process, the memory may be a non-transient (non-transitory) memory, for example, a Read Only Memory (ROM), which may be integrated on the same chip as the processor, or may be separately disposed on different chips.
It should be appreciated that in the related data interaction process, for example, transmitting indication information may be a process of outputting the indication information from the processor, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, data output by the processor may be output to the transmitter, and input data received by the processor may come from the receiver. The transmitter and receiver may be collectively referred to as a transceiver.
The processing means in the fifth aspect may be a chip, and the processor may be implemented by hardware or by software, and when implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor, implemented by reading software code stored in a memory, which may be integrated in the processor, or may reside outside the processor, and exist separately.
In a sixth aspect, there is provided a computer program product comprising: a computer program (which may also be referred to as code, or instructions) which, when executed, causes a computer to perform the method of any one of the possible implementations of the first aspect.
In a seventh aspect, a computer readable storage medium is provided, which stores a computer program (which may also be referred to as code, or instructions) which, when run on a computer, causes the computer to perform the method of any one of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic diagram of the hardware structure of a terminal device to which an embodiment of the present application is applicable;
Fig. 2 is a schematic diagram of the physical structure of a terminal device according to an embodiment of the present application;
Fig. 3 is a schematic flow chart of an emotion monitoring method provided by an embodiment of the present application;
Fig. 4 is a schematic diagram of an emotion classification process provided by an embodiment of the present application;
Fig. 5 is a graph of heart rate variation during sleep provided by an embodiment of the present application;
Fig. 6 is a schematic diagram of a system architecture of a terminal device according to an embodiment of the present application;
Fig. 7 is an interface schematic diagram of a terminal device according to an embodiment of the present application;
Fig. 8 is a schematic waveform diagram of an ACC signal according to an embodiment of the present application;
Fig. 9 is a schematic flow chart of another emotion monitoring method provided by an embodiment of the present application;
Fig. 10 is a schematic block diagram of an emotion monitoring device provided by an embodiment of the present application;
Fig. 11 is a schematic block diagram of another emotion monitoring device provided by an embodiment of the present application.
Detailed Description
The technical scheme of the application will be described below with reference to the accompanying drawings.
Before describing the emotion monitoring method and the emotion monitoring device provided by the embodiment of the application, the following description is made.
First, in the embodiments shown below, terms and English abbreviations, such as the PPG signal, the ACC signal, and the pulse wave sensor, are given as examples for convenience of description and should not be construed as limiting the application in any way. The application does not exclude the possibility that existing or future protocols define other terms that perform the same or similar functions.
Second, the first, second and various numerical numbers in the embodiments shown below are merely for convenience of description and are not intended to limit the scope of the embodiments of the present application. For example, different terminal devices, etc.
Third, "at least one" means one or more, and "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a alone, a and B together, and B alone, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, and c may represent: a, b, or c, or a and b, or a and c, or b and c, or a, b and c, wherein a, b and c can be single or multiple.
Generally, emotions can be classified into three categories of calm, positive and negative, wherein positive emotion can include excitement, relaxation, surprise, etc., and negative emotion can include anxiety, tension, anger, depression, sadness, pain, etc.
The emotion age is a measure of the level of emotional development. Different age groups may correspond to different levels of emotional development, and a person's emotion corresponds to the age at which that emotional behavior is typical, i.e., has a corresponding emotion age. In general, a person's emotion management ability increases gradually with age, from the naughtiness of early childhood and the rebelliousness of adolescence, through the vigor of youth, to the steadiness of middle and old age. However, some people mature physically while their minds do not keep pace. Thus, one can attempt to measure a person's emotion age using a criterion that grades emotional control ability by age, according to the time law of the development of the emotional quotient (EQ). Table 1 is an exemplary EQ look-up table.
Table 1
In the application, the emotion ages can be summarized into four periods: the juvenile period, the active period, the stationary period, and the wisdom period.
The juvenile period corresponds to grade one in Table 1; emotions in this period change quickly and are difficult to control, and each emotion lasts only a short time.
The active period corresponds to grade two in Table 1; emotions in this period are easily affected by studies, romantic relationships, and family relations, and negative emotions are easily triggered and last longer, though there is a certain capacity for self-regulation.
The stationary period corresponds to grades three and four in Table 1; in this period people know how to restrain their emotions, so negative emotions are not easily triggered and are well controlled.
The wisdom period corresponds to grades five, six, and seven in Table 1; mood changes in this period are small.
Gender differences can also lead to differences in emotion. Research has found that females express emotion more strongly than males, i.e., females tend to amplify the emotions they perceive, whereas heart rate changes during an emotion are more pronounced in males than in females; that is, males experience emotion more intensely and are more affected by it, but are often unwilling to express it for various reasons.
Emotion may lead to physiological reactions, for example changes in autonomic nervous activity, including respiratory frequency, inhalation and exhalation speed, respiratory quality, heart rate, vascular volume, blood pressure, skin conductance, and internal and external gland activity. Brain waves (alpha, beta, delta, theta) also vary with emotion, with the alpha wave varying the most. In addition, emotions may cause adverse reactions such as dizziness and distending pain.
Respiratory changes under different emotions: illustratively, respiration is 20 breaths/min when calm. When happy, breathing depth is small, the frequency is slightly faster than when calm, and the rhythm is relatively regular, for example 23 breaths/min. When sad, the breathing rate is slow with long pauses, for example 9 breaths/min. When fearful, the breathing rate is very fast, with irregular pauses and amplitude, for example 64 breaths/min. When angry, the breathing frequency increases and the breathing depth increases sharply, for example 40 breaths/min.
Heart rate changes under different emotions: the heart rate is 60-80 beats/min when calm. When happy, the heart rate accelerates under the influence of adrenal hormones, but the rhythm remains regular. When sad, the heartbeat accelerates and arteries constrict, with arrhythmia in severe cases. When fearful, the heartbeat accelerates, accompanied by arrhythmia. When angry, the heartbeat accelerates, with sensations of palpitation.
Blood, muscles, sleep, electrical signals, and the like may all change under negative emotion. Negative emotions may increase the secretion of hormones and vasoactive substances such as cortisol, norepinephrine, epinephrine, and catecholamines, leading to systemic vasoconstriction, increased heart rate, increased blood pressure, and changes in blood oxygen content. Negative emotions may cause muscle tension and contraction. Negative emotions may also reduce sleep quality, make falling asleep difficult, and cause excessive dreaming.
Investigation of the physiological changes accompanying emotional change shows that positive and negative emotions cause different changes in physiological indices: respiration, heart rate, blood pressure, blood glucose, blood oxygen, electromyographic signals, and so on differ across emotions. Currently, photoplethysmography (PPG) signals can be applied to monitoring physiological indices such as heart rate, blood pressure, blood oxygen, blood glucose, and respiration.
Illustratively, blood glucose monitoring may be based on infrared light with a wavelength in the range 1250-1333 nm, 1600-1666 nm, or 2025 nm. Heart rate can be calculated from the time-series peaks and frequency-domain characteristics obtained from the PPG signal.
For example, blood oxygen monitoring may determine the blood oxygen saturation based on information obtained by detecting oxyhemoglobin (HbO2) and hemoglobin (Hb) with infrared (600 nm to 800 nm) and near-infrared (800 nm to 1000 nm) light, respectively.
Illustratively, the speed of pulse wave transit is directly related to blood pressure: transit is fast at high blood pressure and slow at low blood pressure. Therefore, the systolic and diastolic pressures can be estimated from the human pulse wave by establishing a characteristic equation, realizing noninvasive continuous blood pressure monitoring.
Illustratively, the respiratory signal may be extracted by wavelet decomposition or empirical mode decomposition (EMD) of the PPG signal.
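To make the peak-based heart rate estimate mentioned above concrete, here is a minimal sketch. The sampling rate, filter order, band, and refractory gap are illustrative assumptions, not values from the patent.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_heart_rate(ppg: np.ndarray, fs: float = 100.0) -> float:
    """Estimate heart rate (beats/min) from a raw PPG segment sampled at fs Hz."""
    # Band-pass around cardiac frequencies (0.5-4 Hz, i.e. 30-240 beats/min).
    b, a = butter(3, [0.5 / (fs / 2), 4.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ppg)
    # Each systolic peak corresponds to one heartbeat; enforce a refractory gap.
    peaks, _ = find_peaks(filtered, distance=int(0.3 * fs))
    if len(peaks) < 2:
        return float("nan")
    mean_interval_s = float(np.mean(np.diff(peaks))) / fs
    return 60.0 / mean_interval_s
```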
Fig. 1 is a schematic hardware structure of a terminal device 100 to which an embodiment of the present application is applicable. As shown in fig. 1, the terminal device 100 includes a pulse wave sensor 101, a myoelectric electrode sensor 102, an acceleration sensor 103, and a signal processing unit 104.
The pulse wave sensor 101 is configured to acquire the user's PPG signal and send it to the signal processing unit 104. The myoelectric electrode sensor 102 is configured to acquire the user's SEMG signal and send it to the signal processing unit 104. The acceleration sensor 103 is configured to acquire the user's ACC signal and send it to the signal processing unit 104. The signal processing unit 104 is configured to receive the PPG signal, the SEMG signal, and the ACC signal in order to monitor and identify the user's emotion.
Optionally, the terminal device 100 may further comprise a display 105, a storage unit 106, interaction hardware 107, a wireless unit 108 and a battery 109.
The user may perform operations such as touch control on the terminal device 100 through the interaction hardware 107 to implement interaction operations between the user and the terminal device 100. After the signal processing unit 104 obtains the emotion monitoring result of the user, the emotion monitoring result may be displayed to the user through the display 105, and saved in the storage unit 106. The signal processing unit 104 may also send the emotion monitoring result to other terminal devices associated with the terminal device 100 or to the cloud via the wireless unit 108.
The terminal device in the embodiments of the present application may refer to a user device, an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user apparatus. The terminal in the embodiments of the present application may be a mobile phone (mobile phone), a tablet computer (pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal, an augmented reality (augmented reality, AR) terminal, a Mixed Reality (MR) terminal, an extended reality (XR) terminal, a holographic display terminal, a wireless terminal in an industrial control (industrial control), a wireless terminal in a self driving (self driving), or other processing device connected to a wireless modem, a vehicle-mounted device, a terminal in a 5G network, or a terminal in a future evolution network, etc.
By way of example, and not limitation, in embodiments of the present application, the terminal device may also be a wearable device. A wearable device, also called a wearable smart device, is a generic term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. It is not merely a hardware device; it realizes powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include devices that are full-featured, large-sized, and able to realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus only on a certain type of application function and need to be used with other devices such as smartphones, for example various smart bands and smart jewelry for physical sign monitoring.
Furthermore, the terminal device may also be a terminal device in an internet of things (IoT) system. IoT is an important component of future information technology development; its main technical feature is connecting things to a network through communication technology, thereby realizing an intelligent network of human-machine interconnection and interconnection among things. The application does not limit the specific form of the terminal device.
It should be understood that in the embodiment of the present application, the terminal device may be a device for implementing a function of the terminal device, or may be a device capable of supporting the terminal device to implement the function, for example, a chip system, and the device may be installed in the terminal. In the embodiment of the application, the chip system can be composed of chips, and can also comprise chips and other discrete devices.
In the following, a terminal device 100 is described as an example of a smart watch. Fig. 2 is a schematic physical structure diagram of a smart watch 200 according to an embodiment of the present application. As shown in fig. 2, the smart watch 200 includes a dial 201 and a wristband 202. The pulse wave sensor 101 and the myoelectric electrode sensor 102 shown in fig. 1 may be located at the bottom of the dial 201, and the acceleration sensor 103 and the signal processing unit 104 may be located inside the dial 201. The user may wear the terminal device 100 on the wrist through the wristband 202 to collect PPG signals, SEMG signals, and ACC signals of the user. Optionally, the dial 201 is configured with a display screen 105, and the user may input age information and/or gender information through a touch operation on the display screen 105, and the PPG signal, the SEMG signal, the ACC signal, and the input age information and/or gender information of the user acquired by the terminal device may be stored in the storage unit 106.
It should be noted that terms such as "top" and "bottom" adopted for the terminal device 100 in the embodiment of the application are mainly used for describing the device and do not limit the orientation of the terminal device 100 in actual application scenarios.
Fig. 3 is a schematic flow chart of an emotion monitoring method 300 provided by an embodiment of the present application. The method 300 may be applied to a terminal device with a pulse wave sensor, a myoelectricity electrode sensor, and an acceleration sensor, where the hardware structure of the terminal device in the embodiment of the present application may be the hardware structure of the terminal device 100 shown in fig. 1 and the physical structure may be the smart watch 200 shown in fig. 2, which is not limited in the embodiment of the present application. The method 300 includes the steps of:
s301, acquiring a PPG signal of a user in real time through a photoelectric volume pulse wave sensor, acquiring a skin myoelectricity SEMG signal of the user in real time through a myoelectricity electrode sensor, and acquiring an ACC signal of the user in real time through an acceleration sensor.
S302, counting classification results of emotions of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, age information of the user and gender information of the user, wherein the classification results comprise positive emotion, calm emotion or negative emotion.
S303, determining an emotion state evaluation result of the user in the first time period based on classification results of emotions of the user at a plurality of moments in the first time period.
S304, combining the PPG signal, the ACC signal, the age information of the user and the gender information of the user, and determining the sleep quality evaluation result of the user in the first time period.
S305, determining whether emotion pre-warning is carried out according to the emotion state evaluation result and the sleep quality evaluation result.
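To make the flow of S301-S305 concrete, here is a schematic sketch. The callable parameters stand in for the classification and evaluation steps detailed in the embodiments below; the alpha and threshold values anticipate the weighted combination described later and are assumptions here.

```python
from typing import Callable, Sequence

def monitor_emotion(classify_at: Callable[[int], int],
                    score_emotional_state: Callable[[Sequence[int]], float],
                    score_sleep_quality: Callable[[], float],
                    times: Sequence[int],
                    alpha: float = 0.5, threshold: float = 60.0) -> bool:
    # S302: classification result (1 positive, 0 calm, -1 negative) per moment.
    classifications = [classify_at(t) for t in times]
    # S303: emotional state evaluation result for the first time period.
    score_e = score_emotional_state(classifications)
    # S304: sleep quality evaluation result for the first time period.
    score_s = score_sleep_quality()
    # S305: decide whether to perform emotion early warning.
    return alpha * score_s + (1 - alpha) * score_e <= threshold
```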
The pulse wave sensor may be a PPG sensor, a pressure sensor, or an ultrasonic sensor, for example.
For example, the PPG signal, the SEMG signal, the ACC signal, the age information of the user, and the gender information of the user may be collectively referred to as physiological signals in the embodiments of the present application.
In the embodiment of the present application, the terminal device 100 can obtain the user's emotion classification results at a plurality of measurement moments in the first time period from the collected PPG signal, SEMG signal, and the user's age and gender information. The terminal device 100 can obtain the user's sleep quality evaluation result in the first time period from the collected PPG signal, ACC signal, and the user's age and gender information, and combine the emotional state evaluation result and the sleep quality evaluation result to obtain the user's emotion monitoring result in the first time period. Because this emotion monitoring result combines the PPG signal, the SEMG signal, the ACC signal, and the user's age and gender information, and also considers the user's sleep, it helps obtain a more accurate emotion monitoring result and warn the user in time, thereby safeguarding the user's mental health.
As an alternative embodiment, S302 includes: acquiring the PPG signal, the SEMG signal, the user's age information, and the user's gender information at a first moment among the plurality of moments; performing feature extraction on the PPG signal, the SEMG signal, the user's age information, and the user's gender information to obtain multiple groups of biological features of the user at the first moment; performing feature fusion on the multiple groups of biological features at the first moment to obtain the fusion feature of the user at the first moment; and obtaining the classification result of the user's emotion at the first moment based on the fusion feature at the first moment.
In the embodiment of the application, the emotion classification process is described taking the first moment among the plurality of moments in the first time period as an example. It will be appreciated that the first moment may be any one of the plurality of moments; at different moments the user may be affected by the surroundings and produce behavioral and emotional changes, and each moment has its corresponding PPG signal, SEMG signal, and ACC signal.
For example, the first period of time is 24 hours, the terminal device 100 may classify the emotion of the user in this hour according to the collected physiological signals of the user every one hour, where the multiple moments are 24 moments in this example, and the terminal device 100 may obtain 24 emotion classification results in the first period of time.
Illustratively, a positive emotion may be represented by the value 1, a calm emotion by the value 0, and a negative emotion by the value -1.
Fig. 4 is a schematic diagram of an emotion classification process according to an embodiment of the present application. Illustratively, fig. 4 shows the emotion classification process at the first moment described above. As shown in fig. 4, the terminal device 100 may perform feature extraction on the collected SEMG signal, PPG signal, age information, and gender information to obtain multiple groups of biological features, then perform feature fusion on the multiple groups of biological features to obtain fusion features, and input the fusion features into a classification network for classification, so as to obtain an emotion classification result.
Optionally, the terminal device 100 performs feature extraction on the SEMG signal, including preprocessing the SEMG signal to obtain a preprocessed SEMG signal.
For example, the preprocessing of the SEMG signal may be filtering it using a wavelet transform. In the embodiment of the application, the biological features obtained from the filtered SEMG signal may be called surface electromyography features.
Alternatively, the terminal device 100 performs feature extraction on the age information and the sex information of the user, including the terminal device 100 preprocessing the age information and the sex information of the user. For example, preprocessing the age information and the sex information of the user may be digitizing the age information and the sex information of the user.
For example, the actual ages of the users may be classified and then subjected to numerical processing, for example, 0 to 10 years old may be represented by a value 1, 11 to 20 years old may be represented by a value 2, 21 to 30 years old may be represented by a value 3, 31 to 40 years old may be represented by a value 4, 41 to 50 years old may be represented by a value 5, 51 to 60 years old may be represented by a value 6, 61 to 70 years old may be represented by a value 7, and 70 years old or older may be represented by a value 8.
Illustratively, a male may be represented by a value of 0 and a female by a value of 1.
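A small sketch of the numeric encoding just described; the function names are illustrative, and the bucket arithmetic simply mirrors the example values above.

```python
def encode_age(age_years: int) -> int:
    """Map age to bucket values 1..8 (0-10 -> 1, 11-20 -> 2, ..., over 70 -> 8)."""
    return min(max((age_years - 1) // 10 + 1, 1), 8)

def encode_gender(gender: str) -> int:
    """Numeric encoding from the example above: male -> 0, female -> 1."""
    return 0 if gender == "male" else 1
```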
Optionally, feature extraction on the PPG signal by the terminal device 100 includes preprocessing the PPG signal. The terminal device 100 may input the preprocessed PPG signal into a feature extraction network to obtain multiple groups of biological features of the user, and may obtain the feature value corresponding to each group of biological features by regression on the user's biological features.
For example, the pre-processing of the PPG signal may be filtering the PPG signal, e.g. filtering the PPG signal is achieved with a wavelet transform.
In one possible implementation, the feature extraction network may perform signal decomposition and frequency domain signal transformation on the preprocessed PPG signal to extract the biological features.
When performing signal decomposition on the preprocessed PPG signal, the terminal device 100 may detect its time-domain characteristics, extract the time-domain feature values and the optimal band of the preprocessed PPG signal, segment the extracted optimal band into a plurality of single-cycle waveforms, and perform sparse decomposition on the single-cycle waveforms to obtain the atomic feature parameters of the signal.
In the frequency domain signal transformation of the preprocessed PPG signal, fourier transformation may be used to obtain the frequency domain eigenvalue of the PPG signal, for example.
After performing signal decomposition and frequency domain signal transformation on the preprocessed PPG signal, the terminal device 100 may perform feature fusion on the atomic feature parameter feature, the time domain feature value, the frequency domain feature value, and the preprocessed PPG signal.
Feature fusion may be implemented by feature concatenation or feature summation, for example.
In one possible implementation, the feature extraction network may be a time-sequence network, e.g., a gated recurrent unit (GRU) network. The GRU is a modified network structure of the recurrent neural network (RNN); important characteristics of the input data can be retained through its gating functions. In the embodiment of the application, at least one of the user's heart rate features, respiration features, blood oxygen features, blood glucose features, and blood pressure features can be extracted through the feature extraction network.
It should be understood that the above manner of feature extraction of the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user is merely an example, and the embodiment of the present application is not limited thereto.
As can be seen from the above description, respiration, heart rate, blood pressure, blood glucose, blood oxygen, and electromyographic signals differ across emotions, and positive and negative emotions cause different changes in physiological indices. The biological features obtained in the embodiment of the application can therefore clearly reflect how the biological signals change under different emotions, with strong interpretability and characterization ability.
As an optional embodiment, performing feature fusion on the multiple groups of biological features at the first moment to obtain the fusion feature of the user at the first moment includes: calculating the weight of each of the multiple groups of biological features through a softmax function; and obtaining the fusion feature of the user at the first moment from each group of biological features and its weight.
As shown in Fig. 4, after obtaining the multiple groups of biological features, the terminal device 100 may perform feature fusion on them to obtain the fusion feature. Illustratively, the feature fusion in the embodiment of the application is adaptive: each group of biological features has a corresponding feature weight. The feature weight depends on the sample and can be adjusted adaptively for different samples, so the importance of features of different dimensions is captured and the fusion feature has stronger expressive power.
Illustratively, there are N groups of biological features in total, represented in matrix form as P = [P_1; P_2; ...; P_N]. Each group of biological features has M elements, so the feature matrix P has dimension N×M.
Illustratively, the weight of each group of biological features is calculated by the weight function F(W, P) = softmax(W_{M×N} × P × P^T + B_{M×1}, axis=0), where W_{M×N} is a trainable weight matrix of dimension M×N, B_{M×1} is a trainable bias matrix of dimension M×1 (broadcast over columns), × denotes matrix multiplication, and axis=0 indicates that the softmax is performed on each row separately, yielding a weight matrix of dimension M×N.
After the weight matrix F(W, P) is obtained, the M×N weight matrix F(W, P) is matrix-multiplied with the N×M biological feature matrix P, so each group of biological features is given its corresponding weight; the fused feature matrix is then obtained through the diagonal function Diag(·), i.e., Feature = Diag(F(W, P) × P), where Diag(·) takes the diagonal elements to form a new matrix.
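For concreteness, a NumPy sketch of this fusion follows; the randomly initialized W and B stand in for trained parameters, and the example dimensions are arbitrary.

```python
import numpy as np

def softmax(x: np.ndarray, axis: int = 0) -> np.ndarray:
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_features(P: np.ndarray, W: np.ndarray, B: np.ndarray) -> np.ndarray:
    """P: N x M feature matrix (N feature groups, M elements each).
    W: M x N trainable weights, B: M x 1 trainable bias.
    Returns the fused feature Diag(F(W, P) @ P)."""
    # F(W, P) = softmax(W @ P @ P.T + B, axis=0), an M x N weight matrix.
    F = softmax(W @ P @ P.T + B, axis=0)
    # Weight each feature group, then take the diagonal of the M x M product.
    return np.diag(F @ P)

# Example with N = 6 feature groups of M = 8 elements each.
rng = np.random.default_rng(0)
P = rng.normal(size=(6, 8))
W = rng.normal(size=(8, 6))
B = rng.normal(size=(8, 1))
fused = fuse_features(P, W, B)  # length-8 fused feature vector
```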
As an alternative embodiment, S303 includes: acquiring, from the classification results of the emotions at the plurality of moments, the number of negative emotions of the user in the first time period and the duration of each negative emotion; and determining the emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion.
In the embodiment of the present application, assuming that the first period is 24 hours, the terminal device 100 performs emotion recognition on the user every 1 hour, and then the first period may be divided into 24 sub-periods, and the duration of each sub-period is 1 hour. The terminal device 100 may obtain an emotion classification result for each sub-period based on the PPG signal, the SEMG signal, age information of the user, and sex information collected for each sub-period, and represent the emotion recognition results for the 24 sub-periods as (0,0,0,0,1,0,0,0,0, -1,0,0,0,1,1,0,0,0,0, -1,0, 1) in a set form, wherein 0 represents a calm emotion, 1 represents a positive emotion, and-1 represents a negative emotion.
The terminal device 100 can obtain the number of negative emotions in the first time period from this set of emotion classification results; in this example the user experienced 3 negative emotions in the first time period. Further, the terminal device 100 can obtain the duration of each negative emotion from the changes in respiration, heart rate, blood pressure, blood glucose, blood oxygen, electromyographic signal, and the like reflected by the PPG signal and the SEMG signal collected during the sub-period in which the negative emotion occurs.
For example, the terminal device 100 continuously acquires the PPG signal and the SEMG signal and performs feature extraction on them every 1 s, every 5 s, or every 10 s. The duration of a negative emotion starts accumulating at the initial moment at which the terminal device 100 detects the negative emotion and stops once the negative emotion is no longer detected; the time accumulated from that initial moment to the end moment is the duration of the negative emotion.
For example, the emotional state evaluation result of the user in the first time period may be expressed as a score. The emotional state evaluation score score_e is obtained from a formula in which T represents the duration of the first time period, n represents the number of negative emotions occurring in the first time period, and t_i represents the duration of the i-th negative emotion. The formula is such that the longer the accumulated negative-emotion time, the lower the emotional state evaluation score.
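The exact scoring formula is not reproduced in the text above; the following sketch assumes a simple linear penalty on the fraction of the period spent in negative emotion, which matches the stated property that more accumulated negative time yields a lower score.

```python
def emotional_state_score(durations: list[float], T: float) -> float:
    """durations: the t_i of the n negative emotions; T: length of the period.
    Assumed form: 100 * (1 - sum(t_i) / T), clamped at 0."""
    accumulated = sum(durations)  # more accumulated negative time -> lower score
    return 100.0 * (1.0 - min(accumulated / T, 1.0))

score_e = emotional_state_score([0.5, 1.0, 0.25], T=24.0)  # hours -> ~92.7
```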
Alternatively, the terminal device 100 may display the real-time emotion obtained at each moment in the first period to the user. It should be understood that the real-time emotion may be embodied in the form of sentences or images, which is not limited by the embodiment of the present application.
For example, when the emotion classification result is a positive emotion, the terminal device 100 may display sentences such as "Keep up the good mood!", "Feeling happy, keep it up!", "A day full of energy!", or "What a great mood!".
For example, when the emotion classification result is a positive emotion, the terminal device 100 may also display a pattern of "sun", "praise", "applause", or "cheering" or the like.
For example, when the emotion classification result is a negative emotion, the terminal device 100 may display sentences such as "Cheer up, be happy!", "There is always sunshine after the storm", or "Dark clouds, hurry away".
For example, when the emotion classification result is a negative emotion, the terminal device 100 may also display patterns such as "cheering up", "encouragement", or "hugging".
It should be understood that the above emotion-representing forms are merely examples, and that other forms may be used to represent the real-time emotion of the user, which is not limited by the embodiment of the present application.
As an alternative embodiment, S304 includes: obtaining the number of sleep interruptions of the user in the first time period based on the components of the ACC signal on the coordinate axes; acquiring heart rate variation information of the user in the first time period based on the PPG signal, and obtaining the user's total sleep duration, deep sleep duration, and light sleep duration in the first time period from the heart rate variation information; and determining the sleep evaluation result of the user in the first time period by combining the number of sleep interruptions, the total sleep duration, the deep sleep duration, and the light sleep duration.
In the embodiment of the application, the gravitational acceleration components of the ACC signal on the x-, y-, and z-axes differ between the motion state and the non-motion state. The x-axis value represents horizontal movement of the terminal device 100, the y-axis value represents vertical movement of the terminal device 100, and the z-axis value represents movement in the spatial vertical direction. In the non-motion state, the modulus of the ACC signal's components on the x-, y-, and z-axes should be stable around 9.8 m/s². Thus, during the user's sleep, if the terminal device 100 detects that the change in this modulus exceeds a first preset threshold for a duration greater than or equal to a second preset threshold, the terminal device 100 may consider the user's sleep to have been interrupted.
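As an illustration, sleep interruptions could be counted from the ACC modulus as follows; the threshold values stand in for the first and second preset thresholds, whose actual values the text does not give.

```python
import numpy as np

def count_sleep_interruptions(acc_xyz: np.ndarray, fs: float,
                              delta_threshold: float = 1.5,  # assumed first preset threshold, m/s^2
                              min_duration_s: float = 2.0    # assumed second preset threshold
                              ) -> int:
    """acc_xyz: array of shape (num_samples, 3) with x/y/z acceleration."""
    modulus = np.linalg.norm(acc_xyz, axis=1)
    # Deviation of the modulus from gravity (~9.8 m/s^2 at rest).
    moving = np.abs(modulus - 9.8) > delta_threshold
    interruptions, run = 0, 0
    for m in moving:
        run = run + 1 if m else 0
        # Count one interruption when movement persists long enough.
        if run == int(min_duration_s * fs):
            interruptions += 1
    return interruptions
```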
Fig. 5 is a graph showing the change in heart rate during sleep according to an embodiment of the present application. As shown in Fig. 5, the heart rate decreases gradually and slowly from falling asleep through light sleep to deep sleep. Illustratively, the user begins to fall asleep at time t1 and is in a light sleep state during the period t1-t2, whose duration is denoted T_1. After time t2 the user's heart rate stabilizes, and the user is in a deep sleep state during the period t2-t3, whose duration is denoted T_2. Illustratively, the heart rate stabilizes between 55 and 60 beats/min during deep sleep. After time t3 the user's heart rate begins to rise slowly; the user is in a light sleep state during the period t3-t4, whose duration is denoted T_3, and gradually wakes by time t4. The user's total sleep duration in the first time period is thus T_s = T_1 + T_2 + T_3, the deep sleep duration is T_d = T_2, and the light sleep duration is T_q = T_1 + T_3.
The sleep quality evaluation result of the user in the first time period can be obtained by combining the user's total sleep duration, deep sleep duration, light sleep duration, and number of sleep interruptions. The sleep quality evaluation result may be expressed as a score, denoted score_s. Table 2 shows one possible sleep quality scoring scheme.
Table 2

Score | 90~100 | 80~90 | 60~80 | 40~60 | <40
Total sleep duration | 7~8 hours | ≥9 hours | 5~6 hours | 3~4 hours | 1~2 hours
Deep sleep duration ratio | >25% | 15%~25% | 10%~15% | 3%~10% | <3%
Shallow sleep duration ratio | <50% | 50%~65% | 65%~75% | 75%~90% | >90%
Number of sleep interruptions | ≤1 | 2~4 | 4~6 | 6~8 | >8
The following schematically shows a logic judgment code for obtaining the sleep quality evaluation score:
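A minimal sketch of such judgment logic, assuming each of the four dimensions is scored against the bands of Table 2 with a representative score per band and the four per-dimension scores are averaged into score_s (the 95/85/70/50/40 representative values and the averaging rule are assumptions):

```python
def band_score(value, bands, lowest=40):
    """Return the representative score of the first matching band."""
    for predicate, score in bands:
        if predicate(value):
            return score
    return lowest

def sleep_quality_score(total_h, deep_ratio, shallow_ratio, interruptions):
    """Score sleep quality from the four Table 2 dimensions (sketch).

    The bands mirror Table 2, including its gaps between ranges.
    """
    total_bands = [(lambda v: 7 <= v <= 8, 95), (lambda v: v >= 9, 85),
                   (lambda v: 5 <= v < 7, 70), (lambda v: 3 <= v < 5, 50)]
    deep_bands = [(lambda v: v > 0.25, 95), (lambda v: v >= 0.15, 85),
                  (lambda v: v >= 0.10, 70), (lambda v: v >= 0.03, 50)]
    shallow_bands = [(lambda v: v < 0.50, 95), (lambda v: v <= 0.65, 85),
                     (lambda v: v <= 0.75, 70), (lambda v: v <= 0.90, 50)]
    interrupt_bands = [(lambda v: v <= 1, 95), (lambda v: v <= 4, 85),
                       (lambda v: v <= 6, 70), (lambda v: v <= 8, 50)]
    scores = [band_score(total_h, total_bands),
              band_score(deep_ratio, deep_bands),
              band_score(shallow_ratio, shallow_bands),
              band_score(interruptions, interrupt_bands)]
    return sum(scores) / len(scores)  # score_s
```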
As an alternative embodiment, S305 includes: obtaining the emotion monitoring result of the user in the first time period from the emotional state evaluation result and the sleep quality evaluation result; and determining to perform emotion early warning when the emotion monitoring result satisfies a preset condition.
In the embodiment of the application, the user's emotional state evaluation result and sleep quality evaluation result are combined to judge the user's emotion monitoring result in the first time period, making the monitoring more comprehensive and helping to improve the accuracy of emotion recognition.
For example, the terminal device 100 may quantify the emotional state evaluation result and the sleep quality evaluation result, resulting in an emotional state evaluation score score_e and a sleep quality evaluation score score_s. The terminal device 100 may perform weighted summation on the emotion state evaluation score score_e and the sleep quality evaluation score score_s to obtain an emotion score score_out of the user in the first period, and determine to perform emotion early warning when the emotion score score_out is less than or equal to a third preset threshold.
The emotion score score_out can be obtained by the following formula:
score_out=α×score_s+(1-α)×score_e
where α represents a correction factor. Illustratively, α has a value of 0.5.
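A sketch of this weighted combination, with α = 0.5 as in the example; the value of the third preset threshold is an assumption:

```python
def emotion_score(score_s, score_e, alpha=0.5):
    """score_out = alpha * score_s + (1 - alpha) * score_e."""
    return alpha * score_s + (1 - alpha) * score_e

def should_warn(score_out, third_preset_threshold=60):  # threshold value assumed
    """Trigger an emotion early warning when the score is at or below the threshold."""
    return score_out <= third_preset_threshold

# Usage: should_warn(emotion_score(score_s=55, score_e=50))  # 52.5 <= 60 -> True
```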
In one possible implementation, the terminal device 100 may display early warning information to the user, where the early warning information carries the emotion monitoring result of the user during the first period, for example, the terminal device 100 displays the emotion score to the user and displays a corresponding emotion statement to prompt the user to pay attention to emotional health.
In another possible implementation, in the system architecture 600 of the terminal device as shown in Fig. 6, the terminal device 100 may interact with the terminal device 110; for example, the terminal device 100 may send early warning information to the terminal device 110. The terminal device 110 is a terminal device connected to the terminal device 100 via a network, for example via Bluetooth.
For example, the foregoing early warning information may include the user's PPG signal, ACC signal and SEMG signal. The terminal device 110 may then obtain the user's emotion monitoring result from the received PPG, ACC and SEMG signals, combined with the age and gender information entered by the user on the terminal device 110, and display the emotion monitoring result and/or a corresponding emotion statement to warn the user.
Alternatively, the early warning information may include the emotion monitoring result obtained by the terminal device 100 from the user's PPG, ACC and SEMG signals, and the terminal device 110 may display that emotion monitoring result and/or a corresponding emotion statement to warn the user.
Table 3 shows possible display contents of the terminal device 100 or the terminal device 110 for different scores.

Table 3

Mood score | Example display statement
90~100 | "Keep up the good mood"
70~90 | "Sunshine always comes after wind and rain"
<70 | "Without experiencing wind and rain, how can you see the rainbow"
Illustratively, the emotional state evaluation result and the sleep quality evaluation result may be embodied as evaluation grades. For example, both results are divided into five grades A, B, C, D and E. When the user's emotional state evaluation result is A or B and the sleep quality evaluation result is A or B, the terminal device 100 may determine that the user's emotion monitoring result in the first time period is high-level, indicating good emotion. When the emotional state evaluation result is C and the sleep quality evaluation result is C, the terminal device 100 may determine that the emotion monitoring result in the first time period is medium-level, indicating normal emotion. When the emotional state evaluation result is D or E and the sleep quality evaluation result is D or E, the terminal device 100 may determine that the emotion monitoring result in the first time period is low-level, indicating abnormal or negative emotion.
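A sketch of this grade-based combination; how mixed grades (e.g. A with E) are handled is not specified, so they default to the medium level here by assumption:

```python
def monitoring_level(emotion_grade, sleep_grade):
    """Combine two A-E evaluation grades into a high/medium/low monitoring result."""
    high, low = {"A", "B"}, {"D", "E"}
    if emotion_grade in high and sleep_grade in high:
        return "high"    # good emotion
    if emotion_grade in low and sleep_grade in low:
        return "low"     # abnormal or negative emotion
    return "medium"      # normal emotion (assumed default for mixed grades)
```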
For example, when the terminal device 100 determines that the user's emotion monitoring result in the first time period is low-level, the terminal device 100 may send the early warning information to the terminal device 110.
It should be understood that the terminal device 100 may display at least one of the emotional state evaluation result, the sleep quality evaluation result, or the emotion monitoring result to the user, which is not limited in the embodiment of the present application.
It should be further understood that the above forms of presenting the emotional state evaluation result, sleep quality evaluation result and emotion monitoring result are merely examples; they may also be displayed to the user in other forms such as a histogram, a pie chart, a line chart or an emoticon, which is not limited in this embodiment of the present application.
Optionally, the terminal device 100 may store the early warning information in the cloud, and the terminal device 110 may periodically obtain it from the cloud. For example, the terminal device 100 stores the user's emotion monitoring result in the cloud every 24 hours, and the terminal device 110 may obtain the early warning information from the cloud every 24, 48 or 72 hours and display the corresponding early warning content to the user according to the emotion monitoring result carried in the early warning information.
Illustratively, if the emotion monitoring results show that the user's emotion score or grade has been consistently low over the past 24, 48 or 72 hours, the terminal device 110 may display an encouraging statement to the user through a health application (APP) on the terminal device 110, and may also display the user's emotional state evaluation result and/or sleep quality evaluation result through the health APP.
For example, under the same condition, the terminal device 110 may also alert the user through the monitoring APP, gently prompting the user to communicate with others or seek psychological counseling. For example, the monitoring APP may display statements such as "The weather is nice, go out for a walk", "Chat with friends and family often", or "Find someone to confide in about a bad mood".
In one possible scenario, the terminal device 100 may be a smart watch worn by a minor, and the terminal device 110 may be a parent's mobile phone associated with the minor's smart watch. The parent may obtain at least one of the minor's emotional state evaluation result, sleep quality evaluation result or emotion monitoring result through the mobile phone, and communicate with the minor and provide psychological guidance in time according to the emotion monitoring result.
For example, if the emotion monitoring result shows that the user's emotion score or grade has been consistently low over the past 24, 48 or 72 hours, the terminal device 110 may also intelligently recommend positive multimedia resources to the user through a video APP, music APP or news APP on the terminal device 110, thereby helping to relieve the user's negative emotion.
It should be understood that the period (24 hours, 48 hours or 72 hours) for the terminal device 110 to obtain the early warning information from the cloud is only an example, and the period for obtaining the early warning information is not limited in the embodiment of the present application.
As an alternative embodiment, the method 300 further comprises: determining the emotional age of the user according to the user's emotion monitoring result in the first time period, the number of negative emotions and the user's age information; and displaying at least one of the emotional age, the emotion monitoring result or an emotion statement to the user.
In the embodiment of the application, the terminal device 100 can obtain the user's emotional age, so that the user can understand, based on the emotional age, how well he or she regulates emotion, which helps improve user experience.
Table 4 shows one relationship among mood score, number of negative emotions and emotional age, for reference only.

Table 4

Mood score | Number of negative emotions x | Emotional age
80~100 | 0 | Wisdom period
80~100 | 1≤x≤3 | Steady period
80~100 | ≥3 | Young-child period
<70 | - | Active period
In the case where the physical form of the terminal device 100 is the smart watch 200, Fig. 7 is an interface schematic diagram of the smart watch 200 according to an embodiment of the present application. As shown in Fig. 7, the smart watch 200 may display the user's emotional age, emotion score and a corresponding emotion statement. For example, the user's emotion score is 98 points, the emotional age is the steady period, and the emotion statement is "Keep up the good mood!".
As an alternative embodiment, the method 300 further comprises: acquiring the peaks and troughs of the ACC signal, and determining whether the user is in a non-motion state according to the peaks and troughs of the ACC signal. S302 then includes: when the user is in a non-motion state, counting the classification results of the user's emotion at a plurality of moments in the first time period by combining the PPG signal, the SEMG signal, the user's age information and the user's gender information.
In the embodiment of the application, since the PPG signal is obviously disturbed by movement, emotion monitoring needs to be performed in a non-movement state. The terminal device 100 may determine whether the PPG signal and the SEMG signal are stable signals acquired when the user is in a non-motion state through peaks and troughs of the ACC signal.
Fig. 8 is a waveform schematic diagram of an ACC signal according to an embodiment of the present application. Illustratively, the duration of the ACC signal identified by the terminal device 100 is 10 s. As shown in Fig. 8, the ACC signal waveform has 1 peak and 1 trough within the identified duration, where the peak value is W1 and the trough value is W2. If the difference between W1 and W2 is less than or equal to a fourth preset threshold, and the sum of the numbers of peaks and troughs is less than or equal to a fifth preset threshold, the terminal device 100 may determine that the user is in a non-motion state; the collected PPG signal and SEMG signal are then valid signals that may be used for subsequent emotion monitoring.
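A sketch of this stillness check over the identified window; the values of the fourth and fifth preset thresholds are assumptions:

```python
def is_non_motion(peaks, troughs,
                  amplitude_threshold=0.5,  # "fourth preset threshold" (m/s^2), assumed
                  count_threshold=2):       # "fifth preset threshold", assumed
    """Decide non-motion from the extrema of the ACC waveform in the window.

    peaks/troughs: lists of peak and trough values (e.g. W1, W2 in Fig. 8).
    The window counts as still when the peak-trough spread is small and
    few extrema occur overall.
    """
    if not peaks or not troughs:
        return True  # no extrema detected: treated as still (assumption)
    spread = max(peaks) - min(troughs)
    return spread <= amplitude_threshold and (len(peaks) + len(troughs)) <= count_threshold
```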
Fig. 9 is a schematic flow chart of another emotion monitoring method 900 provided by an embodiment of the present application. Method 900 may be performed by terminal device 100, method 900 comprising the steps of:
S901, acquiring the PPG signal, SEMG signal, ACC signal, age information and gender information of the user in real time.
In this step, the PPG signal may be acquired by a pulse wave sensor, the SEMG signal may be acquired by an electromyographic electrode sensor, and the ACC signal may be acquired by an acceleration sensor.
S902, judging whether the user is in a non-motion state according to the ACC signal. If the user is in a non-motion state, S903 is performed. If the user is in a motion state, the collected PPG signal and SEMG signal are invalid and must be re-collected.
S903, classifying the emotion of the user according to the PPG signal, the SEMG signal, the age information of the user and the gender information of the user to obtain an emotion classification result.
In this step, the terminal device 100 may periodically perform emotion recognition on the collected PPG signal, SEMG signal, age information of the user, and gender information of the user to obtain emotion classification results at a plurality of different moments. For example, the terminal device 100 may perform feature extraction on the PPG signal, the SEMG signal, the age information of the user, and the gender information of the user, and perform feature fusion on the extracted biological features, and specific feature extraction and feature fusion processes are described above, which are not repeated herein.
It should be appreciated that the emotional classification results in this step include emotional classification results at a plurality of different moments.
S904, obtaining the user's emotional state evaluation score according to the counted number of negative emotions in the emotion classification results and the duration of each negative emotion.
In this step, assuming the emotion classification results include the user's emotion classification result for each hour of a 24-hour period, the user's emotional state evaluation score over the 24 hours can be obtained from the number of negative-emotion occurrences counted in the 24 hours and the duration of each occurrence.
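A sketch of one way such a score could be computed, assuming the score falls with both the number of negative emotions n and their total duration relative to the window length T; the concrete functional form and the per-occurrence penalty below are assumptions:

```python
def emotional_state_score(T_s, negative_durations_s, per_occurrence_penalty=5.0):
    """Assumed scoring rule: start from 100, subtract the share of the window
    spent in negative emotion and a fixed penalty per occurrence.

    T_s: duration of the first time period in seconds (e.g. 24 h = 86400).
    negative_durations_s: the t_i values, one per negative-emotion occurrence.
    """
    n = len(negative_durations_s)
    time_share = sum(negative_durations_s) / T_s
    return max(100.0 * (1.0 - time_share) - per_occurrence_penalty * n, 0.0)  # score_e
```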
S905, monitoring the sleep quality of the user according to the PPG signal, the ACC signal, the age information of the user and the gender information of the user, and obtaining the sleep quality evaluation score of the user.
In this step, the terminal device 100 may obtain the user's number of sleep interruptions from the ACC signal, obtain the time of falling asleep, the time of waking, the deep sleep duration and the shallow sleep duration from the heart rate variation derived from the PPG signal, and then evaluate the user's sleep quality by combining the number of sleep interruptions, the time of falling asleep, the time of waking, the deep sleep duration, the shallow sleep duration, and the user's age and gender, obtaining the user's sleep quality evaluation score. A motion state lasting more than 5 s may be regarded as a sleep interruption.
S906, carrying out weighted summation on the emotion state evaluation score and the sleep quality evaluation score of the user to obtain the emotion score of the user.
S907, sending early warning information to the cloud end under the condition that the emotion score of the user is smaller than or equal to the emotion threshold value, wherein the early warning information carries at least one of the emotion state evaluation score, the sleep quality evaluation score or the emotion score of the user.
In this step, the terminal device 100 may send the early warning information to the cloud when the monitored emotion score of the user is low. The terminal device 110 associated with the terminal device 100 may obtain the early warning information from the cloud, further determine whether to issue an emotion warning to the user on the terminal device 110, and prompt the user to pay attention to emotional health, for example by pushing positive content through an APP or displaying encouraging statements.
S908, obtaining the emotion age of the user according to the emotion score of the user and the occurrence times of negative emotion.
S909, displaying at least one of the emotion age, emotion score, or emotion sentence to the user.
Optionally, terminal device 100 may also send the emotional age to terminal device 110, and at least one of the emotional age, the emotional score, or the emotional statement is displayed by terminal device 110.
In the embodiment of the present application, when the terminal device 100 judges from the ACC signal that the collected PPG signal and SEMG signal are relatively stable, it may obtain from them the user's biological features under different emotions, such as heart rate, blood pressure, blood oxygen, blood glucose and muscle contraction, and combine them with the user's age and gender to obtain an emotional state evaluation result. Because the user's sleep may differ under different emotions, the terminal device 100 may also monitor the user's sleep quality from the PPG signal, the ACC signal, and the user's age and gender to obtain a sleep quality evaluation result, and then determine the user's emotion score by combining the emotional state evaluation result with the sleep quality evaluation result. The resulting emotion score reflects the user's emotion over a period of time more accurately, which helps improve the accuracy of emotion recognition.
It should be understood that the sequence numbers of the above processes do not mean the order of execution, and the execution order of the processes should be determined by the functions and internal logic of the processes, and should not be construed as limiting the implementation process of the embodiments of the present application.
The emotion monitoring method according to the embodiment of the present application is described in detail above with reference to fig. 1 to 9, and the emotion monitoring device according to the embodiment of the present application will be described in detail below with reference to fig. 10 and 11.
Fig. 10 shows a schematic block diagram of an emotion monitoring device 1000 according to an embodiment of the present application, the device 1000 comprising an acquisition module 1010 and a processing module 1020.
Wherein, the acquisition module 1010 is configured to: the photoelectric volume pulse wave tracing PPG signal of the user is obtained in real time through the pulse wave sensor, the epidermis myoelectricity SEMG signal of the user is obtained in real time through the myoelectricity electrode sensor, and the ACC signal of the user is obtained in real time through the acceleration sensor. The processing module 1020 is configured to: combining the PPG signal, the SEMG signal, age information of the user and gender information of the user, and counting classification results of the emotion of the user at a plurality of moments in a first time period, wherein the classification results comprise positive emotion, calm emotion or negative emotion; determining an emotional state evaluation result of the user in the first time period based on classification results of the emotion of the user at a plurality of moments in the first time period; combining the PPG signal, the ACC signal, age information of the user and gender information of the user to determine a sleep quality evaluation result of the user in a first time period; and determining whether to perform emotion pre-warning according to the emotion state evaluation result and the sleep quality evaluation result.
Optionally, the processing module 1020 is configured to: and extracting characteristics of the PPG signal, the SEMG signal, the age information of the user and the sex information of the user at a first moment in the plurality of moments to obtain a plurality of groups of biological characteristics of the user at the first moment. Performing feature fusion on multiple groups of biological features at a first moment to obtain fusion features of a user at the first moment; and obtaining a classification result of the emotion of the user at the first moment based on the fusion characteristic at the first moment.
Optionally, the processing module 1020 is configured to: calculating the weight of each group of biological characteristics in the plurality of groups of biological characteristics through the flexible maximum value transfer function; and obtaining the fusion characteristic of the user at the first moment according to each group of biological characteristics and the weight of each group of biological characteristics.
Optionally, the plurality of sets of biometric features includes at least one of: heart rate characteristics, respiration characteristics, blood oxygen characteristics, blood glucose characteristics, blood pressure characteristics, or epidermal myoelectricity characteristics.
Optionally, the obtaining module 1010 is configured to: and acquiring the number of negative emotions of the user in the first time period and the duration of each negative emotion from the classification results of the emotions at a plurality of moments. The processing module 1020 is configured to: based on the number of negative emotions and the duration of each negative emotion, an emotional state evaluation result of the user in the first period is determined.
Optionally, the processing module 1020 is configured to: and obtaining the sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axes. The obtaining module 1010 is configured to: based on the PPG signal, heart rate variation information of the user over a first period of time is acquired. The processing module 1020 is configured to: obtaining the total sleeping time, the deep sleeping time and the shallow sleeping time of the user in the first time period according to the heart rate change information; and determining a sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep time, the deep sleep time and the shallow sleep time.
Optionally, the processing module 1020 is configured to: obtaining an emotion monitoring result of the user in the first time period according to the emotion state evaluation result and the sleep quality evaluation result; and determining to perform emotion early warning under the condition that the emotion monitoring result meets the preset condition.
Optionally, the processing module 1020 is configured to: carrying out weighted summation on the numerical value quantized by the emotional state evaluation result and the numerical value quantized by the sleep quality evaluation result to obtain the emotional score of the user in the first time period; and determining to perform emotion early warning under the condition that the emotion score is smaller than or equal to a preset threshold value.
Optionally, the processing module 1020 is configured to: determining the emotion age of the user according to the emotion monitoring result of the user in the first time period, the number of negative emotion and the age information of the user; and displaying at least one of the emotion age, the emotion monitoring result, or the emotion statement.
Optionally, the obtaining module 1010 is configured to: and acquiring the peaks and the troughs of the ACC signals. The processing module 1020 is configured to: determining whether the user is in a non-motion state according to the peaks and the troughs of the ACC signals; and under the condition that the user is in a non-motion state, combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user, and counting classification results of emotions of the user at a plurality of moments in a first time period.
In an alternative example, it will be appreciated by those skilled in the art that the apparatus 1000 may be embodied as a terminal device in the above embodiment, or the functions of the terminal device in the above embodiment may be integrated in the apparatus 1000. The above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above. The apparatus 1000 may be configured to perform the respective processes and/or steps corresponding to the terminal device in the above method embodiments.
It should be appreciated that the apparatus 1000 herein is embodied in the form of functional modules. The term module herein may refer to an application specific integrated circuit (application specific integrated circuit, ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an embodiment of the present application, the apparatus 1000 in fig. 10 may also be a chip or a chip system, for example: system on chip (SoC).
Fig. 11 shows a schematic block diagram of another emotion monitoring device 1100 provided by an embodiment of the present application. The apparatus 1100 includes a processor 1110, a transceiver 1120, and a memory 1130. Wherein the processor 1110, the transceiver 1120 and the memory 1130 are in communication with each other through an internal connection path, the memory 1130 is configured to store instructions, and the processor 1110 is configured to execute the instructions stored in the memory 1130 to control the transceiver 1120 to transmit signals and/or receive signals.
It should be understood that the apparatus 1100 may be specifically a terminal device in the foregoing embodiment, or the functions of the terminal device in the foregoing embodiment may be integrated in the apparatus 1100, and the apparatus 1100 may be configured to perform the steps and/or flows corresponding to the terminal device in the foregoing method embodiment. The memory 1130 may optionally include read-only memory and random access memory, and provide instructions and data to the processor. A portion of the memory may also include non-volatile random access memory. For example, the memory may also store information of the device type. The processor 1110 may be configured to execute instructions stored in the memory, and when the processor executes the instructions, the processor may perform various steps and/or flows corresponding to the electronic device in the above-described method embodiments.
It is to be appreciated that in embodiments of the application, the processor 1110 may be a central processing unit (central processing unit, CPU), which may also be other general purpose processors, digital Signal Processors (DSPs), application Specific Integrated Circuits (ASICs), field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor executes instructions in the memory to perform the steps of the method described above in conjunction with its hardware. To avoid repetition, a detailed description is not provided herein.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific implementation of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art may easily think about changes or substitutions within the technical scope of the embodiments of the present application, and all changes and substitutions are included in the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A mood monitoring method applied to a terminal device equipped with a pulse wave sensor, a myoelectric electrode sensor, and an acceleration sensor, the method comprising:
acquiring a photoelectric volume pulse wave tracing (PPG) signal of a user in real time through the pulse wave sensor, acquiring an epidermis myoelectricity (SEMG) signal of the user in real time through the myoelectricity electrode sensor, and acquiring an ACC signal of the user in real time through the acceleration sensor;
acquiring wave crests and wave troughs of the ACC signals;
determining whether the user is in a non-motion state according to the peaks and the troughs of the ACC signals;
under the condition that the user is in a non-motion state, performing feature extraction on a PPG signal, the SEMG signal, age information of the user and sex information of the user at a first moment in a plurality of moments to obtain a plurality of groups of biological features of the user at the first moment;
Calculating the weight of each group of biological characteristics in the plurality of groups of biological characteristics through a flexible maximum value (softmax) transfer function;
performing feature fusion on the multiple groups of biological features according to the biological features of each group and the weight of each group of biological features to obtain fusion features of the user at the first moment, wherein the feature fusion is self-adaptive feature fusion;
based on the fusion characteristics of the first moment, obtaining a classification result of the emotion of the user at the first moment, wherein the classification result comprises positive emotion, calm emotion or negative emotion;
acquiring the number of negative emotions of a user in a first time period and the duration of each negative emotion from classification results of the emotions at a plurality of moments;
determining an emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion, wherein the emotional state evaluation result of the user in the first time period is expressed in the form of an evaluation score, and the evaluation score is obtained through a formula in which T represents the duration of the first time period, n represents the number of negative emotions occurring in the first time period, and t_i represents the duration of the i-th negative emotion;
obtaining sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axes;
acquiring heart rate variation information of the user in the first time period based on the PPG signal;
obtaining the total sleeping time length, the deep sleeping time length and the shallow sleeping time length of the user in the first time period according to the heart rate variation information;
determining a sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep duration, the deep sleep duration and the shallow sleep duration;
carrying out weighted summation on the numerical value quantized by the emotional state evaluation result and the numerical value quantized by the sleep evaluation result to obtain the emotional score of the user in the first time period;
determining whether to perform emotion early warning according to the user's emotion score in the first time period, wherein if the user's emotion score in the first time period is low, positive multimedia resources are recommended to the user through a video APP, music APP or news APP on the terminal device;
determining the emotion age of the user according to the emotion monitoring result of the user in the first time period, the number of negative emotion and the age information of the user;
The method further comprises: when the user is in a motion state, treating the acquired PPG signal and SEMG signal as invalid signals, re-acquiring the PPG signal and the SEMG signal, and judging whether the user is in a non-motion state according to the ACC signal.
2. The method of claim 1, wherein the plurality of sets of biometric features comprise at least one of:
heart rate characteristics, respiration characteristics, blood oxygen characteristics, blood glucose characteristics, blood pressure characteristics, or epidermal myoelectricity characteristics.
3. The method according to claim 1 or 2, wherein determining whether to perform emotion pre-warning based on the emotion score of the user during the first period of time comprises:
and determining to perform emotion early warning under the condition that the emotion score meets a preset condition.
4. The method according to claim 1, wherein determining to perform emotion pre-warning if the emotion score satisfies a preset condition comprises:
and determining to perform emotion early warning under the condition that the emotion score is smaller than or equal to a preset threshold value.
5. The method according to claim 4, wherein the method further comprises:
and displaying at least one of the emotion age, the emotion monitoring result or the emotion statement.
6. An emotion monitoring device, comprising:
the acquisition module is used for acquiring the photoplethysmography PPG signal of the user in real time through the pulse wave sensor, acquiring the epidermis myoelectricity SEMG signal of the user in real time through the myoelectricity electrode sensor and acquiring the ACC signal of the user in real time through the acceleration sensor;
the processing module is used for combining the PPG signal, the SEMG signal, the age information of the user and the gender information of the user, and counting classification results of the emotion of the user at a plurality of moments in a first time period, wherein the classification results comprise positive emotion, calm emotion or negative emotion;
the processing module is further configured to: determining an emotional state evaluation result of the user in the first time period based on classification results of emotions of the user at a plurality of moments in the first time period;
the processing module is further configured to: determining a sleep quality evaluation result of the user in the first time period by combining the PPG signal, the ACC signal, the age information of the user and the gender information of the user;
the processing module is further configured to: determining whether to perform emotion early warning according to the emotion state evaluation result and the sleep quality evaluation result;
The processing module is used for:
performing feature extraction on the PPG signal, the SEMG signal, the age information of the user and the gender information of the user at a first moment in the plurality of moments to obtain a plurality of groups of biological features of the user at the first moment;
performing feature fusion on multiple groups of biological features at the first moment to obtain fusion features of the user at the first moment, wherein the feature fusion is self-adaptive feature fusion;
based on the fusion characteristics of the first moment, obtaining a classification result of the emotion of the user at the first moment;
the processing module is used for:
calculating the weight of each group of biological characteristics in the plurality of groups of biological characteristics through a flexible maximum value (softmax) transfer function;
obtaining fusion characteristics of the user at the first moment according to the biological characteristics of each group and the weight of each group of biological characteristics;
the processing module is used for:
obtaining sleep interruption times of the user in the first time period based on the components of the ACC signals on the coordinate axes;
the acquisition module is used for: acquiring heart rate variation information of the user in the first time period based on the PPG signal;
The processing module is further configured to: obtaining the total sleeping time length, the deep sleeping time length and the shallow sleeping time length of the user in the first time period according to the heart rate variation information;
the processing module is further configured to: determining a sleep evaluation result of the user in the first time period by combining the sleep interruption times, the total sleep duration, the deep sleep duration and the shallow sleep duration;
the acquisition module is used for:
acquiring wave crests and wave troughs of the ACC signals;
the processing module is used for: determining whether the user is in a non-motion state according to the peaks and the troughs of the ACC signals;
the processing module is further configured to: under the condition that the user is in a non-motion state, counting classification results of emotion of the user at a plurality of moments in a first time period by combining the PPG signal, the SEMG signal, age information of the user and gender information of the user;
the processing module is used for:
carrying out weighted summation on the numerical value quantized by the emotional state evaluation result and the numerical value quantized by the sleep quality evaluation result to obtain the emotional score of the user in the first time period;
Determining whether to perform emotion pre-warning according to the emotion score of the user in the first time period, wherein if the emotion score of the user in the first time period is lower, recommending positive multimedia resources to the user through video APP, music APP or news APP on terminal equipment;
the processing module is used for:
determining the emotion age of the user according to the emotion monitoring result of the user in the first time period, the number of negative emotion and the age information of the user;
the acquisition module is used for:
acquiring the number of negative emotions of the user in the first time period and the duration of each negative emotion from classification results of the emotions at the plurality of moments;
the processing module is used for: determining an emotional state evaluation result of the user in the first time period based on the number of negative emotions and the duration of each negative emotion, wherein the emotional state evaluation result of the user in the first time period is expressed in the form of an evaluation score, and the evaluation score is obtained through a formula in which T represents the duration of the first time period, n represents the number of negative emotions occurring in the first time period, and t_i represents the duration of the i-th negative emotion;
in the case that the user is in a motion state, the collected PPG signal and SEMG signal are invalid signals, and the acquisition module is further configured to: and re-acquiring the PPG signal and the SEMG signal, and judging whether the user is in a non-motion state according to the ACC signal.
7. The apparatus of claim 6, wherein the plurality of sets of biometric features comprise at least one of:
heart rate characteristics, respiration characteristics, blood oxygen characteristics, blood glucose characteristics, blood pressure characteristics, or epidermal myoelectricity characteristics.
8. The apparatus of claim 6 or 7, wherein the processing module is configured to:
and determining to perform emotion early warning under the condition that the emotion score meets a preset condition.
9. The apparatus of claim 8, wherein the processing module is further configured to:
and determining to perform emotion early warning under the condition that the emotion score is smaller than or equal to a preset threshold value.
10. The apparatus of claim 9, wherein the processing module is further configured to:
and displaying at least one of the emotion age, the emotion monitoring result or the emotion statement.
11. An emotion monitoring device, comprising: a processor coupled to a memory for storing a computer program, which when invoked by the processor, causes the apparatus to perform the method of any one of claims 1-5.
12. A computer readable storage medium storing a computer program comprising instructions for implementing the method of any one of claims 1-5.
13. A computer program product comprising computer program code embodied therein, which when run on a computer causes the computer to implement the method of any of claims 1-5.
CN202111510699.2A 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device Active CN115054248B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111510699.2A CN115054248B (en) 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device
PCT/CN2022/119258 WO2023103512A1 (en) 2021-12-10 2022-09-16 Emotion monitoring method and emotion monitoring apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111510699.2A CN115054248B (en) 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device

Publications (2)

Publication Number Publication Date
CN115054248A CN115054248A (en) 2022-09-16
CN115054248B true CN115054248B (en) 2023-10-20

Family

ID=83196907

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111510699.2A Active CN115054248B (en) 2021-12-10 2021-12-10 Emotion monitoring method and emotion monitoring device

Country Status (2)

Country Link
CN (1) CN115054248B (en)
WO (1) WO2023103512A1 (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016144284A1 (en) * 2015-03-06 2016-09-15 Елизавета Сергеевна ВОРОНКОВА Method for recognising the movement and psycho-emotional state of a person and device for carrying out said method
CN106419841A (en) * 2016-09-13 2017-02-22 深圳市迈迪加科技发展有限公司 Method, device and system for evaluating sleep
US9596997B1 (en) * 2015-09-14 2017-03-21 Whoop, Inc. Probability-based usage of multiple estimators of a physiological signal
WO2017193497A1 (en) * 2016-05-09 2017-11-16 包磊 Fusion model-based intellectualized health management server and system, and control method therefor
CN107874750A (en) * 2017-11-28 2018-04-06 华南理工大学 Pulse frequency variability and the psychological pressure monitoring method and device of sleep quality fusion
JP2018082730A (en) * 2016-11-15 2018-05-31 都築 北村 Biological risk acquisition device, biological risk acquisition method, biological risk acquisition program, and recording medium
JP2018166653A (en) * 2017-03-29 2018-11-01 アイシン精機株式会社 Mood determination device
CN109460752A (en) * 2019-01-10 2019-03-12 广东乐心医疗电子股份有限公司 Emotion analysis method and device, electronic equipment and storage medium
CN110876613A (en) * 2019-09-27 2020-03-13 深圳先进技术研究院 Human motion state identification method and system and electronic equipment
CN112880686A (en) * 2021-01-20 2021-06-01 湖南赫兹信息技术有限公司 Object motion monitoring and positioning method, device and storage medium
WO2021208902A1 (en) * 2020-04-15 2021-10-21 华为技术有限公司 Sleep report generation method and apparatus, terminal, and storage medium
WO2021213263A1 (en) * 2020-04-21 2021-10-28 深圳市万普拉斯科技有限公司 Call window control method and apparatus, mobile terminal, and readable storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9955902B2 (en) * 2015-01-29 2018-05-01 Affectomatics Ltd. Notifying a user about a cause of emotional imbalance
CN105147248B (en) * 2015-07-30 2019-02-05 华南理工大学 Depression assessment system and its appraisal procedure based on physiologic information
CN105231997A (en) * 2015-10-10 2016-01-13 沈阳熙康阿尔卑斯科技有限公司 Sleep quality judging method and sleep instrument
US20210161482A1 (en) * 2017-07-28 2021-06-03 Sony Corporation Information processing device, information processing method, and computer program
CN111145871A (en) * 2018-11-02 2020-05-12 京东方科技集团股份有限公司 Emotional intervention method, device and system, and computer-readable storage medium


Also Published As

Publication number Publication date
WO2023103512A1 (en) 2023-06-15
CN115054248A (en) 2022-09-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant