CN117694892A - Emotion health state analysis method and electronic equipment - Google Patents

Emotion health state analysis method and electronic equipment

Info

Publication number
CN117694892A
CN117694892A
Authority
CN
China
Prior art keywords
user
index
interface
electronic device
recommended
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211097691.2A
Other languages
Chinese (zh)
Inventor
圣荣
王润森
韩羽佳
蒋秋实
郭宗豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211097691.2A
Publication of CN117694892A
Legal status: Pending

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present application provides an emotional health state analysis method and an electronic device. The method includes: acquiring sensor data of a user during sleep over a predetermined period of time; obtaining heartbeat interval data of the user from the sensor data; and obtaining an emotional health index of the user from the heartbeat interval data, the emotional health index indicating the user's emotional health state. This realizes unobtrusive analysis of the emotional health state and provides the user with a digital emotion-assistance tool.

Description

Emotion health state analysis method and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an emotional health state analysis method and an electronic device.
Background
As the pace of life, study, and work grows ever faster, emotional health problems are commonly faced by users of all ages. A large body of scientific research has confirmed that poor emotional health readily causes mental health problems, mood disorders, and nervous system diseases in users, such as anxiety disorders and depression. Accordingly, how to assess one's emotional health state has become a concern for users.
Currently, a user is required to actively fill out a questionnaire for evaluating emotional health, such as the Depression, Anxiety and Stress Scales (DASS-21), and the user's emotional health state is derived from the filled-in content, for example whether the user's depression is mild, moderate, or severe.
However, this approach depends on the user filling out the questionnaire, and the user's emotional health state cannot be obtained automatically.
Disclosure of Invention
The emotional health state analysis method and electronic device provided in the present application can automatically analyze the user's emotional health state and improve the user experience.
In a first aspect, the present application provides a method for analyzing emotional health states, the method comprising:
acquiring sensor data of a user during sleep for a predetermined period of time;
obtaining heartbeat interval data of a user according to the sensor data;
and obtaining an emotion health index of the user according to the heartbeat interval data of the user, wherein the emotion health index of the user is used for indicating the emotion health state of the user.
According to the emotional health state analysis method provided in the first aspect, the electronic device acquires sensor data of the user during the user's sleep over a predetermined period of time, and physiological characteristic information of the user can be obtained from the sensor data. Heartbeat interval data of the user is obtained from the sensor data, and indexes related to the user's emotional health state can be derived from the heartbeat interval data. An emotional health index of the user is then obtained from the heartbeat interval data, the index indicating the user's emotional health state. Thus, based on this capability to evaluate the user's emotional health state, unobtrusive measurement and analysis of emotional health is realized, and a digital emotion-assistance tool is provided to the user, giving the user a good experience.
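The step from raw sensor data to heartbeat interval data can be sketched as follows. This is a minimal illustration assuming heartbeats have already been detected as timestamps (in seconds) from the sensor signal; the helper name and representation are hypothetical, not the patent's implementation.

```python
def rr_intervals_ms(beat_times_s):
    """Convert detected heartbeat timestamps (seconds) into
    heartbeat (RR) interval data in milliseconds."""
    return [round((b - a) * 1000.0) for a, b in zip(beat_times_s, beat_times_s[1:])]

# Beats detected at 0.0 s, 0.8 s, 1.62 s, 2.40 s during sleep
print(rr_intervals_ms([0.0, 0.8, 1.62, 2.40]))  # → [800, 820, 780]
```

The resulting interval series is the input from which the emotional health index is derived in the later steps.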
In one possible design, obtaining the emotional well-being index of the user according to the heartbeat interval data of the user includes:
obtaining a heart rate variability index of a target sleep period according to the heartbeat interval data of the user, the target sleep period being a deep sleep period and/or a rapid eye movement (REM) period that the user enters while sleeping during the predetermined period of time;
and obtaining the emotional health index of the user according to the heart rate variability index of the target sleep period.
Therefore, the electronic device can accurately obtain the user's emotional health index based on the heart rate variability index during the user's deep sleep and/or rapid eye movement periods.
In one possible design, the method further comprises:
obtaining a fragmentation degree index of the target sleep period according to the heartbeat interval data of the user;
wherein obtaining the emotional health index of the user according to the heart rate variability index of the target sleep period includes: obtaining the emotional health index of the user according to the heart rate variability index and the fragmentation degree index of the target sleep period.
Therefore, the electronic device can comprehensively obtain the user's emotional health index based on both the heart rate variability index and the fragmentation degree index during the user's deep sleep and/or rapid eye movement periods.
In one possible design, obtaining the emotional health index of the user from the heart rate variability index and the fragmentation degree index of the target sleep period includes:
inputting the heart rate variability index and the fragmentation degree index of the target sleep period into an emotional health state model, which outputs the emotional health index of the user; the emotional health state model indicates the association between the emotional health indexes corresponding to different emotional health states and the heart rate variability and fragmentation degree indexes.
Therefore, using the emotional health state model, the electronic device can accurately obtain the user's emotional health index based on the heart rate variability index and the fragmentation degree index during the user's deep sleep and/or rapid eye movement periods.
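As a sketch only: the patent does not disclose the form of the emotional health state model, so the weighted linear scoring below (the weights, bias, and 0-100 scale are all illustrative assumptions) merely shows how HRV and fragmentation indexes could map to a single index.

```python
def emotion_health_index(hrv_features, frag_index, hrv_weights, frag_weight, bias):
    """Hypothetical emotional health state model: a weighted sum of
    HRV features and a sleep-fragmentation index, clamped to 0-100.
    All weights here are made up for illustration, not from the patent."""
    score = bias + frag_weight * frag_index
    score += sum(w * x for w, x in zip(hrv_weights, hrv_features))
    return max(0.0, min(100.0, score))

# Example with made-up weights: two HRV features plus a fragmentation index
print(emotion_health_index([1.0, 2.0], 0.5, [10.0, 5.0], -20.0, 50.0))  # → 60.0
```

In practice such a model would be fitted against labeled emotional-health data; the linear form is only one possible choice.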
In one possible design, the heart rate variability index includes at least one of: low frequency power (LF), high frequency power (HF), very low frequency power (VLF), the normalized low frequency value (LFnu), the normalized high frequency value (HFnu), the normalized very low frequency value (VLFnu), the low-frequency to high-frequency ratio (LF/HF), the standard deviation of all sinus RR intervals (SDNN), the root mean square of successive sinus RR interval differences (rMSSD), the mean of the standard deviations of sinus RR intervals over each 5-minute period, the percentage of successive sinus RR interval differences greater than 50 milliseconds (pNN50), the mean heart rate (mean-HR), the triangular index, the transverse axis SD1, the longitudinal axis SD2, or the ratio of the transverse axis to the longitudinal axis (SD1/SD2).
This provides rich implementation options for the user's heart rate variability index, so that the electronic device can flexibly select one or more of these items to obtain the user's emotional health index.
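Several of the time-domain indexes listed above (SDNN, rMSSD, pNN50, mean-HR) can be computed directly from the RR-interval series. The sketch below uses the population standard deviation for SDNN, which is a common but not universal convention.

```python
import math

def hrv_metrics(rr_ms):
    """Time-domain HRV indexes from sinus RR intervals in milliseconds."""
    mean_rr = sum(rr_ms) / len(rr_ms)
    # SDNN: standard deviation of all RR intervals (population form).
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / len(rr_ms))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    # rMSSD: root mean square of successive RR-interval differences.
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    # pNN50: percentage of successive differences exceeding 50 ms.
    pnn50 = 100.0 * sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    # mean-HR in beats per minute.
    mean_hr = 60000.0 / mean_rr
    return {"SDNN": sdnn, "rMSSD": rmssd, "pNN50": pnn50, "mean_HR": mean_hr}
```

The frequency-domain indexes (LF, HF, VLF and their normalized values) would additionally require resampling the RR series and estimating its power spectrum, which is omitted here.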
In one possible design, the method further comprises:
recording the emotional health index of the user;
displaying the emotional health index of the user in a first interface.
Therefore, the electronic device can record the user's emotional health index once, per period, periodically, continuously, or in other modes, and display it, conveniently presenting the emotional health index to the user along the time dimension and thereby realizing unobtrusive recording and display of the emotional health state.
In one possible design, the method further comprises:
obtaining a recommended breathing duration and a recommended breathing frequency according to the emotional health index of the user;
and displaying a second interface, the second interface being used to prompt the user to perform breathing training of the recommended breathing duration at the recommended breathing frequency.
Therefore, the electronic device can recommend a personalized breathing training scheme for the user according to the user's emotional health index.
In one possible design, the method further comprises:
displaying a first control in the second interface;
in response to an operation on the first control, drawing a second curve overlaid on a first curve, the first curve being derived from the recommended breathing duration and the recommended breathing frequency, and the second curve from the user's real-time breathing frequency.
Thus, the electronic device intuitively displays the difference between the first curve and the second curve, guiding the user to maintain good breathing training.
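One way the first curve could be generated from the recommended breathing duration and frequency is as a sinusoid whose period matches one breath cycle. The patent does not specify the curve shape, so the sine form and sampling rate below are assumptions.

```python
import math

def breathing_guide_curve(duration_s, breaths_per_min, samples_per_s=10):
    """Sample a guide curve over the recommended duration: one full
    sine cycle per breath at the recommended frequency."""
    period_s = 60.0 / breaths_per_min  # seconds per breath cycle
    n = int(duration_s * samples_per_s)
    return [math.sin(2 * math.pi * (i / samples_per_s) / period_s) for i in range(n)]

# 60 s of guidance at 6 breaths/min: 600 samples, one cycle every 10 s
curve = breathing_guide_curve(60, 6)
```

The second curve would be sampled the same way from the user's measured real-time breathing frequency and then drawn over this one for comparison.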
In one possible design, the method further comprises:
when the real-time breathing frequency is higher than the recommended breathing frequency, displaying first information in the second interface, the first information being used to prompt the user to slow down their breathing;
or, when the real-time breathing frequency is lower than the recommended breathing frequency, displaying second information in the second interface, the second information being used to prompt the user to speed up their breathing.
Therefore, the electronic device can promptly remind the user when the user is breathing too fast or too slow, guiding the user smoothly through the breathing training.
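The two prompting branches can be sketched as a simple comparator. The tolerance band around the recommended frequency is an added assumption, to avoid prompting on tiny fluctuations; the patent only distinguishes the faster-than-recommended and slower-than-recommended cases.

```python
def breathing_prompt(real_time_bpm, recommended_bpm, tolerance_bpm=0.5):
    """Return the prompt text for the second interface, or None if the
    real-time breathing frequency is close enough to the recommendation."""
    if real_time_bpm > recommended_bpm + tolerance_bpm:
        return "slow down your breathing"   # the "first information"
    if real_time_bpm < recommended_bpm - tolerance_bpm:
        return "speed up your breathing"    # the "second information"
    return None  # within range: no prompt needed
```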
In one possible design, the method further comprises:
in response to an operation on the first control, recording breathing training data of the user;
after the user's breathing training ends, displaying a breathing training result of the user in the second interface according to the breathing training data, the breathing training result including: at least one of the total expiration time, total inspiration time, real-time breathing frequency, number of exhalations, or number of inhalations.
Therefore, after the user's training ends, the electronic device can display the user's breathing training result, making it convenient to inform the user.
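The breathing training result fields above can be aggregated from the recorded training data. Representing the recording as (phase, seconds) pairs is an assumed format for illustration only.

```python
def breath_training_summary(phases):
    """Aggregate recorded breathing phases into the displayed result.
    `phases` is a list of ("in" | "out", duration_in_seconds) pairs."""
    total_in = sum(t for p, t in phases if p == "in")
    total_out = sum(t for p, t in phases if p == "out")
    return {
        "total_inspiration_s": total_in,   # total inspiration time
        "total_expiration_s": total_out,   # total expiration time
        "inhalations": sum(1 for p, _ in phases if p == "in"),
        "exhalations": sum(1 for p, _ in phases if p == "out"),
    }
```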
In one possible design, the method further comprises:
adjusting the recommended breathing duration and/or the recommended breathing frequency, and updating the first curve, in response to an operation on a second control in the second interface or according to the user's breathing training result.
Therefore, before the user's training starts and/or after it ends, the electronic device can adjust the breathing training scheme, either automatically or based on the user's wishes, so that the adjusted scheme is closer to the user's actual training.
In one possible design, the method further comprises:
obtaining a recommended exercise type and a recommended exercise duration according to the emotional health index of the user;
and displaying a third interface, the third interface being used to prompt the user to perform exercise training of the recommended exercise duration using the recommended exercise type.
Therefore, the electronic device can recommend a personalized exercise training plan for the user according to the user's emotional health index.
In one possible design, the method further comprises:
displaying a third control in the third interface;
in response to an operation on the third control, recording exercise training data of the user;
after the user's exercise training ends, displaying the user's exercise performance in the third interface according to the exercise training data.
Therefore, after the user's exercise training ends, the electronic device can display the user's exercise performance, making it convenient to inform the user.
In one possible design, the method further comprises:
displaying a fourth control in the third interface;
adjusting the recommended exercise type and/or the recommended exercise duration in response to an operation on the fourth control.
Thus, before the user's exercise starts and/or after it ends, the electronic device can adjust the exercise training plan, either automatically or based on the user's wishes, so that the adjusted plan is closer to the user's actual training.
In a second aspect, the present application provides an emotional well-being analysis device, the device comprising: means for implementing the emotional well-being state analysis method in the first aspect and any one of the possible designs of the first aspect.
In a third aspect, the present application provides an electronic device, comprising: a memory and a processor; the memory is used for storing program instructions; the processor is configured to invoke the program instructions in the memory to cause the electronic device to perform the method of emotional well-being analysis of the first aspect and any of the possible designs of the first aspect.
In a fourth aspect, the present application provides a chip system for use in an electronic device comprising a memory, a display screen, and a sensor; the chip system includes: a processor; when the processor executes the computer instructions stored in the memory, the electronic device implements the emotional well-being state analysis method of the first aspect and any of the possible designs of the first aspect.
In a fifth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, causes an electronic device to implement the method for analyzing emotional health states in the first aspect and any of the possible designs of the first aspect.
In a sixth aspect, the present application provides a computer program product comprising: executing instructions stored in a readable storage medium, the executing instructions readable by at least one processor of the electronic device, the executing instructions executable by the at least one processor causing the electronic device to implement the method of analyzing an emotional health state in the first aspect and any of the possible designs of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 2 is a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 3 is a flow chart of an emotion health condition analysis method according to an embodiment of the present application;
fig. 4A-4C are schematic diagrams of a man-machine interaction interface according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a human-computer interaction interface according to an embodiment of the present application;
FIGS. 6A-6F are schematic diagrams illustrating a human-computer interaction interface according to an embodiment of the present application;
fig. 7A-7B are schematic diagrams of a man-machine interaction interface according to an embodiment of the present application.
Detailed Description
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" the following items or similar expressions means any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a alone, b alone, c alone, a combination of a and b, a combination of a and c, a combination of b and c, or a combination of a, b, and c, where a, b, and c may each be singular or plural. Furthermore, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Terms such as "center," "longitudinal," "transverse," "upper," "lower," "left," "right," "front," and "rear" refer to orientations or positional relationships based on those shown in the drawings, merely for convenience and simplicity of description; they do not indicate or imply that the devices or elements referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the present application.
The present application provides an emotional health state analysis method, an electronic device, a chip system, a computer-readable storage medium, and a computer program product. Heartbeat interval data of a user can be derived from collected sensor data, and an emotional health index of the user can be obtained from the heartbeat interval data, realizing unobtrusive measurement and analysis of the user's emotional health state and providing the user with a digital emotion-assistance tool.
The electronic device may be a wearable device, such as a watch, a bracelet, a head-mounted device (e.g., an augmented reality (augmented reality, AR) device, a Virtual Reality (VR) device, wearable glasses, headphones), etc.
In addition, the electronic device may also be in communication connection with the wearable device by means of bluetooth, mobile hotspot, other short-range communication, etc., where the electronic device may be a mobile phone, other wearable device, a tablet computer, a notebook computer, a vehicle-mounted device, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), an intelligent television, a smart screen, a high-definition television, a 4K television, an intelligent sound box, an intelligent projector, etc., and the specific type of the electronic device is not limited in this application.
The electronic device is taken as an example of a mobile phone, and referring to fig. 1, the electronic device related to the application is described below.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 1, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a user identification module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated herein does not constitute a specific limitation on the electronic device 100. In other embodiments, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may further be used to connect other devices, such as AR devices.
It should be understood that the connection between the modules illustrated in the present application is merely illustrative, and does not limit the structure of the electronic device 100. In other embodiments, the electronic device 100 may also employ different interfaces in the above embodiments, or a combination of interfaces.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
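The frequency-bin analysis mentioned above can be illustrated with a short sketch. This is purely illustrative of what "Fourier transform the frequency bin energy" means, not the device's actual DSP firmware; the direct DFT below would in practice be replaced by an FFT.

```python
import math

def dft_bin_energy(samples, k):
    """Energy of frequency bin k of a real signal, via a direct DFT (illustrative only)."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return re * re + im * im

# A pure tone at bin 4 concentrates its energy in that bin.
tone = [math.sin(2 * math.pi * 4 * i / 64) for i in range(64)]
energies = [dft_bin_energy(tone, k) for k in range(32)]
peak_bin = max(range(32), key=lambda k: energies[k])
```

Selecting the bin with the largest energy, as done for `peak_bin`, is one simple way a processor might "select a frequency bin."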
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (neural-network, NN) computing processor that can rapidly process input information by drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, and can also continuously self-learn. Applications such as intelligent cognition of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 answers a telephone call or a voice message, voice may be received by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. Pressure sensors 180A come in various types, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
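The threshold-based dispatch in the short-message example can be sketched as follows. The threshold value and the instruction names are hypothetical placeholders; the patent only specifies the comparison logic.

```python
# Hypothetical threshold in normalized pressure units; the real value is device-specific.
FIRST_PRESSURE_THRESHOLD = 0.5

def dispatch_sms_icon_touch(pressure):
    """Map the touch pressure on the SMS app icon to an operation instruction (sketch)."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"    # intensity below the first pressure threshold: view the message
    return "create_sms"      # intensity greater than or equal to the threshold: new message
```

Note that the boundary case (pressure exactly equal to the threshold) maps to "create", matching the "greater than or equal to" wording above.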
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
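One common way to convert a barometric reading into altitude is the international barometric formula; the patent does not say which model the device uses, so the sketch below is an assumption for illustration.

```python
def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Estimate altitude in meters from barometric pressure (hPa) using the
    international barometric formula, with sea-level pressure p0_hpa."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

For example, a reading of 1000 hPa against a standard sea-level reference yields an altitude of roughly 110 m, the kind of estimate that can assist positioning and navigation.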
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip opening according to the detected open or closed state of the holster or the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, in applications such as landscape/portrait switching and pedometers.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
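A minimal sketch of adaptive brightness, assuming a simple linear mapping from sensed lux to a brightness level; real devices typically use tuned, nonlinear curves, so every constant here is hypothetical.

```python
def adaptive_brightness(lux, min_level=10, max_level=255, max_lux=1000.0):
    """Map sensed ambient light (lux) to a display brightness level (sketch).
    Readings above max_lux are clamped to full brightness."""
    lux = max(0.0, min(lux, max_lux))
    return round(min_level + (max_level - min_level) * lux / max_lux)
```

A dark room keeps the panel dim (level 10 here), while direct sunlight drives it to the maximum level.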
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
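The three-threshold strategy above can be sketched as a simple policy function. The numeric thresholds are hypothetical; the patent only describes the relative ordering of the actions.

```python
# Hypothetical thresholds in degrees Celsius; actual values are device-specific.
HIGH_TEMP = 45.0       # above this: thermal protection
LOW_TEMP = 0.0         # below this: heat the battery
VERY_LOW_TEMP = -10.0  # below this: boost the battery output voltage

def thermal_policy(temp_c):
    """Return the action the device might take for a reported temperature (sketch)."""
    if temp_c > HIGH_TEMP:
        return "throttle_nearby_processor"  # reduce performance to cut power draw
    if temp_c < VERY_LOW_TEMP:
        return "boost_battery_output_voltage"
    if temp_c < LOW_TEMP:
        return "heat_battery"
    return "normal"
```

The very-low-temperature check must come before the low-temperature check, otherwise the boost branch would be unreachable.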
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the pulse of the human body to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vocal-part vibrating bone mass obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to achieve contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Taking an Android system with a layered architecture as an example, this application illustrates the software structure of the electronic device 100. The type of the operating system of the electronic device is not limited in this application; examples include an Android system, a Linux system, a Windows system, an iOS system, a HarmonyOS system (harmony operating system, HarmonyOS), and the like.
Fig. 2 is a software architecture block diagram of an electronic device according to an embodiment of the present application. As shown in fig. 2, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer (APP), an application framework layer (APP framework), an Android runtime (Android runtime) and system library (library), and a kernel layer (kernel).
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include Applications (APP) such as camera, gallery, calendar, call, map, navigation, WLAN, bluetooth, music, video, game, chat, shopping, travel, instant messaging (e.g., short message), smart home, device control, etc.
The intelligent home application can be used for controlling or managing home equipment with networking function. For example, home appliances may include electric lights, televisions, and air conditioners. For another example, the home appliances may also include a burglarproof door lock, a speaker, a floor sweeping robot, a socket, a body fat scale, a desk lamp, an air purifier, a refrigerator, a washing machine, a water heater, a microwave oven, an electric cooker, a curtain, a fan, a television, a set-top box, a door and window, and the like.
In addition, the application package may further include: desktop (i.e., home screen), negative one, control center, notification center, etc. applications.
The negative one screen, which may be referred to as the "-1 screen", refers to the user interface (user interface, UI) reached by sliding rightward on the main screen of the electronic device to the leftmost screen. For example, the negative one screen may be used to place shortcut service functions and notification messages, such as global search, shortcut entries to a page of an application (payment codes, WeChat, etc.), instant messaging and reminders (express information, expense information, commute road conditions, driving travel information, schedule information, etc.), and followed updates (football stand, basketball stand, stock information, etc.). The control center is the slide-up message notification bar of the electronic device, that is, the user interface displayed by the electronic device when the user starts a slide-up operation from the bottom of the electronic device. The notification center is the pull-down message notification bar of the electronic device, i.e., the user interface displayed by the electronic device when the user starts a downward operation from the top of the electronic device.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager (window manager) is used to manage window programs, such as managing window states, attributes, view (view) additions, deletions, and updates, window order, and message collection and processing. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, capture the screen, and so on. The window manager is also the entry point through which the outside world accesses windows.
The content provider is used to store and retrieve data and make such data accessible to the application. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
A resource manager (resource manager) provides various resources for applications such as localization strings, icons, pictures, layout files of user interfaces (layout xml), video files, fonts, colors, identification numbers (identity document, IDs) of user interface components (user interface module, UI components) (also known as serial numbers or account numbers), etc. And the resource manager is used for uniformly managing the resources.
The notification manager (notification manager) allows applications to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and managing the Android system.
The core library consists of two parts: one part is a function which needs to be called by java language, and the other part is a core library of the Android system.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: a surface manager (surface manager), media libraries (media library), a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), an image processing library, a desktop launcher (launcher), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The workflow of the software and hardware of the electronic device 100 is illustrated below in connection with a scenario in which sound is played using a smart speaker.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Take the touch operation as a click operation whose corresponding control is the control of a smart speaker icon as an example: the smart speaker application calls an interface of the application framework layer to start the smart speaker application, which in turn starts the audio driver by calling the kernel layer, and the speaker 170A converts the audio electrical signal into a sound signal.
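The raw-input-event flow described above can be sketched in a simplified form. The `RawInputEvent` fields match the information the patent names (coordinates and timestamp), but the hit-box table and lookup function are hypothetical stand-ins for the framework's actual view system.

```python
from dataclasses import dataclass

@dataclass
class RawInputEvent:
    """Raw input event as produced by the kernel layer: coordinates plus timestamp."""
    x: float
    y: float
    timestamp_ms: int

# Hypothetical icon hit-boxes (x0, y0, x1, y1); the real lookup is done by the
# application framework layer's view system.
CONTROLS = {"smart_speaker_icon": (0, 0, 100, 100)}

def identify_control(event):
    """Return the name of the control whose bounds contain the touch point (sketch)."""
    for name, (x0, y0, x1, y1) in CONTROLS.items():
        if x0 <= event.x <= x1 and y0 <= event.y <= y1:
            return name
    return None
```

Once the control is identified, the framework would dispatch the event to the corresponding application, which then drives the audio path as described above.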
It is to be understood that the structure illustrated herein does not constitute a specific limitation on the electronic device 100. In other embodiments, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Based on the foregoing description, the following embodiments of the present application take an electronic device having the structure shown in fig. 1 and fig. 2 as an example, and the method for analyzing an emotional health state provided in the present application will be described in detail with reference to the drawings and application scenarios.
For ease of illustration, this application is illustrated with an electronic device in communication with a wearable device.
Referring to fig. 3, fig. 3 is a flow chart illustrating a method for analyzing an emotional state of health according to an embodiment of the present application. As shown in fig. 3, the emotional health state analysis method of the present application may include:
S101, acquiring sensor data of a user during sleep for a predetermined period of time.
The user may wear the wearable device in a position such as a wrist, abdomen, leg, etc. Thus, during sleep of the user for a predetermined period of time, the wearable device may acquire in real time a photoplethysmography (PPG) signal of a PPG sensor in the wearable device.
Among them, PPG is a non-invasive detection method for detecting a change in blood volume in living tissue by means of photoelectric means. When a light beam of a certain wavelength is irradiated to the skin surface of the user, the light beam will be transmitted to the light sensor by means of transmission or reflection. During this process, the volume of blood within the blood vessel changes in a pulsatile manner under the action of systole and diastole. Therefore, at diastole, the peripheral vascular capacity of the heart decreases and the light intensity detected by the light sensor is large. When the heart contracts, the peripheral blood volume of the heart increases, and the light intensity detected by the light sensor is smaller, so that the light intensity detected by the light sensor is changed in a pulsation manner, and the light intensity change signal can be converted into an electric signal, namely a PPG signal.
Based on the above description, the PPG signal may characterize physiological characteristic information of the user such as blood pressure, blood oxygen, brain oxygen, blood glucose, pulse rate, blood oxygen saturation, heart rate, and respiration rate. Thus, the wearable device may send the PPG signal as sensor data to the electronic device. Alternatively, the wearable device may perform processing on the PPG signal, such as separation from the noise signal, filtering, etc., to obtain sensor data, and send the sensor data to the electronic device.
Thus, with the help of the wearable device, the electronic device can acquire the sensor data of the user during sleep over the predetermined period of time, and can obtain the physiological characteristic information of the user from that sensor data.
The specific implementation of the predetermined period of time is not limited in this application. In some embodiments, the predetermined period of time may be set to a period during which the user rests and sleeps, such as at night or during the day.
S102: obtain heartbeat interval data of the user from the sensor data.
Heartbeat interval data is also called RR interval (RRI) data. RRI data may be used to characterize indicators related to the emotional health state of the user. An RRI is the time (or duration) between two successive P-waves in the PPG signal. In the PPG signal, the P-wave is the main wave: the wave of maximum amplitude, caused by the maximum volume of blood ejected by the ventricle during systole. The time that elapses from one main peak to the next main peak is referred to as the RRI.
Typically, a normal RRI is between 0.6 seconds and 1.0 seconds. An RRI of less than 0.6 seconds indicates that the heartbeat is too fast; an RRI of more than 1.0 seconds indicates that the heartbeat is too slow. In addition, unequally spaced RRIs indicate arrhythmia.
Thus, by analyzing the physiological characteristic information of the user indicated by the sensor data, the electronic device can obtain the RRI data of the user.
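The derivation of RRI data can be sketched as follows. This is an illustrative sketch, not the patent's actual implementation: the peak timestamps are hypothetical, and the labeling simply applies the 0.6–1.0 s normal range described above.

```python
# Illustrative sketch (not the patent's implementation): derive RR intervals
# from the timestamps of successive main peaks in a PPG signal, and label
# each interval using the 0.6-1.0 s normal range described above.

def rr_intervals(peak_times_s):
    """RR intervals (seconds) between consecutive peak timestamps."""
    return [t2 - t1 for t1, t2 in zip(peak_times_s, peak_times_s[1:])]

def classify_rri(rri_s):
    """Label a single RR interval against the typical 0.6-1.0 s range."""
    if rri_s < 0.6:
        return "too fast"
    if rri_s > 1.0:
        return "too slow"
    return "normal"

peaks = [0.0, 0.8, 1.6, 2.5, 3.6]        # hypothetical main-peak times
rris = rr_intervals(peaks)               # approximately [0.8, 0.8, 0.9, 1.1]
labels = [classify_rri(r) for r in rris]
```

A sequence of such labeled intervals is what the later steps consume; unequal spacing across the sequence would correspond to the arrhythmia case mentioned above.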
S103: obtain an emotional health index of the user from the heartbeat interval data of the user, where the emotional health index is used to indicate the emotional health state of the user.
Based on the description of S102, the heartbeat interval data of the user can characterize indicators related to the user's emotional health state. Therefore, the electronic device can obtain the emotional health index of the user from the user's heartbeat interval data.
Wherein the emotional health index of the user is used to represent the emotional health state of the user. In general, the higher the emotional health index, the better the emotional health state.
Taking depression as an example, a high emotional health index indicates that the user's emotional health state may be mild depression; a moderate index indicates that the state may be moderate depression; and a low index indicates that the state may be major depression.
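The qualitative mapping above can be written as a simple banding function. The numeric cut-offs below are hypothetical placeholders chosen only for illustration; the patent does not specify threshold values.

```python
# Hypothetical sketch: map an emotional health index (assumed 0-100) to the
# qualitative depression bands described above. The 40/70 cut-offs are
# illustrative placeholders, not values from the patent.

def depression_band(ehi):
    if ehi >= 70:
        return "mild depression"      # high index, better state
    if ehi >= 40:
        return "moderate depression"  # moderate index
    return "major depression"         # low index, worse state
```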
In addition, the electronic device can record the emotional health index of the user once, over a period of time, periodically, or continuously, and organize it by dimensions such as day, week, month, or year. The electronic device may display the emotional health index of the user in a first interface. This makes it convenient to feed the user's daily emotional health state back to the user in a timely manner, and also helps track the trend of the user's emotional health state over a period of time.
The specific implementation of the first interface is not limited in this application. In some embodiments, the first interface may be at least one of a user interface of an application installed in the electronic device, a web page displayed in the electronic device, a page of an official account displayed in the electronic device, or the like.
For example, the first interface may be a user interface of an innovative research application in the electronic device.
For example, take the electronic device as a mobile phone used together with a watch: after the user wears the watch while sleeping for one night, the user can check their emotional health index in the mobile phone application the next day, which makes it convenient for the user to learn their own emotional health state.
After the user wears the watch while sleeping for several consecutive nights, the user can also check their emotional health index over that period, and how it has changed, in the mobile phone application, which makes it convenient to understand their emotional health state and its trend.
According to the emotional health state analysis method of the present application, the physiological characteristic information of the user can be obtained from the sensor data acquired during the user's sleep over the predetermined period of time. The heartbeat interval data of the user is obtained from the sensor data, and indicators related to the user's emotional health state are derived from that heartbeat interval data. The emotional health index of the user, which indicates the user's emotional health state, is then obtained from the heartbeat interval data. In this way, based on this capability to evaluate the user's emotional health state, contactless measurement and analysis of the emotional health state is realized, providing the user with a digital emotion-assistance tool and a good user experience.
In one possible implementation of S103, a cohort study has found that there are significant differences in the heart rate variability (HRV) indicators of a user during the deep sleep and rapid eye movement (REM) periods under different emotional health states.
Accordingly, the electronic device may analyze the emotional health index of the user based on an association between HRV indicators of one or more sleep stages (e.g., deep sleep periods and/or REM periods) and the emotional health state of the user.
In addition, the predetermined period of time may be regarded as one sleep cycle. The sleep cycle may include a plurality of sleep stages.
Wherein deep sleep session refers to a sleep stage that eliminates fatigue.
In other words, during this stage of sleep, the user's brain and body can be thoroughly rested and relaxed, and the user's energy and physical strength can be thoroughly recovered.
The REM period is also called paradoxical sleep and fast-wave sleep. REM is the sleep stage in which dreaming occurs, and it is the shallowest of all sleep stages.
In other words, during this sleep stage the user is not in deep sleep, and the user's eyes may move rapidly and involuntarily. The activity of the user's brain neurons resembles that of the awake state, and a user who wakes after a REM period feels alert and energetic.
HRV can be derived from an electrocardiogram (ECG) examination. HRV refers to the variation in successive beat-to-beat cycles, that is, the variation in the rate of the heartbeat. HRV is determined by the durations of two adjacent RRIs, i.e., the small difference from one cardiac cycle to the next.
The specific implementation of the HRV indicator is not limited in this application. For example, HRV indicators may include, but are not limited to: low frequency power (LF), high frequency power (HF), very low frequency power (VLF), the normalized low frequency value LFnu, the normalized high frequency value HFnu, the normalized very low frequency value VLFnu, the low-frequency to high-frequency ratio LF/HF, the standard deviation of all sinus (normal-to-normal) RR intervals (SDNN), the root mean square of differences between adjacent sinus RR intervals (rMSSD), the mean of the standard deviations of the sinus RR intervals in each 5-minute segment (SDNN index), the percentage of adjacent sinus RR interval differences greater than 50 milliseconds (pNN50), the mean heart rate (mean HR), the HRV triangular index, the Poincaré-plot transverse axis SD1, the longitudinal axis SD2, or the ratio SD1/SD2 of the transverse axis to the longitudinal axis.
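A few of the time-domain indicators in this list (SDNN, rMSSD, pNN50, mean HR) can be computed directly from a sequence of RR intervals, for example as below. This is a sketch: the population standard deviation is used here, though some toolkits use the sample form.

```python
import statistics

def hrv_time_domain(rri_ms):
    """SDNN, rMSSD, pNN50, and mean HR from RR intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rri_ms, rri_ms[1:])]
    sdnn = statistics.pstdev(rri_ms)                  # population std dev
    rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
    pnn50 = 100.0 * sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    mean_hr = 60000.0 / statistics.mean(rri_ms)       # beats per minute
    return {"SDNN": sdnn, "rMSSD": rmssd, "pNN50": pnn50, "mean_HR": mean_hr}

metrics = hrv_time_domain([800, 810, 790, 850, 795, 805])  # hypothetical RRIs
```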
In analyzing HRV indicators with a Poincaré plot, for a continuous sequence of heartbeat intervals, taking the i-th heartbeat interval as the abscissa and the (i+1)-th heartbeat interval as the ordinate (i being a positive integer), a figure can be drawn in the two-dimensional plane, and this figure can be regarded as an ellipse. The center of the ellipse is located at the coordinate point (mean heartbeat interval, mean heartbeat interval); SD1 is the semi-minor axis of the ellipse (the dispersion perpendicular to the line of identity), and SD2 is the semi-major axis (the dispersion along it).
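SD1 and SD2 can be computed without explicitly fitting an ellipse, by rotating the (RRI_i, RRI_{i+1}) point cloud 45 degrees and taking the dispersion along each rotated axis. The sketch below uses this common shortcut with population standard deviations; it is illustrative, not the patent's implementation.

```python
import math
import statistics

def poincare_sd1_sd2(rri):
    """SD1 (spread perpendicular to the line of identity) and SD2 (spread
    along it), via a 45-degree rotation of the (RRI_i, RRI_{i+1}) points."""
    x, y = rri[:-1], rri[1:]
    perp = [(yi - xi) / math.sqrt(2) for xi, yi in zip(x, y)]   # SD1 axis
    along = [(yi + xi) / math.sqrt(2) for xi, yi in zip(x, y)]  # SD2 axis
    return statistics.pstdev(perp), statistics.pstdev(along)
```

A perfectly regular heartbeat collapses the ellipse to a point (SD1 = SD2 = 0), while strictly alternating intervals stretch it perpendicular to the line of identity (SD1 > 0, SD2 = 0).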
Based on the above description, the electronic device may obtain, from the RRI data of the user (which characterizes indicators related to the user's emotional health state), the HRV indicators of a target sleep period, that is, the HRV indicators of the deep sleep period and/or the HRV indicators of the REM period.
The target sleep period may include one or more sleep stages in the sleep cycle. The target sleep period is the deep sleep period, the REM period, or both, entered by the user during sleep over the predetermined period of time.
Therefore, the electronic device can obtain the emotional health index of the user from the HRV indicators of the deep sleep period and/or the HRV indicators of the REM period.
In addition to the association between the HRV indicators of one or more sleep stages and the user's emotional health state, the electronic device may further take into account the different degrees of sleep fragmentation caused by dreaming under different emotional health states, and comprehensively obtain the emotional health index of the user based on the association between the HRV indicators and fragmentation degree indices of one or more sleep stages and the user's emotional health state.
The fragmentation degree index of a sleep stage represents the degree of sleep fragmentation of that sleep stage during the user's sleep over the predetermined period of time. The degree of sleep fragmentation characterizes the persistence (or continuity) of the sleep stage. In general, the lower the fragmentation degree index, the better the emotional health state.
Taking depression as an example, a small fragmentation degree index of the deep sleep stage means a low degree of fragmentation in the deep sleep stage; this indicates that the user's deep sleep is not severely fragmented and the persistence of the deep sleep stage is high, so the user's emotional health state may be mild depression.
A moderate fragmentation degree index of the deep sleep stage means a moderate degree of fragmentation and moderate persistence of the deep sleep stage, indicating that the user's emotional health state may be moderate depression.
A large fragmentation degree index of the deep sleep stage means a high degree of fragmentation; this indicates that the user's deep sleep is severely fragmented and the persistence of the deep sleep stage is poor, so the user's emotional health state may be major depression.
Based on the above description, the electronic device may further obtain, from the heartbeat interval data of the user, the fragmentation degree index of the target sleep period, that is, the fragmentation degree index of the deep sleep period and/or the fragmentation degree index of the REM period.
Therefore, the electronic device can comprehensively obtain the emotional health index of the user from the HRV indicator and fragmentation degree index of the deep sleep stage and/or the HRV indicator and fragmentation degree index of the REM stage.
Based on the above description, the electronic device can analyze the emotional health index of the user from the HRV indicator and fragmentation degree index of the deep sleep period and/or those of the REM period, in combination with an emotional health state model obtained from expert experience.
The emotional health state model is used for indicating the incidence relation between the emotional health indexes corresponding to different emotional health states, the HRV indexes and the fragmentation degree indexes.
In one possible implementation of the emotional health state model, assume the indicators input to the model include: index 1, index 2, index 3, index 4, index 5, and index 6.
Index 1 is the low frequency power LF of the deep sleep stage; index 2 is the LF of the REM stage; index 3 is the rMSSD (root mean square of adjacent sinus RR interval differences) of the deep sleep stage; index 4 is the rMSSD of the REM stage; index 5 is the fragmentation degree index of the deep sleep stage; and index 6 is the fragmentation degree index of the REM stage.
Then, the emotional health index of the user output by the emotional health state model may be equal to a1×index 1 + a2×index 2 + a3×index 3 + a4×index 4 + a5×index 5 + a6×index 6.
Here a1, a2, a3, a4, a5, and a6 are the weights of the corresponding indices; each is a real number between 0 and 1 inclusive, and a1+a2+a3+a4+a5+a6=1.
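The weighted sum above can be sketched directly. The weight values and index values below are hypothetical placeholders: the patent leaves them to expert experience, so this only demonstrates the form of the model.

```python
# Sketch of the weighted-sum emotional health state model above. The weights
# a1..a6 and the index values are hypothetical placeholders; in the patent
# they come from expert experience.

def emotional_health_index(indices, weights):
    assert len(indices) == len(weights) == 6
    assert abs(sum(weights) - 1.0) < 1e-9   # a1 + ... + a6 must equal 1
    return sum(a, x_w := None) if False else sum(a * x for a, x in zip(weights, indices))

weights = [0.2, 0.2, 0.15, 0.15, 0.15, 0.15]  # a1..a6
indices = [70, 65, 80, 75, 60, 55]            # index 1..index 6 (normalized)
score = emotional_health_index(indices, weights)  # approximately 67.5
```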
In some embodiments, an emotional state of health model may be stored in the electronic device.
Thus, the electronic device may input the HRV index and the fragmentation degree index of the deep sleep period and/or the HRV index and the fragmentation degree index of the REM period into the emotional health state model, and output the emotional health index of the user.
In other embodiments, the electronic device may invoke the emotional health state model stored in the other device.
Thus, the electronic device may send the HRV and fragmentation degree indicators of the deep sleep period and/or the REM period to the other device. The other device may input these indicators into the emotional health state model, obtain the emotional health index of the user as the output, and send the emotional health index back to the electronic device.
The other device may be another electronic device, or a device such as a server. The server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN), big data, and artificial intelligence platforms.
In addition, the electronic device can record the emotional health index of the user once, over a period of time, periodically, or continuously, and can display the emotional health index to the user by time dimension, realizing contactless recording and display of the emotional health state.
In some embodiments, the electronic device may determine the emotional health index of the user based on the value of the low frequency LF for the target sleep session and the fragmentation index for the target sleep session.
Therefore, the electronic device can display the emotional health index of the user in the first interface by time dimension, and explain to the user the value of the low frequency LF of the target sleep period and the change in the degree of sleep fragmentation of the target sleep period, so that the user can understand the severity of, and reasons for, their emotional health state.
Taking depression as an example, a small value of the low frequency LF of the target sleep period, together with a small fragmentation degree index (i.e., a low degree of sleep fragmentation), indicates that the user's emotional health state may be mild depression.
A moderate LF value and a moderate degree of sleep fragmentation of the target sleep period indicate that the user's emotional health state may be moderate depression.
A large LF value and a large fragmentation degree index (i.e., a high degree of sleep fragmentation) of the target sleep period indicate that the user's emotional health state may be major depression.
Next, with reference to fig. 4A-4C, how the electronic device may display the emotional health index of the user in the first interface by time dimension is illustrated.
Referring to fig. 4A-4C, fig. 4A-4C are schematic diagrams illustrating a man-machine interface according to an embodiment of the present application. For convenience of explanation, in fig. 4A to 4C, an electronic device is illustrated as a mobile phone.
The electronic device may display a user interface 10 as illustratively shown in any one of fig. 4A-4C.
The user interface 10 is the first interface mentioned in this application; it is used to display the emotional health index and emotional health state of the user by time dimension.
In fig. 4A and 4B, the user interface 10 displays the user's daily emotional health index, the user's emotional health state and the reasons for it, the proportions of the different emotional health states within the week, and so on. The reasons may be represented by parameters such as the value of the low frequency LF of the target sleep period and the fragmentation degree index of the target sleep period.
In fig. 4C, the user interface 10 displays the number of days of each emotional health state of the user in each month of a year, the proportions of the different emotional health states, and so on.
In fig. 4A, the user interface 10 may include a control 101, which is used to display that the user's emotional health index has fallen and the reason why it has fallen. The reason may include a rise in the value of the low frequency LF of the deep sleep period and an increase in the degree of sleep fragmentation of the deep sleep period.
In fig. 4B, the user interface 10 may include a control 102, which is used to display that the user's emotional health index has risen markedly and the reason why it has risen. The reason may include a fall in the value of the low frequency LF of the deep sleep period and a decrease in the degree of sleep fragmentation of the deep sleep period.
It can be seen that the electronic device may present the user's emotional health state to the user, and may also explain to the user why the user's emotional health index changed.
It should be understood that the present application includes, but is not limited to, implementations described above that present the user's emotional health status to the user.
Based on the above description, the electronic device may also recommend a personalized breathing training scheme to the user according to the user's emotional health index. In some embodiments, the electronic device may obtain a recommended breathing duration and a recommended breathing frequency based on the emotional health index of the user.
The recommended breath duration may be 1 minute, 2 minutes, 3 minutes, 5 minutes, 10 minutes, etc. The recommended breathing rate may be 6 times per minute, 8 times per minute, 10 times per minute, etc.
Thus, the electronic device may provide respiratory training advice for users of different emotional health states for different durations and/or at different frequencies.
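One minimal way to realize such a recommendation is a threshold mapping. The cut-offs and the specific duration/frequency pairs below are hypothetical placeholders, drawn only from the example values listed above; the patent does not specify the mapping.

```python
# Hypothetical sketch: pick a recommended breathing duration and frequency
# from the emotional health index. All thresholds and values here are
# illustrative placeholders taken from the example ranges above.

def recommend_breathing(ehi):
    if ehi < 40:                                    # worse state: longer, slower
        return {"duration_min": 10, "rate_per_min": 6}
    if ehi < 70:
        return {"duration_min": 5, "rate_per_min": 8}
    return {"duration_min": 2, "rate_per_min": 10}  # better state: shorter
```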
Before the user's breathing training starts, the electronic device can display the breathing training scheme and remind the user to perform appropriate breathing training.
During the user's breathing training, the electronic device, using the user's breathing training data from the actual session, can interactively guide the user to breathe properly, and can collect and record the user's breathing training indicators.
After the user's breathing training is completed, the electronic device may also adaptively adjust the user's breathing training scheme so that the user can subsequently perform appropriate breathing training.
Next, a specific implementation process of the electronic device to recommend the respiratory training scheme to the user will be described in detail with reference to fig. 5 and fig. 6A to 6F.
Referring to fig. 5 and fig. 6A to fig. 6F, fig. 5 and fig. 6A to fig. 6F show schematic diagrams of a man-machine interaction interface according to an embodiment of the present application. For convenience of explanation, fig. 5 and fig. 6A to 6F illustrate an electronic device as an example of a mobile phone.
The electronic device may display a user interface 20 as exemplarily shown in fig. 5, the user interface 20 being for recommending a respiratory training regimen.
In fig. 5, the user interface 20 may include: control 201, control 201 is used to trigger a breath training protocol.
Upon receiving an operation on control 201, the electronic device may change from displaying user interface 20 as exemplarily shown in fig. 5 to displaying second interface 31 as exemplarily shown in fig. 6A, second interface 31 for prompting the user to perform a respiratory training of the recommended respiratory duration at the recommended respiratory rate.
The specific implementation of the second interface 31 is not limited in this application. In some embodiments, the second interface 31 may be at least one of a user interface of an application installed in the electronic device, a web page displayed in the electronic device, a page of an official account displayed in the electronic device, or the like.
For example, the second interface 31 may be a user interface of an innovative research application or a sports health application in the electronic device.
In fig. 6A, the second interface 31 may include: first control 301 and hint information 302. First control 301 is used to initiate respiratory training. The prompt message 302 is used to prompt the recommended respiratory rate and the recommended respiratory duration.
Parameters such as the size, shape, location, and content of the first control 301 and the prompt information 302 are not limited in this application. Operations on the first control 301 may include, but are not limited to: a single click, a double click, a long press, etc. In addition, the first control may consist of one or more controls.
Upon receiving an operation on first control 301, the electronic device may determine that the user began a respiratory workout. Thus, the electronic device may change from displaying the second interface 31 illustrated in the example of fig. 6A to displaying the second interface 31 illustrated in the example of fig. 6B or 6C.
In fig. 6B, the electronic device may draw the second curve b1 superimposed over the first curve a; in fig. 6C, it may draw the second curve b2 superimposed over the first curve a. In this way, the electronic device can intuitively display the difference between the first curve a and the second curve (b1 or b2) and guide the user to maintain good breathing training.
Wherein the first curve a may be used to present a recommended respiratory training regimen to the user. The first curve a is derived from the recommended breath duration and the recommended breath frequency.
Wherein the second curve (b 1 or b 2) may be used to show the user the real-time breathing frequency of the user. The second curve (b 1 or b 2) is derived from the real-time breathing frequency of the user.
In addition, the forms of the first curve a and the second curve (b1 or b2) may include, but are not limited to: sinusoidal curves, broken lines, histograms, etc.
For ease of distinction, in fig. 6B and 6C the second curve (b1 or b2) is shown with a solid line, and the first curve a is shown with a broken line.
In fig. 6B, when the user breathes too fast, the second curve b1 shows a real-time breathing frequency faster than the recommended breathing frequency indicated by the first curve a. When the real-time breathing frequency is greater than the recommended breathing frequency, the electronic device may display first information 305 in the second interface 31.
The first information 305 is used to prompt the user to slow down their breathing. Parameters such as the content, manner, location, and duration of the first information 305 are not limited in this application. In some embodiments, the first information 305 may be presented as at least one of text, voice, image, physical vibration, and the like.
In fig. 6C, when the user breathes too slowly, the second curve b2 shows a real-time breathing frequency slower than the recommended breathing frequency indicated by the first curve a. When the real-time breathing frequency is less than the recommended breathing frequency, the electronic device may display second information 306 in the second interface 31.
The second information 306 is used to prompt the user to speed up their breathing. Parameters such as the content, manner, location, and duration of the second information 306 are not limited in this application. In some embodiments, the second information 306 may be presented as at least one of text, voice, image, physical vibration, and the like.
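The prompting logic of figs. 6B and 6C reduces to a comparison between the real-time and recommended breathing frequencies. The sketch below illustrates this; the tolerance band is an assumption added so that tiny deviations do not trigger prompts, and is not specified in the patent.

```python
def breathing_guidance(real_time_rate, recommended_rate, tolerance=0.5):
    """Return the prompt to show, mirroring the first information 305
    ('slow down') and second information 306 ('speed up') described above.
    The tolerance band is an assumption, not from the patent."""
    if real_time_rate > recommended_rate + tolerance:
        return "slow down your breathing"   # cf. first information 305
    if real_time_rate < recommended_rate - tolerance:
        return "speed up your breathing"    # cf. second information 306
    return None                             # on pace: no prompt needed
```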
Additionally, upon receiving an operation on first control 301, the electronic device may also record respiration training data for the user. The user's respiratory training data may include, among other things, data generated by the user during the actual respiratory training process.
In some embodiments, sensors in the wearable device, such as a gyroscope, an accelerometer (ACC), and a PPG sensor, may collect the user's breathing training data in real time. The wearable device may send the user's breathing training data to the electronic device, so that the electronic device can obtain the data and draw the second curve (b1 or b2).
It should be appreciated that, in addition to the implementation described above, when the electronic device is itself a wearable device, it may collect the user's breathing training data in real time directly.
After the user's breath training is completed, the electronic device may display a second interface 31 as exemplarily shown in fig. 6D, where the second interface 31 is used to remind the user that the breath training has been completed and to feed back the result of the user's breath training.
In fig. 6D, a control 307 may be included in the second interface 31, the control 307 being used to trigger entry into displaying the user's respiratory training results.
The user's breathing training result is obtained from the user's breathing training data. The breathing training result may include, but is not limited to: total exhalation time (e.g., how many minutes the user exhaled), total inhalation time (e.g., how many minutes the user inhaled), real-time breathing frequency, the number of exhalations, the number of inhalations, an overall breathing rhythm of 1:x, or whether the breathing was within the optimal breathing frequency range. Parameters such as the content, manner, location, and duration of the breathing training result are not limited in this application. In some embodiments, the breathing training result may be presented as at least one of text, voice, image, and the like.
Upon receiving an operation on control 307, the electronic device can change from displaying second interface 31 as exemplarily shown in fig. 6D to displaying second interface 31 as exemplarily shown in fig. 6E.
In fig. 6E, a control 308 may be included in the second interface 31, the control 308 being configured to display the breathing training results of the user.
It should be appreciated that after the user's respiratory training is completed, the electronic device may also display a second interface 31 as exemplarily shown in fig. 6E. Thus, the breathing training results of the user are intuitively presented to the user.
In addition, after the user's breathing training is finished, the electronic device may further adjust the recommended breathing duration and/or the recommended breathing frequency according to the user's breathing training data, and update the first curve a mentioned in this application.
And/or, after receiving the operation on the second control in the second interface, the electronic device may further adjust the recommended breath duration and/or the recommended breath frequency, and update the first curve a mentioned in the application.
The second control is used to trigger an entry for adjusting the recommended breathing duration and/or the recommended breathing frequency. Parameters such as the size, shape, and position of the second control are not limited in this application. Operations on the second control may include, but are not limited to: a single click, a double click, a long press, etc. In addition, the second control may consist of one or more controls.
For example, in fig. 6A before the user breath training is started and/or in fig. 6E after the user breath training is finished, the second interface 31 may further include: control 303, control 303 is used to provide a portal for a user to adjust the recommended breath duration and/or the recommended breath frequency.
In addition, in fig. 6A, the second interface 31 may further include: control 304, control 304 is used to adjust the recommended breath duration, control 304 may also be considered a second control referred to herein.
Taking fig. 6E as an example after the end of the user's respiratory training, upon receiving the operation on control 303 shown in fig. 6E, the electronic device may change from displaying second interface 31 shown in fig. 6E by way of example to displaying second interface 31 shown in fig. 6F by way of example.
In fig. 6F, the second interface 31 may include: control 309 and control 310, control 309 being for adjusting the recommended breath frequency, control 310 being for adjusting the recommended breath duration.
It can be seen that controls 303 and 309 can be considered as second controls referred to herein that can adjust the recommended breath frequency and/or controls 303 and 310 can be considered as second controls referred to herein that can adjust the recommended breath duration. Thus, the user is enabled to autonomously select the duration and/or frequency of the respiratory training.
In addition, in fig. 6F, the second interface 31 may further include: control 311, control 311 is used to continue to begin respiratory training.
In summary, the electronic device may provide a personalized breath training scheme for the user according to the emotional health index of the user, and may further provide a complete set of user interface interaction logic for the user based on the personalized breath training scheme. Therefore, the breathing training behaviors of the user can be standardized, the breathing training quality of the user can be guaranteed, and the user can efficiently perform breathing training.
Based on the above description, the electronic device may also recommend a personalized athletic training program to the user based on the user's emotional well-being index.
In some embodiments, the electronic device may obtain the recommended exercise type and the recommended exercise duration from the user's emotional health index by using measures such as calorie consumption and accumulated high-intensity heart-rate duration.
In addition, when using calorie consumption, accumulated high-intensity heart-rate duration, and similar measures, the electronic device may also combine the user's basic information (user-authorized information such as gender, age, province, city, height, and weight) to obtain the recommended exercise type and the recommended exercise duration from the user's emotional health index.
Recommended exercise types may include, but are not limited to: walking, cycling, brisk walking, swimming, etc. The recommended exercise duration is the goal of the exercise training plan, and may be, for example, 6 minutes, 20 minutes, or 60 minutes.
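As a purely illustrative sketch of how such a mapping could work, the following maps an emotional health index and basic information to a plan. The index scale, thresholds, MET values, and function names below are assumptions, not the method claimed in this application:

```python
def recommend_exercise(emotion_index: float, weight_kg: float) -> dict:
    """Map an emotional health index (assumed 0-100, higher = healthier)
    to a recommended exercise type and duration. Illustrative only."""
    if emotion_index >= 80:
        exercise_type, duration_min = "swimming", 60
    elif emotion_index >= 60:
        exercise_type, duration_min = "brisk walking", 20
    else:
        # A low index suggests a gentle, short session.
        exercise_type, duration_min = "walking", 6
    # Rough calorie target scaled by body weight (MET-style estimate).
    met = {"walking": 3.0, "brisk walking": 4.5, "swimming": 7.0}[exercise_type]
    calorie_target = met * weight_kg * (duration_min / 60.0)
    return {"type": exercise_type, "duration_min": duration_min,
            "calorie_target": round(calorie_target, 1)}
```

In practice the mapping could also incorporate age, gender, and the other authorized basic information mentioned above.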
Thus, the electronic device can provide matching exercise training advice for users in different emotional health states.
Before the user's exercise training begins, the electronic device may display the exercise training plan, which can remind the user to perform appropriate exercise training.
During the user's exercise training, the electronic device combines the exercise training data actually generated by the user, so that the user's exercise execution can be adaptively matched into the exercise training plan. Even if the recommended exercise type selected by the user or recommended by the electronic device differs from the exercise type the user actually performs, the exercise training plan can still be matched automatically, avoiding poor plan execution caused by a poor emotional health state.
After the user's exercise training ends, the electronic device may also display the completion progress of the exercise training plan and/or the user's exercise execution, guiding the user to continue exercise training.
The following describes in detail, with reference to fig. 5 and figs. 7A-7B, a specific implementation process in which the electronic device recommends an exercise training plan to the user.
Referring to figs. 7A-7B, figs. 7A-7B are schematic diagrams of human-machine interfaces according to an embodiment of this application. For ease of explanation, figs. 7A-7B take a mobile phone as an example of the electronic device.
In fig. 5, the user interface 20 may further include: control 202, which is used to trigger the exercise training plan.
Upon receiving an operation on control 202, the electronic device can switch from displaying the user interface 20 exemplarily shown in fig. 5 to displaying the third interface 41 exemplarily shown in fig. 7A; the third interface 41 is used to prompt the user to perform exercise training of the recommended exercise duration using the recommended exercise type.
The specific implementation of the third interface 41 is not limited herein. For example, the third interface 41 may be at least one of: a user interface of an application installed on the electronic device, a web page displayed on the electronic device, or a page of an official account displayed on the electronic device.
In fig. 7A, the third interface 41 may include a third control 401, which is used to start exercise training and also to trigger an entry for recording the user's exercise training data.
Parameters such as the size, shape, and position of the third control 401 are not limited in this application. Operations on the third control 401 may include, but are not limited to: single tap, double tap, long press, etc. In addition, the third control may be a single control or a combination of multiple controls.
Upon receiving the operation on the third control 401, the electronic device can determine that the user has started exercise training and may therefore record the user's exercise training data. The exercise training data may include data generated by the user during the actual exercise training process.
In some embodiments, sensors in the wearable device, such as the gyroscope, accelerometer (ACC), and photoplethysmography (PPG) sensor, may collect the user's exercise training data in real time, or the user's basic information together with the exercise training data. The wearable device may send this data to the electronic device, so that the electronic device obtains the user's exercise training data, or the user's basic information and exercise training data.
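A minimal sketch of how a batch of wearable sensor samples might be serialized for transfer to the phone follows. The field layout and the JSON framing are assumptions for illustration only; a real wearable link would typically use a compact binary protocol:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class MotionSample:
    timestamp: float   # seconds since session start
    acc: tuple         # accelerometer (x, y, z), m/s^2
    gyro: tuple        # gyroscope (x, y, z), rad/s
    ppg: int           # raw PPG reading

def pack_samples(samples) -> bytes:
    """Serialize a batch of sensor samples for transmission to the phone."""
    return json.dumps([asdict(s) for s in samples]).encode("utf-8")

def unpack_samples(payload: bytes):
    """Rebuild samples on the receiving side (tuples arrive as JSON lists)."""
    return [MotionSample(**d) for d in json.loads(payload.decode("utf-8"))]
```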
It should be appreciated that, in addition to the above implementation, when the electronic device itself is a wearable device, it may collect the user's exercise training data, or the user's basic information and exercise training data, in real time.
After the user's exercise training ends, the electronic device may display the third interface 41 exemplarily shown in fig. 7B, which is used to remind the user that the exercise training has ended and to feed back the user's exercise execution along the time dimension.
The user's exercise execution is obtained from the user's exercise training data, or from the user's basic information together with the exercise training data. The exercise execution may include, but is not limited to, at least one of: the total exercise duration, the actual exercise type, the exercise date, the exercise start time, the exercise end time, or the completion progress of the exercise training plan. This application does not limit parameters such as the content, manner, position, or display duration of the exercise execution. In some embodiments, the exercise execution may be displayed as at least one of text, voice, or images.
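For illustration, the time-dimension summary described above could be assembled as follows. The field names and the duration-based progress formula are hypothetical; other embodiments might compute progress from calories instead:

```python
from datetime import datetime

def summarize_session(start: datetime, end: datetime,
                      actual_type: str, planned_min: int) -> dict:
    """Build the execution summary a third interface could display.
    Progress here is duration-based (minutes completed vs. planned)."""
    total_min = round((end - start).total_seconds() / 60)
    return {
        "exercise_date": start.date().isoformat(),
        "start_time": start.time().isoformat(timespec="minutes"),
        "end_time": end.time().isoformat(timespec="minutes"),
        "actual_type": actual_type,
        "total_duration_min": total_min,
        "completion_progress": round(min(1.0, total_min / planned_min), 2),
    }
```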
In addition, after the user's exercise training ends, the electronic device may adjust the recommended exercise type and/or the recommended exercise duration according to the user's exercise training data.
And/or, before the user's exercise training begins, the electronic device may adjust the recommended exercise type and/or the recommended exercise duration after receiving an operation on the fourth control in the third interface 41. The user can thus autonomously select the type and/or duration of the exercise training.
The fourth control is used to trigger an entry for adjusting the recommended exercise type and/or the recommended exercise duration. Parameters such as the size, shape, and position of the fourth control are not limited. Operations on the fourth control may include, but are not limited to: single tap, double tap, long press, etc. In addition, the fourth control may be a single control or a combination of multiple controls.
For example, in fig. 7A, the third interface 41 may further include control 402, which is used to adjust the recommended exercise type; control 402 may be regarded as the fourth control mentioned in this application. Thus, upon receiving an operation on control 402, the electronic device can adjust the recommended exercise type.
Taking calorie consumption as an example, assume the recommended exercise type selected by the user or recommended by the electronic device is running. If the exercise type the user actually performs is walking, the electronic device may match the calories consumed by walking into the exercise training plan to update the plan's completion progress. In this way, the user is guided to complete the corresponding exercise training plan, with the selected or recommended exercise type serving as guidance.
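The calorie-matching idea in this example can be sketched as follows; the MET table, default value, and function name are assumptions used only for illustration:

```python
# Hypothetical MET (metabolic equivalent) values per exercise type.
MET = {"running": 8.0, "walking": 3.0, "cycling": 6.0, "swimming": 7.0}

def update_plan_progress(plan_calorie_target: float, actual_type: str,
                         duration_h: float, weight_kg: float) -> float:
    """Credit calories burned in the user's *actual* exercise type against
    the plan's calorie target, even if it differs from the recommended type."""
    burned = MET.get(actual_type, 3.0) * weight_kg * duration_h
    return min(1.0, burned / plan_calorie_target)
```

Because progress is tracked in calories rather than by exercise type, a walking session still advances a plan that recommended running, which matches the automatic-matching behavior described above.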
In summary, the electronic device can provide a personalized exercise training plan for the user according to the user's emotional health index, and can automatically match the user's actual exercise training data into that plan. This avoids poor execution of the plan due to a poor emotional health state and guides the user to complete the exercise training plan step by step.
Illustratively, the present application provides an emotional health state analysis apparatus, comprising: modules for implementing the emotional health state analysis method in the previous embodiments.
Illustratively, the present application provides an electronic device, comprising: a memory and a processor; the memory is used to store program instructions; the processor is used to invoke the program instructions in the memory to cause the electronic device to perform the emotional health state analysis method in the previous embodiments.
Illustratively, the present application provides a chip, comprising: a logic circuit and an interface circuit; the interface circuit is used to receive signals from other chips outside the chip and transmit them to the logic circuit, or to send signals from the logic circuit to other chips outside the chip; the logic circuit is used to implement the emotional health state analysis method in the previous embodiments.
Illustratively, the present application provides a chip system applied to an electronic device that includes a memory, a display screen, and a sensor; the chip system comprises: a processor; when the processor executes the computer instructions stored in the memory, the electronic device performs the emotional health state analysis method in the previous embodiments.
Illustratively, the present application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the electronic device implements the emotional health state analysis method in the previous embodiments.
Illustratively, the present application provides a computer program product, comprising: execution instructions stored in a readable storage medium; at least one processor of the electronic device can read the execution instructions from the readable storage medium, and execution of the instructions by the at least one processor causes the electronic device to implement the emotional health state analysis method in the previous embodiments.
In the above embodiments, all or part of the functions may be implemented by software, hardware, or a combination of software and hardware. When implemented in software, the functions may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above method embodiments. The aforementioned storage medium includes: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.

Claims (17)

1. A method of emotional wellness state analysis, the method comprising:
acquiring sensor data of a user during sleep for a predetermined period of time;
obtaining heartbeat interval data of the user according to the sensor data;
and obtaining an emotion health index of the user according to the heartbeat interval data of the user, wherein the emotion health index of the user is used for representing the emotion health state of the user.
2. The method according to claim 1, wherein the obtaining the emotional health index of the user according to the heartbeat interval data of the user comprises:
obtaining a heart rate variability index of a target sleep period according to the heartbeat interval data of the user, wherein the target sleep period is a deep sleep period and/or a rapid eye movement period that the user enters during the sleep of the predetermined period of time;
and obtaining the emotion health index of the user according to the heart rate variability index of the target sleep period.
3. The method according to claim 2, wherein the method further comprises:
obtaining a fragmentation degree index of the target sleep stage according to the heartbeat interval data of the user;
the obtaining the emotion health index of the user according to the heart rate variability index of the target sleep period comprises: obtaining the emotion health index of the user according to the heart rate variability index and the fragmentation degree index of the target sleep period.
4. The method according to claim 3, wherein the obtaining the emotion health index of the user according to the heart rate variability index and the fragmentation degree index of the target sleep period comprises:
inputting the heart rate variability index and the fragmentation degree index of the target sleep period into an emotion health state model, and outputting the emotion health index of the user, wherein the emotion health state model is used for indicating association relationships between the emotion health indexes corresponding to different emotion health states, the heart rate variability index, and the fragmentation degree index.
5. The method according to any one of claims 2-4, wherein the heart rate variability index comprises at least one of: the low frequency power LF, the high frequency power HF, the very low frequency power VLF, the normalized low frequency power LFnu, the normalized high frequency power HFnu, the normalized very low frequency power VLFnu, the low-frequency to high-frequency ratio LF/HF, the standard deviation of all sinus RR intervals SDNN, the root mean square of successive sinus RR interval differences rMSSD, the mean of the standard deviations of sinus RR intervals in each 5-minute period, the percentage pNN50 of successive sinus RR interval differences greater than 50 milliseconds, the mean heart rate mean-HR, the triangular index, the transverse axis SD1, the longitudinal axis SD2, or the ratio of the transverse axis to the longitudinal axis SD1/SD2.
6. The method according to any one of claims 1-5, further comprising:
recording an emotional health index of the user;
and displaying the emotion health index of the user in a first interface.
7. The method according to any one of claims 1-6, further comprising:
obtaining recommended breathing duration and recommended breathing frequency according to the emotion health index of the user;
and displaying a second interface, wherein the second interface is used for prompting the user to conduct breathing training of the recommended breathing duration according to the recommended breathing frequency.
8. The method of claim 7, wherein the method further comprises:
displaying a first control in the second interface;
and responding to the operation on the first control, superimposing a second curve on a first curve, wherein the first curve is obtained according to the recommended breathing duration and the recommended breathing frequency, and the second curve is obtained according to the real-time respiratory rate of the user.
9. The method of claim 8, wherein the method further comprises:
displaying first information in the second interface when the real-time respiratory rate is greater than the recommended respiratory rate, wherein the first information is used for prompting the user to slow down breathing;
or when the real-time respiratory rate is smaller than the recommended respiratory rate, displaying second information in the second interface, wherein the second information is used for prompting the user to accelerate breathing.
10. The method according to claim 8 or 9, characterized in that the method further comprises:
responding to the operation on the first control, recording breathing training data of the user;
and after the breathing training of the user is finished, displaying a breathing training result of the user in the second interface according to the breathing training data of the user, wherein the breathing training result of the user comprises: at least one of a total expiration duration, a total inspiration duration, the real-time respiratory rate, a number of exhalations, or a number of inhalations.
11. The method according to any one of claims 8-10, further comprising:
and responding to the operation on a second control in the second interface or according to the breathing training result of the user, adjusting the recommended breathing duration and/or the recommended breathing frequency, and updating the first curve.
12. The method according to any one of claims 1-11, further comprising:
obtaining a recommended exercise type and a recommended exercise duration according to the emotion health index of the user;
and displaying a third interface, wherein the third interface is used for prompting the user to adopt the recommended exercise type to conduct exercise training of the recommended exercise duration.
13. The method according to claim 12, wherein the method further comprises:
displaying a third control in the third interface;
responding to the operation on the third control, recording exercise training data of the user;
and after the exercise training of the user is finished, displaying the exercise execution condition of the user in the third interface according to the exercise training data of the user.
14. The method according to claim 12 or 13, characterized in that the method further comprises:
displaying a fourth control in the third interface;
and responding to the operation on the fourth control, and adjusting the recommended movement type and/or the recommended movement duration.
15. An electronic device, comprising: a memory and a processor;
the memory is used for storing a computer program or instructions;
the processor is configured to invoke a computer program or instructions in the memory to cause the electronic device to perform the method of any of claims 1-14.
16. A chip system, characterized in that the chip system is applied to an electronic device comprising a memory and a sensor; the chip system includes: a processor; the electronic device performs the method of any of claims 1-14 when the processor executes the computer program or instructions stored in the memory.
17. A computer readable storage medium comprising a computer program or instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-14.
CN202211097691.2A 2022-09-08 2022-09-08 Emotion health state analysis method and electronic equipment Pending CN117694892A (en)


Publications (1)

Publication Number CN117694892A, Publication Date 2024-03-15



Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination